It’s never been easier to run generative AI models locally; the catch is that average users must invest significantly in hardware before local inference is fast enough to be useful. Apple’s commodity hardware is comparatively good at this, but the vast majority of users (myself included) don’t have a GPU capable of AI workloads. For the software side, see the Fedora Magazine guide to running models locally with Ollama and Open WebUI [1].
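To illustrate how low the software barrier has become (leaving the hardware as the real bottleneck), here is a minimal sketch using the official `ollama` Python client (`pip install ollama`). It assumes the Ollama server is running locally and that a model has already been pulled; the `llama3.2` tag and the prompt are illustrative, not part of the linked guide.

```python
# Minimal local-inference sketch with the ollama Python client.
# Assumes: `ollama serve` is running and the model was pulled
# beforehand, e.g. `ollama pull llama3.2` (tag is illustrative).
import ollama

response = ollama.chat(
    model="llama3.2",  # any locally pulled model tag works here
    messages=[
        {"role": "user", "content": "Why does local inference need a capable GPU?"}
    ],
)

# The reply text lives under message.content in the response.
print(response["message"]["content"])
```

The software really is a few lines; how fast that call returns depends almost entirely on whether your machine has enough GPU (or unified) memory for the model.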
[1] https://fedoramagazine.org/running-generative-ai-models-locally-with-ollama-and-open-webui/