I’m kind of new to local AI and wondering what’s the move here? Are they trying to pull off a chrome/android situation? Obviously I don’t trust any of these gafam giants but I would be really interested in running a local LLM on my M1 max (briefly used deepseek last year). My use case would be mostly chat functions to help with academic and text analysis tasks (don’t worry I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.


Yeah, I’m thinking they want phone devs to start adding their models to android and create a similar situation where it becomes so integrated that it’s almost impossible to get rid of. Like they did with chromium, where most browsers are now stuck with it and they can change the rules whenever they want.