I’m kind of new to local AI and wondering what the move is here. Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I’d be really interested in running a local LLM on my M1 Max (I briefly used DeepSeek last year). My use case would be mostly chat-style help with academic and text-analysis tasks (don’t worry, I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.


That’s what I’m suspecting too. They’re trying to pull a Chrome-like situation and become some sort of standard, so devs eventually get stuck with their tech and whatever bullshit “manifest” update they release.
While it’s not impossible, there are so many variables with LLMs that I don’t really see how they’d pull that off, especially since their model isn’t particularly better than the competition.