I’m kind of new to local AI and wondering what the move is here. Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I would be really interested in running a local LLM on my M1 Max (I briefly used DeepSeek last year). My use case would mostly be chat functions to help with academic and text-analysis tasks (don’t worry, I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.

  • hendrik@palaver.p3x.de
    17 hours ago

    Probably because they’re playing the same game as Mark Zuckerberg, the Chinese labs, and to some degree OpenAI: they all release open-weights models.

    They generate hype for their company that way, so it’s advertising. They build goodwill. They undercut the competition, or make it clear how they outperform it. Maybe they attract more investor money if they expand into the local-models market. I bet there are a million reasons why it makes sense from a business perspective.