I’m kind of new to local AI and wondering what the move is here. Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I would be really interested in running a local LLM on my M1 Max (I briefly used DeepSeek last year). My use case would be mostly chat functions to help with academic and text-analysis tasks (don’t worry, I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.

  • all tomorrow's regrets@sh.itjust.works
    8 hours ago

    The same reason they made Android open source: to look good (“Hey, we aren’t like the other guys. We support open source!”) and to bring in outside ideas, input, and work, until they close off every avenue and it’s open in name only.

    • Yerbouti@sh.itjust.worksOP
      2 hours ago

      Yeah, I’m thinking they want phone devs to start adding their models to Android and create a similar situation where it becomes so integrated that it’s almost impossible to get rid of. Like they did with Chromium, where most browsers are now stuck with it and Google can change the rules whenever it wants.