I’m kind of new to local AI and wondering what the move is here. Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I’d be really interested in running a local LLM on my M1 Max (I briefly used DeepSeek last year). My use case is mostly chat to help with academic and text-analysis tasks (don’t worry, I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.
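For the "how do I actually run one locally" part: a minimal sketch using Ollama (one popular runner among several; llama.cpp and LM Studio are alternatives). The model name below is just an example, not an endorsement — pick whatever fits your unified memory:

```shell
# Assumes Ollama is already installed from ollama.com.
# An 8B-parameter model at 4-bit quantization is a few GB and runs
# comfortably on an M1 Max; larger models need more unified memory.
ollama pull llama3.1:8b

# One-shot prompt from the command line (good for text-analysis tasks):
ollama run llama3.1:8b "Summarize the main argument of the following passage: ..."
```

Running `ollama run llama3.1:8b` with no prompt drops you into an interactive chat session instead, which is closer to the chat use case described above.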

  • SuspciousCarrot78@lemmy.world · 12 hours ago (edited)

    Because they love and car…sorry…I threw up in my own mouth there.

    Loss leader? Positioning? Mind share? Cock tease? All of the above?

    Given that it takes a few million dollars to train a model from scratch, it sure as shit ain’t for love.

So, here’s a hot take: if Google’s core business is advertising, and that machine runs on data and attention, I dunno… maybe the idea is, if you can’t beat ’em, co-opt ’em, set the de facto standards, and own the “attention”?

    • Yerbouti@sh.itjust.works (OP) · 2 hours ago

      co-opt em, set the de-facto standards and own the “attention”

      That’s what I’m suspecting too. They’re trying to pull a Chrome-like move and become some sort of standard, so devs eventually get stuck with their tech and whatever bullshit “manifest” update they release.

      • Jakeroxs@sh.itjust.works · 9 minutes ago

        While it’s not impossible, there are a lot of variables with LLMs, and I don’t really see how they’d pull that off, especially since their model isn’t particularly better than the competition.