I’m kind of new to local AI and wondering what the move is here. Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I’d be really interested in running a local LLM on my M1 Max (I briefly used DeepSeek last year). My use case would be mostly chat to help with academic and text-analysis tasks (don’t worry, I don’t blindly trust LLMs, I know what I’m doing), so recommendations are welcome.
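For what it’s worth, an M1 Max handles this kind of workload fine through llama.cpp’s Metal backend. Below is a minimal sketch using the llama-cpp-python bindings; the model filename, system prompt, and generation parameters are placeholders I picked for illustration, not a specific recommendation — swap in whatever GGUF chat model you end up downloading:

```python
# Minimal local-chat sketch with llama-cpp-python (runs on Apple Silicon via Metal).
# The model path is a placeholder -- point it at any GGUF chat model you download.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/your-model.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=8192,        # context window; enough for a paper excerpt
    n_gpu_layers=-1,   # offload all layers to the GPU (Metal on Apple Silicon)
    verbose=False,
)

messages = [
    {"role": "system", "content": "You are a careful assistant for academic text analysis."},
    {"role": "user", "content": "Summarize the main argument of the following abstract: ..."},
]

response = llm.create_chat_completion(messages=messages, max_tokens=512, temperature=0.2)
print(response["choices"][0]["message"]["content"])
```

If you’d rather not touch Python at all, Ollama wraps the same llama.cpp backend behind a simpler CLI and works out of the box on Apple Silicon.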

  • Yerbouti@sh.itjust.worksOP · 2 hours ago

    I agree. I think they’re trying anything and hoping something sticks, because they’re clearly losing the AI race. So offering models that can run locally was their best bet. And DeepSeek might have just fucked their shit up a little more with V4.