I’m kind of new to local AI and wondering what the move is here. Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I would be really interested in running a local LLM on my M1 Max (briefly used DeepSeek last year). My use case would be mostly chat functions to help with academic and text analysis tasks (don’t worry, I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.

  • ShimitarA · 15 hours ago

    “I know what I’m doing” was also heard from the operators of that nuclear power plant, what was it called? Ah, Chernobyl…