Hi all, I'm quite an old fart, so I only recently got excited about self-hosting an AI, some LLM…

What i want to do is:

  • chat with it
  • eventually integrate it into other services, where needed

I read about Ollama, but it's all unclear to me.

Where do I start, preferably with containers (but bare metal is also fine)?

(I already have a Linux server rig with all the good stuff on it, from Immich to Forgejo to the *arrs and more, reverse proxy, WireGuard and the works. I'm looking for input on AI/LLMs, what to self-host and such, not general self-hosting hints.)

  • hendrik@palaver.p3x.de

There’s another community for this: !localllama@sh.itjust.works
Though we mostly discuss news and specific questions there; beginner questions are a bit rarer.

I think you already got a lot of good answers here: LM Studio, OpenWebUI, LocalAI…
I’d like to add KoboldCpp, which is kind of made for gaming/dialogue, but it can do everything. In my experience it’s very easy to set up, and it bundles everything into one program.
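
    Since you asked about containers: a common starting point is Ollama (the model runner) plus Open WebUI (a chat frontend that talks to it). A minimal docker-compose sketch — image names, ports, and the `OLLAMA_BASE_URL` variable are taken from my memory of those projects’ docs, so double-check them before relying on this:

    ```yaml
    # Sketch only: verify image tags, ports, and env vars against the
    # Ollama and Open WebUI documentation before deploying.
    services:
      ollama:
        image: ollama/ollama
        volumes:
          - ollama:/root/.ollama   # model storage persists across restarts
        ports:
          - "11434:11434"          # Ollama's HTTP API
      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434
        ports:
          - "3000:8080"            # chat UI on http://localhost:3000
        depends_on:
          - ollama
    volumes:
      ollama:
    ```

    Ollama also exposes an HTTP API on that same port, which covers your “integrate it into other services” point later on. For GPU acceleration you’d additionally need the NVIDIA container toolkit (or the ROCm variant for AMD) and the matching device/runtime settings in the compose file.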