There are quite a few options for running your own LLM. Ollama makes it fairly easy to run one, with a big selection of models (Hugging Face hosts even more to suit various use cases), and Open WebUI makes it easy to operate. Some self-hosting experience doesn't hurt, but it's pretty straightforward to set up if you follow along with NetworkChuck in this video.
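As a rough sketch of what the setup looks like, here's the general shape of the commands: pull and run a model with Ollama, then start Open WebUI in Docker pointed at it. The model name and port mappings below are just illustrative defaults; check the Ollama and Open WebUI docs for your setup.

```shell
# Install Ollama (Linux one-liner from ollama.com), then pull and chat with a model.
# llama3 is just an example; pick any model from the Ollama library.
ollama pull llama3
ollama run llama3

# Run Open WebUI in Docker, connecting to the Ollama instance on the host.
# Open http://localhost:3000 in a browser afterward.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Ollama exposes an API on port 11434 by default, which is how Open WebUI (and other front ends) talk to it.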