We have LLMs at home

A quick post on how and why I use self-hosted LLMs.

While I have strong feelings about the overarching "AI" umbrella, open-source models and tools are neat.

It's powered by my home lab: Unraid running the Ollama and Open WebUI Docker containers.
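For anyone curious about the setup, here's a rough sketch of the two containers. On Unraid I actually use the Community Applications templates, so treat the ports and volume names as illustrative defaults rather than my exact config:

```shell
# Ollama: serves models on port 11434, persists them in a named volume
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Open WebUI: web frontend on port 3000, auto-detects Ollama on the host
docker run -d --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

Once both are up, the chat UI lives at `http://<server-ip>:3000` and talks to Ollama behind the scenes.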

Right now, these are my favorite open-source models:

I keep a few more at the ready when needed.



When needed, I can easily expose this to the internet via my cloudflared container and a custom domain.
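The cloudflared side boils down to a named tunnel plus a DNS route. A minimal sketch, with a placeholder hostname standing in for my actual domain:

```shell
# One-time setup: authenticate with Cloudflare and create a named tunnel
cloudflared tunnel login
cloudflared tunnel create webui

# Point a hostname on the custom domain at the tunnel (placeholder hostname)
cloudflared tunnel route dns webui chat.example.com

# Run the tunnel, forwarding traffic to the Open WebUI port
cloudflared tunnel run --url http://localhost:3000 webui
```

No port forwarding or exposed IP needed; the tunnel dials out to Cloudflare, which handles TLS and routing.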
I much prefer keeping it offline and leaning on the strength of the model itself rather than just having it be an internet gopher. However, I have Searx self-hosted and can easily tap the API when needed.
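Tapping the search API is a single GET request. A sketch assuming a SearXNG-style instance on its default port, with the JSON output format enabled in the instance settings:

```shell
# Query the local search instance and get structured JSON back
# (assumes "json" is listed under search.formats in settings.yml)
curl -s "http://localhost:8888/search?q=self-hosted+llm&format=json"
```

The JSON response is easy to feed into a model as retrieved context when a question actually needs fresh information.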