Jan – Ollama alternative with local UI

38 points by maxloh | 12 comments | 8/9/2025, 9:54:13 AM | github.com

Comments (12)

mathfailure · 25m ago
Is this an alternative to OpenWebUI?
semessier · 5m ago
still looking for vLLM to support Mac ARM Metal GPUs
biinjo · 44m ago
I'm confused. Isn't the whole premise of Ollama that it's locally run? What's the difference or USP when comparing the two?
hoppp · 19m ago
I think it's an alternative because Ollama has no UI, and it's hard to use for non-developers who will never touch the CLI.
simonw · 17m ago
Ollama added a chat UI to their desktop apps a week ago: https://ollama.com/blog/new-app
apitman · 1m ago
Their new app is closed source, right?
moron4hire · 41m ago
That's not the actual tagline being used in the repo. The repo calls itself an alternative to ChatGPT. Whoever submitted the link changed it.
roscas · 2h ago
Tried to run Jan, but it does not start the llama server. It also tries to allocate 30 GB, which is the size of the model, but my VRAM is only 10 GB and the machine has 32 GB, so it does not make sense. Ollama works perfectly with 30B models. Another thing that is not good is that it makes constant connections to GitHub and other sites.
trilogic · 26m ago
If you're looking for privacy, there is only one app on the whole internet right now: HugstonOne (I challenge everyone to find another local GUI with that privacy). That said, Jan is also a good app and deserves credit for being local; there is no app without bugs.
hoppp · 16m ago
It probably loads the entire model into RAM at once, while Ollama avoids this; it has a better loading strategy.
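(The distinction above is between eagerly reading a whole model file into memory and memory-mapping it so pages are only faulted in as needed, which is the general technique llama.cpp-based runtimes use. A minimal illustrative sketch, not Jan's or Ollama's actual code:)

```python
import mmap
import os
import tempfile

# Write a dummy "model file" to stand in for a large weights file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (1 << 20))  # 1 MiB placeholder for model weights
    path = f.name

# Eager load: the entire file is copied into process memory at once,
# so a 30 GB model needs 30 GB of free RAM immediately.
with open(path, "rb") as f:
    eager = f.read()

# Memory-mapped load: pages are faulted in lazily as they are touched,
# so a file larger than available RAM can still be opened and read.
with open(path, "rb") as f:
    lazy = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first_byte = lazy[0]  # only the touched page becomes resident
    lazy.close()

os.remove(path)
```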
SilverRubicon · 54m ago
Did you see the feature list? It does not deny that it makes connections to other sites.

- Cloud Integration: Connect to OpenAI, Anthropic, Mistral, Groq, and others

- Privacy First: Everything runs locally when you want it to

bogdart · 40m ago
I tried Jan last year, but the UI was quite buggy. Maybe they have fixed it by now.