Jan – Ollama alternative with local UI
171 points by maxloh | 8/9/2025, 9:54:13 AM | 66 comments (github.com)
The main deal breaker for me when I tried it was that I couldn't talk to multiple models at once, even if they were remote models on OpenRouter. If I ask a question in one chat, then switch to another chat and ask a question, the second blocks until the first one is done.
Also Tauri apps feel pretty clunky on Linux for me.
All of them, or this one specifically? I've developed a bunch of tiny apps for my own usage (on Linux) with Tauri (the largest is maybe 5-6K LoC) and they've always felt snappy to me, mostly doing all the data processing in Rust and the UI with ClojureScript+Reagent.
I stumbled upon Jan.ai a couple of months ago when I was considering a similar app approach. I was curious because Jan.ai went well beyond what I had assumed were hard limitations.
I haven't tried Jan.ai yet; I see it as an implementation, not a solution.
I met the team late last year. They're based out of Singapore and Vietnam. They ghosted me after promising two follow-up meetings, and were unresponsive to any emails, like they just dropped dead.
Principles and manifestos are a dime a dozen. What matters is whether you live by them or just keep them as PR pieces. These folks are the latter.
- Cloud Integration: Connect to OpenAI, Anthropic, Mistral, Groq, and others
- Privacy First: Everything runs locally when you want it to
I'm trying Jan now and am really liking it - it feels friendlier than the Ollama GUI.
I captured loopback traffic and noticed Ollama returning an HTTP 403 Forbidden response to Jan.
The solution was to set environment variables:
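The comment doesn't quote the exact variables, so treat this as a guess: the usual cause of a 403 from Ollama is its origin allow-list, controlled by the documented OLLAMA_ORIGINS variable. Something like this, set before starting the Ollama server:

    # Assumption: the 403 came from Ollama's origin check, which rejects
    # requests from unrecognized origins such as a desktop app's.
    export OLLAMA_ORIGINS="*"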
Here are the rest of the steps:
- Jan > Settings > Model Providers
- Add new provider called "Ollama"
- Set API key to "ollama" and point to http://localhost:11434/v1
- Ensure variables above are set
- Click "Refresh" and the models should load
Note: Even though an API key is not required for local Ollama, Jan apparently doesn't consider the endpoint valid unless a key is provided. I set mine to "ollama" and it then allowed me to start a chat.
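To sanity-check the endpoint outside of Jan, here's a minimal sketch using the OpenAI Python client (v1+) against Ollama's OpenAI-compatible API; it uses the same base URL and dummy key as the steps above:

    # List the models Ollama serves; Jan's "Refresh" button hits the same route.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
        api_key="ollama",  # ignored by local Ollama, but the client requires a value
    )

    for model in client.models.list():
        print(model.id)

If this prints your pulled models but Jan still shows none, the problem is on Jan's side rather than Ollama's.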
Can't make it work with the Ollama endpoint.
This seems to be the problem, but they're not focusing on it: https://github.com/menloresearch/jan/issues/5474#issuecommen...
I first used Jan some time ago and didn't really like it, but it has improved a lot since then. I encourage everyone to try it; it's a great project.