Ask HN: Why aren't local LLMs used as widely as we expected?

briansun · 9/8/2025, 1:34:45 PM · 3 points · 4 comments
On paper, local LLMs seem like a perfect fit for privacy‑sensitive work: no data leaves the machine, no marginal cost, and direct access to local data. Think law firms, financial agents, or companies where IT bans browser extensions and disallows cloud AI tools on work machines. Given that, I’d expect local models to be everywhere by now, yet they still feel niche.

I’m trying to understand what’s in the way. My hypotheses (and I’d love corrections):

1) People optimize for output quality over privacy.

2) Hardware is still far behind.

3) The tool people actually want (e.g., “a trustworthy, local‑only browser extension”) has yet to emerge.

4) No one has told the lawyers about this yet.

5) Or: adoption is already happening, just not visibly.

It’s possible many teams are quietly using Ollama in daily work, and we just don’t hear about it.
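For context, "using Ollama" can be as lightweight as hitting its local HTTP API from a script. A minimal sketch, assuming a default install listening on localhost:11434 with a llama3 model already pulled (model name and prompt are illustrative):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt and return the model's response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama daemon and a pulled model):
# answer = ask("llama3", "Summarize this NDA clause in one sentence: ...")
```

Nothing here ever leaves the machine, which is exactly the appeal for the privacy-sensitive shops mentioned above.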

Comments (4)

codeptualize · 4h ago
I think there are two cases:

1. Self hosting

2. Running locally on device

I have tried both, and find myself not using either.

For both, the quality is lower than the top-performing models in my experience. Part of it is the models themselves; part might be the application layer (ChatGPT/Claude). It would still work for a lot of use cases, but it certainly limits the possibilities.

The other issue is speed. You can run a lot of things even on fairly basic hardware, but the token speed is not great. Obviously you can get better hardware to mitigate that but then the cost goes up significantly.

For self-hosting, you need a certain amount of throughput to make it worth having GPUs running. If you have spiky usage, you are either paying a lot for idle GPUs or you have horrible cold-start times.
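The spiky-usage point is easy to put rough numbers on. A back-of-envelope sketch; every figure here is an assumption for illustration (a $2.50/hr rented GPU, ~50 sustained tokens/sec, and a $3 per million tokens hosted price), not a quote:

```python
def self_host_cost_per_mtok(gpu_usd_per_hour: float,
                            tokens_per_sec: float,
                            utilization: float) -> float:
    """Effective $ per million tokens when the GPU bills whether or not it's busy."""
    tokens_per_hour = tokens_per_sec * 3600 * utilization
    return gpu_usd_per_hour / tokens_per_hour * 1_000_000

# Assumed numbers, for illustration only:
GPU_RATE = 2.50   # $/hour for a rented GPU
SPEED = 50.0      # sustained tokens/second
HOSTED = 3.00     # $ per million tokens from an API provider (assumed)

for util in (1.0, 0.5, 0.1):
    cost = self_host_cost_per_mtok(GPU_RATE, SPEED, util)
    print(f"utilization {util:>4.0%}: ${cost:,.2f}/Mtok vs hosted ${HOSTED:.2f}/Mtok")
```

Under these assumptions, even a fully utilized GPU lands around $14/Mtok, and spiky (10%) utilization pushes it an order of magnitude higher; the hosted price wins unless throughput is very high or the hardware is much cheaper.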

Privacy-wise: the business/enterprise ToS of all the big model providers give enough privacy guarantees for all, or at least most, use cases. You can also get your own OpenAI infrastructure on Azure, for example; I assume with enough scale you can get even more customized contracts and data controls.

Conclusion: quality, speed, and price all favor hosted models, and you can use the hosted versions even in privacy-sensitive settings.

gobdovan · 4h ago
If you are a company that wants the advantages of a managed, local-like LLM, and your data already lives in AWS, you'll naturally use Bedrock for the cost savings. Given that most companies are on the cloud, it makes sense that they won't do a local setup just for the data to end up back on AWS anyway.

For consumers, it actually requires quite powerful hardware, and you won't get the same tokens per minute or the same precision as an online LLM. And online LLMs already have infrastructure for search-engine integration and agent-like behavior that simply makes them better for a wider range of tasks.

That covers most people and companies. So either the local experience is far worse than the hosted one (for most practitioners), or you already have a local-like LLM in the cloud, where everything else of yours already lives. That doesn't leave much room for local on my own server/machine.

just_human · 4h ago
Having worked in a (very) privacy-sensitive environment, the quality of the hosted foundation models is still vastly superior to any open-weight model for practical tasks. The foundation-model companies (OpenAI, Anthropic, etc.) are willing to sign deals with enterprises that offer reasonable protections and keep sensitive data secure, so I don't think privacy or security is a reason for enterprises to shift to open-weight models.

That said, I think there is a lot of adoption of open-weight models for cost-sensitive features built into applications. But I'd argue this is due to cost, not privacy.

jaggs · 3h ago
Two reasons?

1. Management

2. Scalability

Running your own local AI takes time, expertise and commitment. Right now the ROI is probably not strong enough to warrant the effort.

Couple this with the fact that it's not clear how much local compute power you need, and it's easy to see why companies are hesitating.

Interestingly enough, there are definitely a number of sectors using local AI with gusto. The financial sector comes to mind.