We built a browser extension for local LLMs with Ollama

4 points by jjw9 | 1 comment | 7/31/2025, 6:35:15 AM | github.com

Comments (1)

jjw9 · 16h ago
I’ve been experimenting a lot with Ollama and local LLMs lately, and wanted a way to make them useful while browsing — without needing to send anything to the cloud.

We tried LM Studio, Mistral via mtsy, and a few other setups — all great in different ways — but Ollama felt the most stable and portable for casual local inference.

So our small team built NativeMind, a lightweight open-source Chrome extension that connects to your local Ollama instance and lets you:

• Summarize the current page in a sidebar
• Ask questions about the content you’re reading
• Search across open tabs
• Use any Ollama model you’ve got running
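To give a rough idea of what "connects to your local Ollama instance" means in practice, here's a minimal sketch, not NativeMind's actual code, of how a side-panel script could ask Ollama to summarize the current page. It assumes Ollama's default port (11434) and its standard /api/generate endpoint; the model name and prompt are placeholders.

```typescript
// Minimal sketch (not the extension's real implementation): send the page's
// visible text to a locally running Ollama instance and get a summary back.

interface OllamaGenerateResponse {
  response: string;
}

async function summarizePage(model = "llama3.2"): Promise<string> {
  // Crudely grab the page's visible text and cap it to keep the prompt small.
  const pageText = document.body.innerText.slice(0, 8000);

  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      prompt: `Summarize the following page in a few bullet points:\n\n${pageText}`,
      stream: false, // single JSON response instead of a token stream
    }),
  });

  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status} ${res.statusText}`);
  }

  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response;
}

// Example usage from a side panel or content script:
// summarizePage().then((summary) => console.log(summary));
```

One practical note: depending on your setup, Ollama may reject requests from a browser extension's origin unless OLLAMA_ORIGINS is configured to allow it, so a fetch like the one above may need that tweak on the Ollama side.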

It runs fully on-device. No external API calls, no accounts, no telemetry. Just local inference in your browser.

We mostly use it to speed up reading long-form content (docs, papers, articles), and it’s been surprisingly helpful for managing tab overload during research.

Here’s the repo: https://github.com/NativeMindBrowser/NativeMindExtension

We’d love feedback — especially if you’re working on local-first AI tools or using Ollama / LM Studio / mtsy in creative ways. Open to ideas, bug reports, or just chatting about workflows.