Show HN: Simple wrapper for Chrome's built-in local LLM (Gemini Nano)

22 points | kstonekuan | 2 comments | 7/6/2025, 5:54:27 PM | github.com
Chrome now includes a native on-device LLM (Gemini Nano) starting in version 138. I've been building with it since the origin trials; it's powerful, but the official Prompt API is still a bit awkward:

- Enforces sessions even for basic usage

- Requires user-triggered downloads

- Lacks type safety and structured error handling

So I open-sourced a small TypeScript wrapper I originally built for other projects to smooth over the rough edges:

GitHub: https://github.com/kstonekuan/simple-chromium-ai

npm: https://www.npmjs.com/package/simple-chromium-ai

- Stateless prompt() method inspired by Anthropic's SDK

- Built-in error handling and Result-based .Safe.* variants with neverthrow

- Token usage checks

- Simple initialization with a helper to trigger the model download (which must come from a user action)

It’s intentionally minimal for hacking and prototyping. If you need fine-grained control (e.g. streaming, memory control), use the native API directly:

https://developer.chrome.com/docs/ai/prompt-api
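For comparison, the session-based flow the wrapper smooths over looks roughly like this. The method names follow the Chrome Prompt API docs (`availability`, `create`, `prompt`, `destroy`), but the `LanguageModel` global is stubbed here so the sketch runs outside the browser; treat the exact shapes as illustrative:

```typescript
// Stub of Chrome's LanguageModel global so this runs outside the browser.
// In Chrome 138+ the real global is provided by the Prompt API.
const LanguageModel = {
  async availability(): Promise<"unavailable" | "downloadable" | "available"> {
    return "available"; // stubbed; the real call reports model status
  },
  async create() {
    return {
      async prompt(text: string): Promise<string> {
        return `model reply to: ${text}`; // stubbed inference
      },
      destroy() {
        /* release the session's resources */
      },
    };
  },
};

// The raw flow: check availability, create a session (when status is
// "downloadable", create() must be called from a user gesture to start
// the download), prompt, then clean up the session.
async function ask(question: string): Promise<string> {
  const status = await LanguageModel.availability();
  if (status === "unavailable") throw new Error("Gemini Nano not supported");
  const session = await LanguageModel.create();
  try {
    return await session.prompt(question);
  } finally {
    session.destroy();
  }
}

ask("Why sessions?").then((reply) => console.log(reply));
```

Even a one-off question requires the availability check, session creation, and teardown, which is the boilerplate a stateless `prompt()` hides.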

Would love to hear what people build with it or any feedback!

Comments (2)

xnx · 5h ago
Could you host a static page of this on GitHub?
kstonekuan · 2h ago
It only works in extensions for now, but I can post a demo in the repo soon.

You can also join Chrome’s Early Preview Program (EPP) to use it on webpages