Show HN: I made a free, open-source extension to refine prompts for AI chats

Recently, while using various large language models heavily, I noticed a clear pattern: the quality of an AI's response depends almost entirely on the quality of the question you ask. A vague question yields a pile of technically correct but essentially useless answers.

At its core, this is a prompt engineering problem.

I looked into several existing prompt optimization tools, but none of them really met my needs:

Unclear data flow: Many tools require sending my question and API key to their servers for processing, which raises concerns about privacy and security.

Feature overload: I don't need chat history, document management, or other complicated features. I just want to optimize a single action: the moment of asking a question.

Commercialization: Most tools tend toward paid subscriptions or user acquisition funnels.

So I built my own browser extension: HelpMeAsk. My development principles are simple: lightweight, focused, and secure.

Here’s how it works:

Model-powered self-optimization: After you type your question, the extension calls the LLM you specify (e.g., GPT-4o, Claude-3, DeepSeek) and applies a refined meta prompt that asks the model to rewrite your question to be clearer and more structured (see the sketch after this list).

Local API key storage: Your API key is encrypted and stored only in your browser's local storage. No third party — other than you and your AI provider — has access to it.

Direct requests: All API requests are sent directly from your browser to your chosen AI provider. The extension does not use any relay servers and does not log any data.
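To make the data flow concrete, here is a minimal sketch of what that call can look like from inside an extension. The function name, the meta prompt text, and the request shape are my illustration rather than the exact code in the repo; it assumes an OpenAI-compatible endpoint and omits the key-encryption step for brevity.

    // Sketch: read the locally stored key and ask the provider's model
    // to rewrite the user's question. Names and prompt text are illustrative.
    const META_PROMPT =
      "Rewrite the user's question so it is specific, well structured, " +
      "and gives the model enough context to answer precisely. " +
      "Return only the rewritten question.";

    async function refinePrompt(userQuestion: string): Promise<string> {
      // The key never leaves the browser except in the request to the provider.
      // (The real extension stores it encrypted; that step is omitted here.)
      const { apiKey } = await chrome.storage.local.get("apiKey");

      // Direct request from the browser to the provider, no relay server.
      const res = await fetch("https://api.openai.com/v1/chat/completions", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${apiKey}`,
        },
        body: JSON.stringify({
          model: "gpt-4o",
          messages: [
            { role: "system", content: META_PROMPT },
            { role: "user", content: userQuestion },
          ],
        }),
      });

      const data = await res.json();
      return data.choices[0].message.content;
    }

Everything in the sketch happens in the browser: the only outbound request is the one to the provider you configured.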

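On the key-storage point, the post doesn't spell out the exact scheme, but a common way to keep a key encrypted at rest in an extension is WebCrypto's AES-GCM plus chrome.storage.local, roughly like the sketch below. This is illustrative only; an extension has no OS keystore, so without a user passphrase the wrapping key itself also has to live in extension storage.

    async function storeApiKey(plainKey: string): Promise<void> {
      // Generate a wrapping key and encrypt the API key with AES-GCM.
      const aesKey = await crypto.subtle.generateKey(
        { name: "AES-GCM", length: 256 },
        true,
        ["encrypt", "decrypt"],
      );
      const iv = crypto.getRandomValues(new Uint8Array(12));
      const ciphertext = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv },
        aesKey,
        new TextEncoder().encode(plainKey),
      );

      // Persist ciphertext, IV, and the exported wrapping key locally.
      // Nothing here is ever sent to a server.
      const rawKey = await crypto.subtle.exportKey("raw", aesKey);
      await chrome.storage.local.set({
        apiKey: {
          iv: Array.from(iv),
          data: Array.from(new Uint8Array(ciphertext)),
          wrap: Array.from(new Uint8Array(rawKey)),
        },
      });
    }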