Show HN: Promptive – A native macOS utility for system-wide AI actions

5 points by helro | 4 comments | 6/24/2025, 7:19:08 AM | promptiveai.app
Hey HN,

I built a tool to solve my biggest LLM workflow frustration: context switching.

The idea came from trying to meta-prompt in Cursor: I found myself constantly jumping to a browser or a dedicated AI app just to improve a prompt. That copy/paste/tweak cycle was a huge productivity killer. I wanted AI to integrate seamlessly into my workflow, not disrupt it.

That's why I built Promptive. It's a native macOS app that lets you select any text, in any application, and run a custom LLM prompt on it, either via a global keyboard shortcut or by right-clicking and picking the action from the context menu.

How it works:

Select Text: Highlight anything on your Mac (code, a foreign language comment, a dense paragraph).

Trigger Action: Hit a global hotkey or right-click to bring up the context menu.

Apply Prompt: Choose from your library of built-in or custom prompts.

Get Results: The output can replace the selected text in place, be copied to the clipboard, or appear in an overlay.
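For anyone curious about the mechanics: on macOS, a system-wide flow like this usually boils down to simulating ⌘C to grab the selection and ⌘V to write the result back, which requires the Accessibility permission. Below is a rough Swift sketch of that capture/replace half. It's my illustration of the general approach, not Promptive's actual code, and the function names are made up.

```swift
import AppKit
import CoreGraphics

// Capture the current selection by simulating ⌘C and reading the pasteboard.
func copySelection() -> String? {
    let pasteboard = NSPasteboard.general
    let before = pasteboard.changeCount
    let source = CGEventSource(stateID: .combinedSessionState)
    for isDown in [true, false] {
        let event = CGEvent(keyboardEventSource: source, virtualKey: 8, keyDown: isDown) // kVK_ANSI_C
        event?.flags = .maskCommand
        event?.post(tap: .cghidEventTap)
    }
    Thread.sleep(forTimeInterval: 0.1)                          // let the frontmost app write the pasteboard
    guard pasteboard.changeCount != before else { return nil }  // nothing was selected
    return pasteboard.string(forType: .string)
}

// Replace the selection: put the LLM output on the pasteboard and simulate ⌘V.
// (The "copy to clipboard" result mode would simply stop before the paste.)
func pasteReplacement(_ text: String) {
    let pasteboard = NSPasteboard.general
    pasteboard.clearContents()
    _ = pasteboard.setString(text, forType: .string)
    let source = CGEventSource(stateID: .combinedSessionState)
    for isDown in [true, false] {
        let event = CGEvent(keyboardEventSource: source, virtualKey: 9, keyDown: isDown) // kVK_ANSI_V
        event?.flags = .maskCommand
        event?.post(tap: .cghidEventTap)
    }
}
```

The global hotkey itself can be registered with an NSEvent global monitor or the Carbon RegisterEventHotKey API; the LLM call in between is just an HTTP request.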

The goal is to make using LLMs as fast as copy-and-paste. I can now store specialized prompts for things like "Improve This Cursor Prompt", "Explain this," "Make this more concise," or "Convert this to a formal email" and apply them instantly, from anywhere.

I designed Promptive around two usage modes, both built with privacy in mind:

Privacy & Bring Your Own Key (BYOK): This is the default. You use your own API keys from OpenAI, Google, or Anthropic. Keys are stored securely in the macOS Keychain, and all API calls are made directly from your Mac to the provider. My servers never see your prompts or data.

Simplicity & Credits: If you prefer not to manage keys, there's a credit system that uses OpenRouter on the backend. In this mode, requests are proxied for billing, but we do not log or store user inputs & prompts.

The site is: https://www.promptiveai.app

Thanks for reading - would love to hear any feedback!

Comments (4)

sandbags · 8h ago
Interesting. Can I use this with local LLMs? I use LM Studio, which exposes an OpenAI-compatible API endpoint. Btw, your contact link on your website doesn't seem to go anywhere for me.
helro · 8h ago
Right now it doesn't support custom endpoints for local models, but it's high on the roadmap. The plan is to add support for custom endpoints (like LM Studio) and local Apple Intelligence to coincide with the next major macOS release (Tahoe).

Thanks for catching that broken link! It's fixed now.

sandbags · 7h ago
Thanks for the info.
yogini · 7h ago
This is really good. I was building a Chrome extension for the exact same problem, but this is a better idea than that.

Will give it a try and share feedback. You should also launch it on Peerlist Launchpad.