Show HN: Pluely – open-source alternative to Cluely (~10MB, Always On Display)

truly_sn · 8/19/2025, 8:06:42 AM · github.com
Hey HN,

A bit of backstory: I was job hunting for months, applying everywhere and racking up a ton of rejections. It got frustrating, so I decided to flip the script and build something useful instead: an open-source take on Cluely, the popular stealth AI tool from the startup that raised $15M. I figured why not recreate something that's already out there and buzzing, but make it free, transparent, and way better for privacy. Plus, it felt like a fun challenge.

This is actually my first project ever using Tauri, and I'm hooked. I started out with Electron, but it just wasn't cutting it for something that needs to be fast and reliable - Electron apps end up bloated with all that browser overhead, sucking up RAM and starting slow, which kills the whole "instant access" vibe I was going for. Tauri, with its Rust core, keeps things lean and snappy without any of that extra weight. Anyway, Pluely is a super lightweight desktop app, just ~10MB (that's 27 times smaller than Cluely), designed to give you real-time AI help during video calls, interviews, sales talks, or pretty much any conversation, all without anyone spotting it. There's no real-time meeting transcription yet, but I can add it based on the response here.

Here's what makes it tick:

- An undetectable overlay: a see-through window that floats over any app like Zoom or Teams and stays hidden in screen shares, recordings, and screenshots (rough sketch of how that works below).
- Support for multiple LLMs: hooks up to OpenAI models like GPT-4, Anthropic's Claude, Google Gemini, or xAI Grok. It pulls the model list dynamically, and everything runs off your own API keys stored right on your device - no servers or middlemen.
- Flexible inputs: type stuff in, speak naturally (Whisper for speech-to-text plus voice detection, so it just picks up when you talk), or toss in images and screenshots for the AI to break down.

- Privacy and speed first: all processing is local, it uses half the CPU and RAM of similar tools, and it's built on Tauri plus React. No tracking, no data leaving your machine.
- Conversation history: kept locally so you can pick up where you left off.
- Works everywhere: grab it as a .dmg for macOS, .msi for Windows, or .deb for Linux.
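
If you're curious how the hiding works: it's basically a transparent, always-on-top window with OS-level content protection turned on, so screen-capture APIs skip it. A simplified sketch of the idea (Tauri v2-style, not the exact code in the repo):

    // Simplified sketch, not the exact Pluely code: a transparent,
    // borderless, always-on-top window that asks the OS to exclude
    // it from screen capture.
    use tauri::{WebviewUrl, WebviewWindowBuilder};

    fn main() {
        tauri::Builder::default()
            .setup(|app| {
                let overlay = WebviewWindowBuilder::new(app, "overlay", WebviewUrl::App("index.html".into()))
                    .transparent(true)   // see-through background
                    .decorations(false)  // no title bar or borders
                    .always_on_top(true) // floats over Zoom, Teams, etc.
                    .skip_taskbar(true)  // stays out of the taskbar / app switcher
                    .build()?;
                // Content protection: display affinity on Windows, the window's
                // sharing type on macOS, so shares and recordings skip it.
                overlay.set_content_protected(true)?;
                Ok(())
            })
            .run(tauri::generate_context!())
            .expect("error while running pluely");
    }

The content-protection flag is what keeps it out of shares and recordings; the transparency and missing decorations are just what make it feel like an overlay instead of a window.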

It's killer for stuff like nailing job interviews with discreet tips, debugging code on the spot in tech meetings, or getting strategy nudges during negotiations. Unlike the original Cluely, this one's totally free, you can tweak it however you want, and it's all out in the open.

Download now on GitHub releases: https://github.com/iamsrikanthnani/pluely/releases

I'd love your thoughts, especially on adding custom LLM providers or tweaking the voice detection. What else would you want in a stealth AI sidekick?

Comments (4)

addandsubtract · 1h ago
This looks neat. Thanks for providing an open source alternative!

However, you say everything is "100% local", but then require an API key for one of the big 4 LLM providers. Anything leaving my PC isn't local. Could you add the option to use a local model (e.g. Ollama or a custom API URL) for inference? That would make it truly local-first, and a great companion app.

truly_sn · 1h ago
Thanks for the wonderful feedback. I'm planning to add a custom provider with a custom URL and API key; it will be in the next update.
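
The rough shape I have in mind is any OpenAI-compatible /v1/chat/completions endpoint, so Ollama (which serves one at http://localhost:11434/v1) or a self-hosted gateway should just work. Something like this, as a sketch rather than final code - field names follow the OpenAI chat API:

    // Sketch of a "custom provider" call: base URL + API key supplied by
    // the user, request/response shaped like the OpenAI chat completions API.
    use serde_json::{json, Value};

    async fn chat(base_url: &str, api_key: &str, model: &str, prompt: &str)
        -> Result<String, reqwest::Error> {
        let resp: Value = reqwest::Client::new()
            .post(format!("{base_url}/chat/completions"))
            .bearer_auth(api_key) // Ollama ignores this; hosted providers need it
            .json(&json!({
                "model": model,
                "messages": [{ "role": "user", "content": prompt }]
            }))
            .send()
            .await?
            .json()
            .await?;
        Ok(resp["choices"][0]["message"]["content"]
            .as_str().unwrap_or_default().to_string())
    }

So chat("http://localhost:11434/v1", "", "llama3", "hi") would hit a local model, with nothing leaving your machine.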
addandsubtract · 1h ago
Awesome! Also, could you make it possible to move the widget around the screen? Right now, it's fixed at the top, potentially covering tabs or other objects.

And maybe even more critical: I can't click outside the widget to focus/work on anything else. To type this message, for example, I had to alt+tab (and hide the widget) to focus the browser. I'm not sure if it's a limitation of Tauri, but if it could behave like the ChatGPT widget, that would be great.

truly_sn · 48m ago
Sure. It's actually an overlay that always stays behind your active tab (when you click on other tabs to the left or right side, it works). There's a fixed height I've set right now, which I'll make dynamic in the next release - that might fix your issue. If you notice anything else or have other suggestions, let me know.
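
For the dynamic height, the rough idea is a small command the React side calls whenever the content height changes - sketch only, assuming a Tauri v2-style API, and resize_overlay is just a placeholder name:

    // Sketch: resize the overlay vertically to fit its content.
    // `resize_overlay` is a placeholder name, not something in the repo yet.
    #[tauri::command]
    fn resize_overlay(window: tauri::WebviewWindow, height: f64) -> Result<(), String> {
        // Keep the current logical width, only change the height.
        let scale = window.scale_factor().map_err(|e| e.to_string())?;
        let width = window
            .outer_size()
            .map_err(|e| e.to_string())?
            .to_logical::<f64>(scale)
            .width;
        window
            .set_size(tauri::Size::Logical(tauri::LogicalSize { width, height }))
            .map_err(|e| e.to_string())
    }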