Show HN: How to make your MCP clients more context-aware

7 staranjeet 0 5/13/2025, 4:52:00 PM github.com
Hey HN, we’re launching OpenMemory[1], an open-source tool that lets you run a personal, portable memory layer for LLMs, fully self-hosted and under your control. It speaks the standard MCP protocol and plugs into any MCP client (Cursor, Windsurf, Claude, etc.) over Server-Sent Events (SSE).
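Hooking it up is usually just a matter of pointing the client’s MCP config at the local SSE endpoint. Here’s a minimal sketch for a Cursor-style mcp.json; since JSON can’t carry comments, note that the port (8765), the path shape, and the <your-user-id> placeholder are all assumptions about a default local install, and the exact URL for your setup will come from your own instance:

```json
{
  "mcpServers": {
    "openmemory": {
      "url": "http://localhost:8765/mcp/cursor/sse/<your-user-id>"
    }
  }
}
```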

Here’s a complete tutorial[2] that shows how to set it up locally, walks through the underlying components, gives a full overview of the architecture, and works through some real-world use cases with examples. It also covers the basic flow, why the project matters in the first place, security and access control, and what’s actually happening behind the UI.

A couple of months ago, we were experimenting with multi-agent setups using tools like Cursor and Claude, and we kept running into the same issue: agents starting every conversation from scratch, with no context. We wanted something lightweight but powerful: a memory layer that lives locally on our machine, works with any MCP client over SSE, and lets you store, search, and control long-term memory without shipping your data to the cloud. OpenMemory acts as a middle layer between your LLM-powered client and a vector database, storing and recalling arbitrary chunks of text called “memories” across sessions; the sketch below shows what that flow looks like from a client’s point of view.
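Here is a minimal sketch of that store/recall loop using the official `mcp` Python SDK as the client. The endpoint path and the tool names (`add_memories`, `search_memory`) are assumptions about a default local install, so treat them as illustrative rather than definitive:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Assumed default: OpenMemory serves one SSE endpoint per client app
    # and user; the exact path for your install may differ.
    url = "http://localhost:8765/mcp/my-app/sse/my-user"

    async with sse_client(url) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()

            # Persist a memory so later sessions can see it.
            # (Tool name and argument shape are assumptions.)
            await session.call_tool(
                "add_memories",
                {"text": "I prefer concise answers with code examples."},
            )

            # Recall it later by semantic relevance, not exact match.
            result = await session.call_tool(
                "search_memory", {"query": "how do I like my answers?"}
            )
            print(result.content)


asyncio.run(main())
```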

Under the hood, it uses Qdrant for semantic search and relevance-based retrieval, and everything runs on your own infrastructure via Docker (Postgres plus Qdrant), so zero data leaves your system. A built-in Next.js and Redux dashboard lets you inspect which apps are reading or writing memories, along with a full audit trail of state changes.

Happy to share learnings and hear about your experiences. Looking forward to your comments!
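To make the retrieval side concrete, here’s a sketch of what a relevance-based lookup against a local Qdrant instance looks like with qdrant-client. The collection name, vector dimensionality, and query vector are all placeholders; OpenMemory manages its own collection and embeddings internally:

```python
from qdrant_client import QdrantClient

client = QdrantClient(url="http://localhost:6333")  # Qdrant's default port

# In the real system an embedding model turns the query text into this
# vector; a zero vector of a placeholder dimensionality stands in here.
query_vector = [0.0] * 1536

hits = client.search(
    collection_name="memories",  # placeholder; the real name is internal
    query_vector=query_vector,
    limit=5,                     # top-5 most relevant memories
)
for hit in hits:
    # Each hit carries a similarity score and a payload with the stored
    # memory text and metadata (which app wrote it, when, etc.).
    print(hit.score, hit.payload)
```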

Links:

[1]: https://github.com/mem0ai/mem0/tree/main/openmemory
[2]: https://mem0.ai/blog/how-to-make-your-clients-more-context-a...

Comments (0)

No comments yet