Show HN: A "Codebase" as an MCP Server
The reference impl runs on our open-source project’s codebase.
Why we built it

Docs are important, but they add another abstraction layer between your code and your users. Keeping them at the right level of quality is hard (especially at a startup), and LLM-generated docs tend to be mediocre until you invest real polish.
Exposing the code itself to the model in a structured way keeps answers grounded and current, and it's always available. You can even switch between releases or branches.
Why share this

Building this for our own repo showed us how much leverage a tool like this gives and how little glue code you need to ship one. We built it for ourselves, but we think every repo can (and probably should) have one too.
Bigger picture

In platforms and infrastructure, "learn a new paradigm" has long been an incumbent's moat. LLMs plus the right MCP tools can flatten that learning curve, so more people can adopt new stacks quickly and ship faster.
How to try it

It's hosted over HTTP [2]. With Claude Code:

  claude mcp add -t http fenic-docs https://mcp.fenic.ai

Then ask: "What is fenic?"
For local setup and more prompts, see [1].
If you want to look at the data without running the pipeline yourself (processing + inference), check [3].
Links

[1] Example impl + prompt starter pack + local setup: https://github.com/typedef-ai/fenic/tree/main/examples/mcp/d...

[2] Hosted endpoint: https://mcp.fenic.ai

[3] Generated dataset used by the MCP server: https://huggingface.co/datasets/typedef-ai/fenic-0.4.0-codeb...
It’s not perfect, but it meaningfully improves the DX. We’d love feedback!
The README has been updated with instructions that work!