Show HN: An MCP server for understanding AWS costs
I know AI is at peak hype right now. But it has definitely changed some of our dev workflows already, so we wanted to find a way to let our customers experiment with using AI to make their cloud cost management more productive.
The MCP server acts as a connector between LLMs and your cost and usage data on Vantage, which supports 20+ cloud infra providers including AWS, Datadog, Mongo, etc. (Right now only Claude and Cursor support MCP, with ChatGPT and Google Gemini support coming soon.) You need a Vantage account to use it, since it's built on the Vantage API.
Video demo: https://www.youtube.com/watch?v=n0VP2NlUvRU
Repo: https://github.com/vantage-sh/vantage-mcp-server
It's really impressive how capable the latest-gen models are with an MCP server and an API. So far we have found it useful for:
Ad-hoc questions: "What's our non-prod cloud spend per engineer if we have 25 engineers?"
Action plans: "Find unallocated spend and look for clues about how it should be tagged"
Multi-tool workflows: "Find recent cost spikes that look like they could have come from eng changes, and look for GitHub PRs merged around the same time" (using it in combination with the GitHub MCP)
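To make the ad-hoc question concrete, here's a rough sketch in plain Python (with made-up usage records, not the actual Vantage API) of the kind of computation the LLM ends up doing for the per-engineer question above:

```python
# Hypothetical usage records -- in practice the LLM would pull these
# from the MCP server's cost tools rather than a literal list.
records = [
    {"provider": "aws", "env": "production", "cost": 42_000.0},
    {"provider": "aws", "env": "staging", "cost": 6_500.0},
    {"provider": "datadog", "env": "development", "cost": 1_200.0},
    {"provider": "aws", "env": "development", "cost": 2_300.0},
]

ENGINEERS = 25

# "Non-prod" here just means anything that isn't the production environment.
non_prod = sum(r["cost"] for r in records if r["env"] != "production")
per_engineer = non_prod / ENGINEERS

print(f"Non-prod spend: ${non_prod:,.2f}")   # $10,000.00
print(f"Per engineer:   ${per_engineer:,.2f}")  # $400.00
```

The interesting part is that you ask the question in English and the model decides which cost data to fetch and how to slice it.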
Thought I'd share, let me know if you have questions.
That being said, an easier-to-distribute user experience would be to leverage short-lived OAuth tokens that LLM clients such as Claude or Goose manage for the user. We're exploring these avenues as we develop the server.
That said, given https://github.com/runsecret/rsec#aws-secrets-manager, presumably one would then have to have this in order to keep AWS credentials off disk?
> in contrast to the op binary that is just one level of indirection, since they already handshake with the desktop app for $(op login) purposes

I agree RunSecret adds a level of indirection at this stage that op doesn't (if you are using 1pass). This is something I plan to polish up once more vaults are supported. You've given me some ideas on how to do that here.
And thanks for the advice on doing a Show HN, planning to do so once a few more rough edges are smoothed out.
That is extra weird when thinking about the audience, who would likely be Vantage.sh users (and thus have the ability to create the read-only token mentioned elsewhere) but would almost certainly be using it from their workstation, in a commercial context. Sounds like you're trying to keep someone from selling your MCP toy and decided to be cute with the licensing text.
One issue is that the MIT license does not prohibit selling. And wrapping it in a "for non-commercial uses" clause creates a contradiction that is difficult, if not impossible, to enforce.
The biggest is giving the LLM context. On Vantage we have a primitive called a "Cost Report" that you can think of as a set of filters. So you can create a cost report for a particular environment (production vs staging) or by service (front-end service vs back-end service). When you ask the LLM questions, it takes that context into account instead of just looking at all of the raw usage in your account.
Most of our customers create these filters, define reports, and organize them into folders, and the LLM takes that context into account, which can be helpful when asking questions.
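As a rough mental model (hypothetical code, not Vantage's actual data model), a Cost Report is a name plus a set of filters applied to raw usage records:

```python
from dataclasses import dataclass, field

@dataclass
class CostReport:
    """Hypothetical sketch: a named set of filters over usage records."""
    name: str
    filters: dict = field(default_factory=dict)  # e.g. {"env": "production"}

    def total(self, records):
        # Sum the cost of records that match every filter key/value.
        return sum(
            r["cost"]
            for r in records
            if all(r.get(k) == v for k, v in self.filters.items())
        )

records = [
    {"service": "front-end", "env": "production", "cost": 1200.0},
    {"service": "back-end", "env": "production", "cost": 3400.0},
    {"service": "back-end", "env": "staging", "cost": 800.0},
]

prod = CostReport("Production", {"env": "production"})
backend = CostReport("Back-end service", {"service": "back-end"})
print(prod.total(records))     # 4600.0
print(backend.total(records))  # 4200.0
```

When a report like "Production" already exists, the LLM can answer "what's prod costing us?" against that scoped view rather than re-deriving the filters itself.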
Lastly, we support more providers beyond AWS, so you can merge in other associated costs like Datadog, Temporal, Clickhouse, etc.