Show HN: Vibe-coded my own chat app with Meta Llama-Stack and MCP integrations

1 point by r2ob | 0 comments | 5/4/2025, 4:21:12 AM | github.com
I needed a quick way to experiment with two very different kinds of tools at once: custom Python functions living on my laptop and MCP-server tools exposed remotely through Supergateway. Instead of juggling multiple demos, I spun up my own chat app with Gradio, wired it to Meta's Llama-Stack, and pointed everything at the same agent. With this setup I can drop a new local tool in a folder, point to a hosted endpoint, and watch both fire in the same conversation stream: no heavy front-end work, just an instant playground for prompt tweaks, tool-calling logic, and Supergateway configs. Building the app scratched my itch for fast iteration, and the result keeps all my LLM experiments under one roof.
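For anyone curious how the pieces fit together, here is a minimal sketch of the wiring, not the code from the repo: it assumes the llama-stack-client Python SDK's Agent helper and Gradio's ChatInterface, and the model name, toolgroup ID, and Supergateway SSE URL below are placeholders. Exact signatures vary between llama-stack-client versions, so treat this as a starting point rather than a drop-in script.

```python
# Sketch: one llama-stack agent that can call a local Python tool and an
# MCP toolgroup proxied through Supergateway, behind a Gradio chat UI.
# All IDs, URLs, and the model name are placeholders, not the author's values.
import gradio as gr
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent
from llama_stack_client.lib.agents.client_tool import client_tool

client = LlamaStackClient(base_url="http://localhost:8321")  # local llama-stack server

# Remote tools: register an MCP server that Supergateway exposes over SSE.
# Depending on the SDK version, mcp_endpoint may need a URL type instead of a dict.
client.toolgroups.register(
    toolgroup_id="mcp::my_tools",                       # placeholder toolgroup ID
    provider_id="model-context-protocol",
    mcp_endpoint={"uri": "http://localhost:8000/sse"},   # placeholder Supergateway endpoint
)

# Local tool: a plain Python function the agent can call in the same turn.
@client_tool
def word_count(text: str) -> int:
    """Count the words in a piece of text.

    :param text: the input text
    :returns: number of whitespace-separated words
    """
    return len(text.split())

agent = Agent(
    client,
    model="meta-llama/Llama-3.2-3B-Instruct",            # placeholder model
    instructions="You are a helpful assistant. Use tools when they help.",
    tools=[word_count, "mcp::my_tools"],                  # local + MCP tools on one agent
)
session_id = agent.create_session("gradio-playground")

def chat(message, history):
    # One non-streaming turn per user message; return the assistant's reply text.
    turn = agent.create_turn(
        messages=[{"role": "user", "content": message}],
        session_id=session_id,
        stream=False,
    )
    return turn.output_message.content

gr.ChatInterface(chat, title="Llama-Stack tool playground").launch()
```

On the other side, Supergateway is what bridges a stdio MCP server onto the SSE endpoint registered above; its README has the exact command-line flags for pointing it at a given MCP server and port.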
