Show HN: App.build, an open-source AI agent that builds full-stack apps

64 points by davidgomes | 6/4/2025, 7:58:49 PM | app.build

Comments (10)

davidgomes · 5h ago
OP here, everything is available on GitHub:

- https://github.com/appdotbuild/agent

- https://github.com/appdotbuild/platform

We also blogged[1] about how the whole thing works. We're very excited about getting this out, but we still have a ton of improvements we'd like to make. Please let us know if you have any questions!

[1]: https://www.app.build/blog/app-build-open-source-ai-agent

zihotki · 5h ago
An important part of the context is missing or was cut off: this is for building apps on top of the Neon platform (an open-source PostgreSQL SaaS).
gavmor · 5h ago
I.e., inextricably coupled to their services? Or is it a matter of swapping out a few "provider" modules?
igrekun · 2h ago
Completely agnostic. If you run it locally, we provide a Docker Compose setup; if you have other deployment preferences, pointing to your own DB is a matter of changing an env var: https://github.com/appdotbuild/agent/blob/main/agent/trpc_ag...
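
Roughly something like this (illustrative only; the variable name and default here are placeholders, see the linked file for the real config):

    import os

    # Placeholder: point the agent at any Postgres instance instead of Neon
    # by overriding the connection string in the environment.
    DATABASE_URL = os.environ.get(
        "DATABASE_URL",
        "postgresql://user:password@localhost:5432/appbuild",
    )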

We also include baseline Cursor rules in case you want to hack on this manually: https://github.com/appdotbuild/agent/tree/main/agent/trpc_ag...

Where we are tied is the LLM provider: you will need to supply your own keys for Anthropic / Gemini.

We did a couple of runs on top of Ollama + Gemma, so expect support for local LLMs. Can't swear on the timeline, but one of our core contributors recently built a water-cooled rig with a bunch of 3090s, so my guess is "pretty soon".

ah27182 · 5h ago
The CLI for this feels extremely buggy. I'm attempting to build the application, but the screen is flickering like crazy: https://streamable.com/d2jrvt
csomar · 1h ago
Average experience for AI-made/related products.
davidgomes · 5h ago
Yeah, we have a PR in the works for this (https://github.com/appdotbuild/platform/issues/166); it should be fixed tomorrow!
ah27182 · 5h ago
Alright, sounds good. Question: what LLM does this use out of the box? Is it using the models provided by GitHub (after I give it access)?
igrekun · 2h ago
If you run it locally, you can mix and match any Anthropic / Gemini models. As long as it satisfies this protocol, you can plug in anything: https://github.com/appdotbuild/agent/blob/4e0d4b5ac03cee0548...
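
To give a rough idea of the shape of such a protocol (a sketch only; the class and method names are placeholders, not the actual interface from the linked file):

    from dataclasses import dataclass
    from typing import Protocol

    @dataclass
    class CompletionResult:
        # Placeholder result shape: generated text plus basic usage info.
        text: str
        input_tokens: int = 0
        output_tokens: int = 0

    class LLMClient(Protocol):
        # Anything exposing a compatible completion method can be plugged in
        # (Anthropic, Gemini, or a local model behind the same interface).
        def completion(self, prompt: str, *, model: str, max_tokens: int = 4096) -> CompletionResult:
            ...

    class LocalOllamaClient:
        # Placeholder local-model adapter satisfying the same interface.
        def __init__(self, base_url: str = "http://localhost:11434") -> None:
            self.base_url = base_url

        def completion(self, prompt: str, *, model: str, max_tokens: int = 4096) -> CompletionResult:
            # A real adapter would call the local server here; stubbed out
            # to keep the sketch self-contained.
            return CompletionResult(text=f"[stub response from {model} via {self.base_url}]")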

We have a similar wrapper for local LLMs on the roadmap.

If you use the CLI only, we run Claude 4 + Gemini on the backend, with Gemini serving most of the vision tasks (frontend validation) and Claude doing core codegen.
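
Roughly, the split looks like this (a sketch with assumed task and model names, not the actual dispatch code):

    # Illustrative task-based routing: vision-style checks go to Gemini,
    # core code generation goes to Claude. Model IDs are placeholders.
    VISION_TASKS = {"frontend_validation", "screenshot_review"}

    def pick_model(task: str) -> str:
        if task in VISION_TASKS:
            return "gemini-2.5-pro"   # vision tasks (frontend validation)
        return "claude-sonnet-4"      # core codegen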

davidgomes · 5h ago
We use both Claude 4 and Gemini by default (for different tasks). But the idea is that you can self-host this and use other models (and even BYOM: bring your own models).