I've been helping non-technical founders use AI builders (Lovable, Bolt, etc.), and I noticed a pattern: they can explain their app idea perfectly to me in conversation, but the moment they try to write prompts, everything falls apart. The outputs are generic and look obviously AI-generated.
The problem isn't that they don't know what they want. It's that they literally don't speak the language these tools expect. "Modern but approachable" or "Duolingo, but slightly more serious" means nothing to an AI builder.
Instead of teaching them to write better prompts, I prototyped a voice AI that interviews them about their idea. You just talk (voice, not text). The AI asks clarifying questions like a technical, design-minded co-founder would:
- "Show me an app that has the feel you're going for"
- "When you say 'premium,' do you mean expensive-looking or well-crafted?"
- "Walk me through what happens in the first 30 seconds"
After 5 minutes of conversation, it outputs:
- Professional PRDs with user stories, acceptance criteria, technical requirements
- High-fidelity mockups that actually match what's in their head
- Everything formatted to work as prompts for any AI builder
The insight: Non-technical vibe-coders shouldn't need to learn what z-index or a modal is. They need their vision translated into a language / schema coding agents understand.
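To make the "translation" idea concrete, here's a minimal sketch of what that schema step could look like. Everything here is hypothetical (the field names, the `STYLE_GLOSSARY` mapping, the app details are illustrative, not the actual output format): fuzzy adjectives like "premium" get mapped to concrete, promptable design tokens, and the structured spec is rendered into a builder-ready prompt.

```python
# Hypothetical sketch: normalize vague style words into a structured
# spec, then render it as a prompt fragment for an AI builder.
from dataclasses import dataclass

# Illustrative mapping from fuzzy adjectives to concrete design tokens.
STYLE_GLOSSARY = {
    "premium": ["generous whitespace", "muted palette", "serif display font"],
    "approachable": ["rounded corners", "friendly microcopy", "large tap targets"],
}

@dataclass
class VisionSpec:
    app_name: str
    reference_apps: list      # answer to "show me an app with the feel you want"
    style_keywords: list      # the raw words the founder actually said
    first_30_seconds: str     # answer to the first-30-seconds walkthrough

    def to_prompt(self) -> str:
        # Expand each fuzzy keyword into concrete tokens; pass unknown
        # words through unchanged rather than dropping them.
        tokens = []
        for word in self.style_keywords:
            tokens.extend(STYLE_GLOSSARY.get(word, [word]))
        return (
            f"Build '{self.app_name}'. Visual direction: {', '.join(tokens)}. "
            f"Reference apps: {', '.join(self.reference_apps)}. "
            f"First-run flow: {self.first_30_seconds}"
        )

spec = VisionSpec(
    app_name="LingoLite",
    reference_apps=["Duolingo"],
    style_keywords=["premium"],
    first_30_seconds="user picks a language, then gets a 3-question placement quiz",
)
print(spec.to_prompt())
```

The point of the sketch: "premium" never reaches the builder as "premium"; it reaches it as tokens the model can actually act on.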
I'm looking for feedback from both sides:
- Non-technical folks: Does this solve a real problem for you?
- Technical folks: Have you seen this translation problem when helping non-technical friends?