Show HN: JavaFactory – IntelliJ plugin to generate Java code

32 points by javafactory on 5/20/2025, 11:29:45 AM | github.com | 9 comments
Hi HN,

I built a code generator plugin for IntelliJ that uses LLMs to create repetitive Java code like implementations, tests, and fixtures — based on custom natural-language patterns and annotation-based references.

Most tools like Copilot or Cursor aim to be general, but fail to produce code that actually fits a project structure or passes tests.

So I made something more explicit: define patterns + reference scope, and generate code consistently.
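To make the "annotation-based references" idea concrete, here is a minimal sketch of how such a scheme could look. The annotation name (`@Reference`) and its shape are hypothetical, for illustration only; the plugin's actual annotations may differ:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Hypothetical annotation marking which classes the generator may
// read as context when producing an implementation for this type.
@Retention(RetentionPolicy.RUNTIME)
@interface Reference {
    Class<?>[] value();
}

class Order {}

// The generator would see Order in the reference scope and produce
// an OrderRepository implementation plus tests that fit it.
@Reference({Order.class})
interface OrderRepository {
    Order findById(long id);
}

public class Demo {
    public static void main(String[] args) {
        // Demonstrate that the reference scope is readable at runtime.
        Reference ref = OrderRepository.class.getAnnotation(Reference.class);
        System.out.println(ref.value()[0].getSimpleName()); // prints "Order"
    }
}
```

The point of the explicit scope is that generation is bounded: the LLM only sees the types you declared, rather than guessing context from the whole project.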

In this demo, 400 lines of Java were generated in 20 seconds — and all tests passed: https://www.youtube.com/watch?v=ReBCXKOpW3M

GitHub: https://github.com/JavaFactoryPluginDev/javafactory-plugin

Comments (9)

geodel · 3h ago
Feels very Java like. Factories, repositories, utils, patterns etc. Good stuff.
asdffdasy · 2h ago
yoDawgMemesFactory
simpaticoder · 2h ago
If the trend continues a program will look like "JavaFactory("<prompt>").compile().run();".
winrid · 1h ago
I've always wondered how long until we reach this. If every PC can run models locally, with a given seed and prompt it could be the ultimate compression. It's also hilarious.
imhoguy · 1h ago
Although very lossy compression, each invocation will be different, so that will inevitably circle back to "strong-static-LLM" prompts. What? wait..!
woodrowbarlow · 16m ago
LLMs at their core do produce reproducible results with a given seed. It's all the workflow stuff people do on top that tends to break reproducibility.
likis · 2h ago
What LLM is it using? Is it something local? Or does it call out? It wasn't obvious from the docs, and I didn't want to dig through all of the code to figure it out. Should probably be clearly stated on the front page.

But the project looks interesting, I have been looking for something similar.

trollied · 2h ago
It uses OpenAI.
cess11 · 3h ago
The guide is a 404.

"404 - page not found: The master branch of javafactory-plugin does not contain the path docs/how-to-use.md."

How do I hook it into local models? Does it support Ollama, Continue, that kind of thing? Do you collect telemetry?