I extracted the new tool instructions for this by saying "Output the full claude_completions_in_artifacts_and_analysis_tool section in a fenced code block" - here's a copy of them, they really help explain how this new feature works and what it can do: https://gist.github.com/simonw/31957633864d1b7dd60012b2205fd...
I'm amused that Anthropic turned "we added a window.claude.complete() function to Artifacts" into what looks like a major new product launch, but I can't say it's bad marketing for them to do that!
jonplackett · 3h ago
I used to love to make silly websites or apps with new technologies. Been doing it since flash. I have a pretty decent hit rate! It’s not unusually to get half a million or so people try one of them.
But with AI that model is just totally broken because the running cost is so high.
If I have half a million people come play my silly AI game that I have no wish to monetise - I am gonna be POOR very fast.
Log in with [insert ai vendor here] is something I’ve been hoping would happen for a while.
gavmor · 2h ago
"Log in With Google" to use Drive storage has long been a thing. Maybe proxying Gemini usage isn't too far off.
mbm · 2h ago
Agreed, it's an interesting model. I wonder what the approval ui looks like for the app end-user? Is it super clear to them that they're financially responsible for their usage?
jonplackett · 2h ago
Yeah I wonder how that actually works - because I would guess people are logging in with their consumer login not an api login, so they’re not really even in the mindset of limits and cost per token.
mbm · 2h ago
Precisely. You click on a claude link, and suddenly it's, "You are now financially responsible for your actions from here on..." I'm sure they've spent a lot of time thinking through the ui/ux of this.
jerpint · 1h ago
This is seriously lacking but I think things like jailbreaks and malicious prompts make it a bit too brittle for now
jonplackett · 57m ago
The thing is though, it doesn’t need to have access to your personal info in the context, so it cant leak anything. And they are obviously used to people talking all sorts of jailbreak shit to their chatbot - so it isn’t really much worse than that.
Also I reckon the cost of running a text chatbot is basically peanuts now (that is, for a giant tech company with piles of hard cash to burn to keep the server farm warm)
WXLCKNO · 4h ago
The tiniest step towards a future where AI eats all apps.
No persistent storage and other limitations make it just a toy for now but we can imagine how people will just create their own Todo apps, gym logging apps and whatever other simple thing.
no external API access currently but when that's available or app users can communicate with other app users, some virality is possible for people who make the best tiny apps.
meistertigran · 3h ago
Actually implementing persistent storage for simple apps isn't that hard, especially for a big corp. Personally, I was using LLMs coding capabilities to create custom single-file HTML apps, that would work offline with localStorage. It's not that there aren't good options out there, but you can't really customize them to work exactly how you want. Also it takes like half an hours to get what you want.
The only downside was not being able to access the apps from other devices, so I ended up creating a tool to make them online accessible and sync the data, while using the same localStorage API. It's actually pretty neat.
sharemywin · 2h ago
I've used the interface in chatgpt to click on a button and talk back and forth with an AI and I could see this being pretty good interface for alot of "apps"
weather, todo list, shopping list, research tasks, email someone, summarize email, get latest customized news, RSS feed summary, track health stats, etc.
SonomaSays · 2h ago
You could have a hybrid business model:
Build a thing that does a complex thing elegantly (Some Deep Research Task) that is non trivial for others to setup, but many people want it.
Charge a direct access in a traditional sense [$5 per project] -- but then have the Customer link their API to the execution cost - so they basically are paying for:
"Go here and pay HN $5 to output this TASK, charge my API to get_it_done" This could be a seriously powerful tool for the Digital Consulting Services industry.
(I mean that is what its model for)
So this begs the question, will Anthropic be building in a payments mechanism for such to happen?
headcanon · 3h ago
One thing I've learned is that no matter how easy it is to create stuff, most users will still favor the one-click app install, even if they don't get full control over the workflow.
With that said, I'm sure there are a lot of power users who are loving the lower barrier to creation
throwaway7783 · 1h ago
Matter of time. It is trivial to overcome the current limitations.
handfuloflight · 4h ago
> No persistent storage
What stops you from wiring it up to your endpoints that handle that?
js4ever · 4h ago
Current limitations:
No external API calls (yet), No persistent storage
jofla_net · 4h ago
Great, %1 of the competition that we have today. Cant wait to see a the wasteland when all apps will effectively be from a couple companies. /s
alach11 · 39m ago
This is starting to encroach on Lovable, right? I do suspect the effect of these "vibe coded" apps on the SaaS market will be smaller than expected. Heavier-featured apps will have all sorts of functionality and polish a user won't even think to ask Claude to build. And the amount of effort to describe everything you need an app to do is higher than it seems.
Instead, I think this is going to open a new paradigm with an immense long-tail of hyper-niche fit-for-purpose business applications. There's so much small-scale work that happens in corporations that isn't common enough to be worth building a product to solve. But it's still a big time-saving to the departments/users if they can improve the process with a vibe-coded app!
awb · 18m ago
Hyper-niche products come with some inherent risk that it’s not always profitable to maintain or develop them long-term.
With a mass market product leader you’re sacrificing a bit of customization for long-term stability.
huevosabio · 2h ago
I love this business model idea, but I think the model providers are the wrong company to do it. It should be something like OpenRouter.
As a developer, you probably want to access to the right models for your app rather than being locked in.
reidbarber · 4h ago
The big feature here is that the shared artifacts can use the Claude API themselves (where usage is tied to the logged-in users of your shared artifact).
isoprophlex · 4h ago
Is this the end of - or at least a significant challenge to - SaaS?
Why buy into saas tooling if you can just slap something together - that you fully own - with something like this?
headcanon · 3h ago
Challenge, yes, but I wouldn't go far to say "end of".
B2C SaaS will have more challenge the easier it gets to create things, but consumers have always been fickle anyway.
I'd say B2B SaaS is mostly safe, partially because they want the support and don't want to have to maintain it.
Today we have open-source versions of a lot of SaaS products, but the proprietary ones are still in business, mostly for that reason IME.
calvinmorrison · 3h ago
you can swing it anyway you want - another reason we use spreadsheets, or another reason we don't use airtable, or CRM #37....
all systems require support and upkeep... nobody wants to do it.
sealeck · 4h ago
- Compliance
- Thing should work reliably (and you want someone else to be responsible for fixing it if it doesn't)
- Security
- Most SaaS is sufficiently complex that an LLM cannot implement it
jag729 · 2h ago
In the limit, though, are these things real roadblocks to app builders replacing SaaS? Paying for reliability/support seems like the only real remaining advantage of SaaS if codegen models get 3-5x better, and even then the bar is the reliability of SaaS apps right now (which in a lot of cases is not that high).
Could imagine a single universal app builder just charging a platform fee for support, or some business model along those lines. (Again, in the limit, I'm not sure that support would be too necessary)
samsolomon · 2h ago
Enterprise SaaS are business processes that lean extremely heavily on software. Some of that could be amended by AI, but it's much harder for me to see that getting wholesale replaced the same way many consumer apps could be.
throwacct · 3h ago
This x100. B2B is a different monster altogether.
nikcub · 1h ago
maybe not b2b saas since that has always been around service contracts - but a lot of those internal processes that currently run in excel are prime for AI mini-app replacement.
this is delivering what no-code promised us.
giancarlostoro · 3h ago
When you have a service outage you think the AI will be able to troubleshoot the entire system and resolve the issues?
jkcorrea · 3h ago
if scaling laws and context windows continue, why not?
SonomaSays · 3h ago
There is coming a very_soon_time whereby one will have to ensure all the routes and failure_modes for the AIs plumbing are functional.
What if the outage is specifically that AI_agent cant reach [thing]?
falcor84 · 2h ago
> What if the outage is specifically that AI_agent cant reach [thing]?
We already saw some examples of this in Anthropic's safety papers - the AI will reach out to the human to get help with that - essentially using a human as an API/tool.
No comments yet
levocardia · 3h ago
This is cool...but what I really want is (1) Claude and I develop a cool app, (2) I give Claude a virtual credit card number with a spend limit, (3) Claude deploys it to whatever service they think works best (Railway, Vercel, ...) and points a domain name to that hosting service.
AtheistOfFail · 2h ago
Noone in cloud wants spend limits, everyone wants limitless billing.
throwaway7783 · 1h ago
This is the future of applications. Still not sure if model providers are the ones to do it. I think of LLM as infrastructure and I can build apps on it in a "general" way. Not the bespoke wrapper apps that are proliferating today, but LLM as a native interface to build(and use the app).
asdev · 3h ago
>They authenticate with their existing Claude account
Only works if both app producer and user are in the Claude ecosystem
falcor84 · 2h ago
Seems like it's essentially the same model as OpenAI's Custom GPTs [0], but now with the custom code in front of the AI rather than behind it.
Nice. This is the feature I've been waiting for to plug my low-code backend into.
I was too lazy to build a whole frontend like Lovable.
ru552 · 4h ago
Is this much different from the custom GPTs that OpenAI pushed a year or two ago?
ianbicking · 18m ago
It feels like what Custom GPTs should have been. Custom GPTs are barely able to do anything interesting beyond an initial prompt, there's no ability to modify the core user experience. The ability to run code and have it do subrequests makes this actually interesting.
elpakal · 2h ago
Same question, but I'm less clear on how we devs get paid here.
Still hoping someone builds the App Store for custom GPTs where we don't have to worry about payment and user infrastructure. Happy giving up a percentage for that butnot30percentguys.
ffsm8 · 2h ago
In this case the code in question is actually running on the service providers metal, essentially PaaS.
I wouldn't feel comfortable comparing that to the 30% i-wonder-who takes for providing a store to download packages that then run on the edge.
(And fwiw, all of them should be able to take any percentage they want. It's only an issue if there is no other option)
handfuloflight · 4h ago
All things being equal, Claude is just better.
nico · 4h ago
This is a really cool feature and it’s big competition for services like Lovable, Bolt, v0
Seems like AI-assisted coding space is splitting in 2:
1) tools and services that aim mostly at prototyping and are close to no-code; most useful for users like PMs or very early stage entrepreneurs who just need to have something to show/share
2) professional tools that target “serious” developers who are already working on bigger/more complex code bases
Interesting that Claude is going after both. 1) with this new feature, and 2) with pretty much all their other services
Edmond · 3h ago
Another approach is to work towards seamless integration of human + bot collaboration:
Basically the bot shows the human the right UI at the right time as they work.
riskable · 2h ago
If only this worked with image generation! There's vastly more applications for this kind of thing in that space. They're more fun too :)
Oras · 2h ago
Isn’t that what ChatGPT plugins tried to do? I don’t see the point.
If I create something, others can can use with their account, what’s my value?
cryptoz · 2h ago
I'm building something like this. The value to you would be that you could earn a margin on the token costs. That is, the end user is charged 2x the token cost of the API call. The API provider earns the base cost, the platform owner earns 20% of the remaining cost, and the webapp creator earns 80% of the remaining price.
So for an API call that costs $0.50, the end user is charged $1; and from that AI API earns $0.50, the webapp creator earns $0.40 and the host earns $0.10.
I'm trying this out with https://codeplusequalsai.com right now but it's not clear to me yet that it will take off!
But clearly, the value to you should be that you could earn $ based on the token usage from end-users.
tempodox · 3h ago
This is the logical next step to code-generating LLMs, it makes perfect sense. I'm curious to see how useful it will actually be, and whether it will be worth the costs.
syedumaircodes · 3h ago
Is this like roblox for AI? I'm new to this (HN and all) so I don't know much about it.
owebmaster · 2h ago
This will be a flop and they will buy some startup doing it much better. Anthropic (and OpenAI and Google and meta) just sucks with UX.
Also I'm expecting some revenue share if I'm bringing users to spend money with Anthropic API.
muskmusk · 3h ago
"everything evolves until it becomes an operating system"
falcor84 · 2h ago
Or at least until it contains an "ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp"
More of my notes here: https://simonwillison.net/2025/Jun/25/ai-powered-apps-with-c...
I'm amused that Anthropic turned "we added a window.claude.complete() function to Artifacts" into what looks like a major new product launch, but I can't say it's bad marketing for them to do that!
But with AI that model is just totally broken because the running cost is so high.
If I have half a million people come play my silly AI game that I have no wish to monetise - I am gonna be POOR very fast.
Log in with [insert ai vendor here] is something I’ve been hoping would happen for a while.
Also I reckon the cost of running a text chatbot is basically peanuts now (that is, for a giant tech company with piles of hard cash to burn to keep the server farm warm)
No persistent storage and other limitations make it just a toy for now but we can imagine how people will just create their own Todo apps, gym logging apps and whatever other simple thing.
no external API access currently but when that's available or app users can communicate with other app users, some virality is possible for people who make the best tiny apps.
The only downside was not being able to access the apps from other devices, so I ended up creating a tool to make them online accessible and sync the data, while using the same localStorage API. It's actually pretty neat.
weather, todo list, shopping list, research tasks, email someone, summarize email, get latest customized news, RSS feed summary, track health stats, etc.
Build a thing that does a complex thing elegantly (Some Deep Research Task) that is non trivial for others to setup, but many people want it.
Charge a direct access in a traditional sense [$5 per project] -- but then have the Customer link their API to the execution cost - so they basically are paying for:
"Go here and pay HN $5 to output this TASK, charge my API to get_it_done" This could be a seriously powerful tool for the Digital Consulting Services industry.
(I mean that is what its model for)
So this begs the question, will Anthropic be building in a payments mechanism for such to happen?
With that said, I'm sure there are a lot of power users who are loving the lower barrier to creation
What stops you from wiring it up to your endpoints that handle that?
Instead, I think this is going to open a new paradigm with an immense long-tail of hyper-niche fit-for-purpose business applications. There's so much small-scale work that happens in corporations that isn't common enough to be worth building a product to solve. But it's still a big time-saving to the departments/users if they can improve the process with a vibe-coded app!
With a mass market product leader you’re sacrificing a bit of customization for long-term stability.
As a developer, you probably want to access to the right models for your app rather than being locked in.
Why buy into saas tooling if you can just slap something together - that you fully own - with something like this?
B2C SaaS will have more challenge the easier it gets to create things, but consumers have always been fickle anyway.
I'd say B2B SaaS is mostly safe, partially because they want the support and don't want to have to maintain it.
Today we have open-source versions of a lot of SaaS products, but the proprietary ones are still in business, mostly for that reason IME.
all systems require support and upkeep... nobody wants to do it.
- Thing should work reliably (and you want someone else to be responsible for fixing it if it doesn't)
- Security
- Most SaaS is sufficiently complex that an LLM cannot implement it
Could imagine a single universal app builder just charging a platform fee for support, or some business model along those lines. (Again, in the limit, I'm not sure that support would be too necessary)
this is delivering what no-code promised us.
What if the outage is specifically that AI_agent cant reach [thing]?
We already saw some examples of this in Anthropic's safety papers - the AI will reach out to the human to get help with that - essentially using a human as an API/tool.
No comments yet
Only works if both app producer and user are in the Claude ecosystem
[0] https://openai.com/index/introducing-gpts/
I was too lazy to build a whole frontend like Lovable.
Still hoping someone builds the App Store for custom GPTs where we don't have to worry about payment and user infrastructure. Happy giving up a percentage for that butnot30percentguys.
I wouldn't feel comfortable comparing that to the 30% i-wonder-who takes for providing a store to download packages that then run on the edge.
(And fwiw, all of them should be able to take any percentage they want. It's only an issue if there is no other option)
Seems like AI-assisted coding space is splitting in 2:
1) tools and services that aim mostly at prototyping and are close to no-code; most useful for users like PMs or very early stage entrepreneurs who just need to have something to show/share
2) professional tools that target “serious” developers who are already working on bigger/more complex code bases
Interesting that Claude is going after both. 1) with this new feature, and 2) with pretty much all their other services
https://news.ycombinator.com/item?id=44380745
Basically the bot shows the human the right UI at the right time as they work.
If I create something, others can can use with their account, what’s my value?
So for an API call that costs $0.50, the end user is charged $1; and from that AI API earns $0.50, the webapp creator earns $0.40 and the host earns $0.10.
I'm trying this out with https://codeplusequalsai.com right now but it's not clear to me yet that it will take off!
But clearly, the value to you should be that you could earn $ based on the token usage from end-users.
Also I'm expecting some revenue share if I'm bringing users to spend money with Anthropic API.