99% of AI Startups Will Be Dead by 2026 – Here's Why

22 points | georgehill | 5/24/2025, 10:18:26 AM | skooloflife.medium.com

Comments (23)

stranded22 · 9h ago
Whilst I liked some of the article, I got very bored with the structure, and after about halfway I skimmed it.

If you are going to write about AI companies going extinct next year, could you please write it without the use of AI? It turned very formulaic.

And calling something a scam just because it's packaged up? That applies to anything that's packaged: you may as well buy six apples and take them home to wash and cut rather than buy the pre-packaged, pre-cut ones.

Some thought-provoking ideas though, spoiled by the link to get early access to a local AI.

isoprophlex · 10h ago
Rant incoming. I know it's bad form to critique anything but the content... but I wish the story wasn't padded with those bland GenAI eyesore images. It's a dumb kneejerk reaction I observe in myself, but the presence of generated graphics anywhere immediately turns me off.

GenAI-padded blog post? Guess your content isn't interesting enough. GenAI album cover? The artist must be equally lazy at making music. GenAI graphics on some flyer someone hands me? Please, you could have just slapped nothing but text on there and let your content, whatever it is, do the talking.

I know it's there to "make things pop" or whatever, but I'm so put off by the ubiquitous blandness, the samey high contrasts, the subtle artifacts... Milking people's attention is the new smoking, or at least it should be, IMO. Especially when it's done in the most aggravatingly bland style, that of the GenAI image generator.

glimshe · 10h ago
Most of the negative reactions to GenAI graphical content are to images used "as is". I've seen artists using GenAI content who process, compose and enhance what comes out of the AI, with truly striking results.

We'll soon have artists whose skills are closer to an editor's than a content generator's: people who are good at selecting the good parts of AI output while cutting out the bad ones.

conartist6 · 9h ago
Still turns me off. If you can't do art, stop fronting like you can.

Art is about having something to say. If your concern in writing is style over content, that says to me your goal is to hack my brain, not help me think.

dingnuts · 6h ago
> Please, could have just slapped nothing but text on there & let your content, whatever it is, do the talking.

Maybe this explains some of the success of brat by Charli xcx last summer

skywhopper · 10h ago
Pretty clear to me the article text itself was largely LLM generated as well. Incredibly repetitive and built on the same basic points over and over. List-heavy. There is a good, if not particularly insightful, article idea here, but this is a very poor version of it.
christina97 · 10h ago
We might well be at the cusp of a huge bubble caused by investor hubris, but this article hasn’t convinced me.

The difference between the podcast app cited in the article and the dot-com bubble companies is that the former is making serious revenue at almost 100% margin, whereas the latter often didn't even have a revenue model.

Also I think everyone knows at this point that foundation models are a commodity and not a particularly profitable business.

mosura · 10h ago
What does this guy look at on Instagram to get a feed like that?

It sounds like he sees what he sees because that is all he looks for.

sitzkrieg · 7h ago
Sure, but sometimes it will randomly flood you with some garbage topic. Maybe it's an escape hatch for low-engagement accounts, but every time I (accidentally) see the search or Reels list, it's usually some concerted normie theme.
skywhopper · 10h ago
Yeah, folks don’t realize they are telling on themselves when they complain about the content of their Insta or TikTok feeds.
louthy · 9h ago
Exactly, my feed on Instagram is nothing but boxer dogs.

I hate Instagram, but love boxer dogs. So when I'm forced to use Instagram, I make sure I click on nothing other than boxer dog videos and pictures.

It's remarkable how quickly the algorithm switches to your preferences. If you engage, it will come back to you tenfold.

yoouareperfect · 10h ago
OpenAI owns the intelligence until it doesn't and an open-source model is good enough.
insane_dreamer · 5h ago
Not gonna sign up for a Medium account just to read this
dvfjsdhgfv · 10h ago
> Wrappers rely on OpenAI. OpenAI relies on Microsoft. Microsoft needs NVIDIA. NVIDIA owns the chips that power it all

So this is the model that investors see. The reality is quite different. People and orgs are not stupid and want to avoid vendor lock-in.

So in reality:

* Wrappers don't rely only on OpenAI. In fact, in order to be competitive, they have to avoid OpenAI because it's terribly expensive. If they can get away with other models, the savings can be enormous, as some of these can be 10x cheaper (see the sketch after this list).

* Local models are a thing. You don't need proprietary models and API calls at all for certain uses. And these models get better and better each year.

* Nvidia is still the dominant player and this won't change in the coming years, but AMD is really making huge progress here. I don't mention TPUs as they seem to be largely Google-specific.

* Microsoft is not in any special position here. I've implemented OpenAI API integrations with various API gateways, and it's by no means something tied only to Azure.

* OpenAI's business model is based on faith at the moment. This has been debated ad nauseam, so it makes no sense to repeat all the arguments here, but the fact is that they used to be the only game in town, then the leader, and now they are neither, though they still claim to be.
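To make the first and fourth points concrete: because so many providers and gateways expose OpenAI-compatible endpoints, moving a wrapper off OpenAI is often just a configuration change. A minimal Python sketch, assuming the official openai client; the endpoint URLs and model names are illustrative placeholders, not recommendations:

```python
# Minimal sketch: the same client code can target OpenAI, a cheaper provider,
# or a local model, because they all speak the OpenAI API shape.
# The base URLs and model names below are illustrative placeholders.
import os
from openai import OpenAI

PROVIDERS = {
    "openai": {"base_url": "https://api.openai.com/v1", "model": "gpt-4o-mini"},
    "local":  {"base_url": "http://localhost:11434/v1", "model": "llama3"},  # e.g. an Ollama-style local server
}

def complete(prompt: str, provider: str = "local") -> str:
    cfg = PROVIDERS[provider]
    client = OpenAI(
        base_url=cfg["base_url"],
        api_key=os.getenv("LLM_API_KEY", "not-needed-locally"),
    )
    resp = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(complete("Summarise vendor lock-in in one sentence."))
```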

delichon · 10h ago
> Local models are a thing. You don't need proprietary models and API calls at all for certain uses. And these models get better and better each year.

They are getting better so fast that I'm considering building a business that depends on much lower-cost LLM inference, so I'd be betting years of effort on it.

But the bet is also that the proprietary models won't run away with faster improvements that make local models uncompetitive even while they improve. Can the local models keep up? They seem to be closing the gap now. Is that the rule or an artifact of the early development phase?

The safer plan may be to pass the inference cost through to the user and let them pick premium or budget models according to their need almost per request, as Zed editor does now.
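A tiny sketch of what that pass-through pricing could look like; the tier names, models and prices are made-up placeholders, not Zed's actual scheme:

```python
# Sketch of per-request model choice with cost pass-through: the caller picks
# "budget" or "premium" per request and is billed for the tokens they used.
# Models and per-token prices are made-up placeholders.
TIERS = {
    "budget":  {"model": "small-local-model",  "usd_per_1k_tokens": 0.0002},
    "premium": {"model": "frontier-api-model", "usd_per_1k_tokens": 0.01},
}

def price_request(tier: str, tokens_used: int) -> float:
    # Bill the inference cost straight through to the user.
    return TIERS[tier]["usd_per_1k_tokens"] * tokens_used / 1000

# price_request("premium", 2500) -> 0.025
```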

hackingonempty · 9h ago
Outside of giant tech companies, there are many researchers with access to little more than a single consumer GPU card. They are highly motivated to reduce the cost of training and inference.
largbae · 9h ago
It might not matter that proprietary models stay ahead of local, as long as the local models are strong enough for your use case.
delichon · 9h ago
The use case is structuring arbitrary natural language, e.g. triple extraction. That seems to benefit from as much context and intelligence as can be applied. "Good enough" remains a case-by-case judgment.
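For context, triple extraction means pulling (subject, predicate, object) facts out of free text. A rough, model-agnostic sketch of how that can be framed as an LLM call; the prompt, output format and the `llm` callable are illustrative assumptions, not the commenter's actual pipeline:

```python
# Rough sketch of LLM-based triple extraction: turn free text into
# (subject, predicate, object) facts. Prompt and output format are assumptions.
import json
from typing import Callable, List, Tuple

PROMPT = (
    "Extract factual triples from the text below as a JSON list of "
    "[subject, predicate, object] arrays.\nText:\n{text}"
)

def extract_triples(text: str, llm: Callable[[str], str]) -> List[Tuple[str, str, str]]:
    raw = llm(PROMPT.format(text=text))
    try:
        return [tuple(t) for t in json.loads(raw)]
    except (json.JSONDecodeError, TypeError):
        return []  # model didn't follow the format; caller decides whether to retry

# Usage: extract_triples("Zed passes inference costs through to users.", my_model_call)
```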
dvfjsdhgfv · 7h ago
> The safer plan may be to pass the inference cost through to the user and let them pick premium or budget models according to their need almost per request, as Zed editor does now.

I'm working on a solution right now that uses a local/cheap model first, does some validation, and if the validation fails, falls back to the expensive SOTA model. This is the most reasonable approach if you have a way to verify the results (which might not be easy, depending on the use case).
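A minimal sketch of that local-first shape; `cheap_model`, `sota_model` and `validate` are placeholders for whatever the real system plugs in:

```python
# Sketch of a local-first cascade: try the cheap model, validate the output,
# and only fall back to the expensive SOTA model when validation fails.
from typing import Callable

def cascade(
    prompt: str,
    cheap_model: Callable[[str], str],
    sota_model: Callable[[str], str],
    validate: Callable[[str], bool],
) -> str:
    draft = cheap_model(prompt)
    if validate(draft):
        return draft           # cheap path: most requests should end here
    return sota_model(prompt)  # expensive path, only when validation fails

# Example validator: for the triple-extraction case above, "valid" could
# simply mean the output parses as non-empty JSON.
```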

dvfjsdhgfv · 10h ago
> And they’re charging 50–100/month to do what anyone could replicate for pennies. It’s not just overpriced — it’s dishonest. The entire business model relies on the user not knowing how simple it really is.

But this is the general SaaS model: wrap things that are done by lower-level software such as FFmpeg and expose them in a nice GUI, ready for use by people who are not technical.

So what can change in the example above is that the markup goes down, not that the SaaS service goes away entirely.
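For concreteness, behind the GUI such a wrapper mostly boils down to building an ffmpeg command line and running it. A minimal sketch; the flags are just a common H.264/AAC transcode, not any particular product's pipeline:

```python
# Sketch of what an "FFmpeg-as-a-service" wrapper does behind the GUI:
# build a command line and run the ffmpeg binary.
import subprocess

def transcode_to_mp4(src: str, dst: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-c:a", "aac", dst],
        check=True,  # raise if ffmpeg exits with an error
    )

# transcode_to_mp4("upload.mov", "output.mp4")
```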

isoprophlex · 10h ago
If vibe coding works well enough, maybe the entire SaaS industry can be disrupted out of existence.

We'll replace bland, uninspired, rent-seeking but convenient ffmpeg-as-a-service SaaS tools at a 100x markup with automatically generated, vibe-coded tools you can let an AI produce and host in some centralized cloud location at a 10x markup. Thus advancing the inexorable process of disintermediation by technology, turning everyone into a consumer and cutting out middlemen everywhere.

Until it's only you and Jensen Huang sitting on top of a pile of your cash, shitting out NVIDIA cards like the sandworms in Dune shit out Spice.

skywhopper · 10h ago
Nah, I expect vibe-coding tooling will soon enough be directing users to use “partner” services and away from free tools. The non-techie users won’t know ffmpeg exists, they’ll just know about the video-oriented SaaS subscriptions the chatbot suggests when they ask.
jopsen · 7h ago
Yeah, you can also make the argument that many SaaS things are just Postgres with some templates :)