Ask HN: Go deep into AI/LLMs or just use them as tools?
154 points by pella_may 13h ago 118 comments
Ask HN: Selling software to company I work for as an employee
40 points by apohak 3d ago 51 comments
99% of AI Startups Will Be Dead by 2026 – Here's Why
21 points by georgehill 5/24/2025, 10:18:26 AM 23 comments skooloflife.medium.com
If you are going to write about AI companies going extinct next year, could you please write it without using AI? It came out very formulaic.
And the fact that it calls something a scam just because it's packaged up? That applies to anything that's packaged - you may as well buy 6 apples and take them home to wash and cut rather than buy the prepackaged, pre-cut ones.
Some thought-provoking ideas though - spoiled by the link to get early access to a local AI.
GenAI padded blog post? Guess your content isn't interesting enough. GenAI album cover? Artist must be equally lazy at making music. GenAI graphics on some flyer someone hands me? Please, could have just slapped nothing but text on there & let your content, whatever it is, do the talking.
I know it's there to "make things pop" or whatever, but I'm so put off by the ubiquitous blandness, the samey high contrasts, the subtle artifacts... Milking people's attention is the new smoking, or at least it should be, IMO. Especially if it's done in the most aggravatingly bland style, that of the GenAI image generator.
We'll soon have artists whose skills will be more similar to editors than content generators. People who will be good at selecting the good parts of AI content while cutting out the bad ones.
Art is about having something to say. If your concern in writing is style over content, that tells me your goal is to hack my brain, not help me think.
Maybe this explains some of the success of brat by Charli xcx last summer
The difference between the podcast app mentioned and the dot-com bubble is that one is making serious revenue at almost 100% margin, whereas the other did not even have a revenue model.
Also I think everyone knows at this point that foundation models are a commodity and not a particularly profitable business.
It sounds like he sees what he sees because that is all he looks for.
I hate Instagram, but love boxer dogs. So when I'm forced to use Instagram, I make sure I click on nothing but boxer dog videos and pictures.
It's remarkable how quickly the algorithm adapts to your preferences. If you engage, it will come back to you tenfold.
So this is the model that investors see. The reality is quite different. People and orgs are not stupid and want to avoid vendor lock-in.
So in reality:
* Wrappers don't rely only on OpenAI. In fact, in order to be competitive, they have to avoid OpenAI because it's terribly expensive. If they can get away with other models, the savings can be enormous, as some of them can be 10x cheaper.
* Local models are a thing. You don't need proprietary models and API calls at all for certain uses. And these models get better and better each year.
* Nvidia is still the dominant player and this won't change in the next few years, but AMD is making real progress here. I won't mention TPUs, as they seem to be largely Google-specific.
* Microsoft is not in any special position here - I was implementing OpenAI API integrations with various API gateways, and it's by no means something tied to Azure only (see the sketch after this list).
* OpenAI's business model is based on faith at this moment. This has been debated ad nauseam, so there's no sense repeating all the arguments here, but the fact is that they used to be the only game in town, then the leader, and now they're neither, yet still claim to be.
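A minimal sketch of that provider-agnostic setup, assuming every backend speaks the OpenAI-compatible chat API; the endpoint URLs, model names, and keys below are placeholders, and the "local" entry assumes something like an Ollama server running on the default port:

    # Hypothetical provider table: any OpenAI-compatible endpoint works,
    # so a "wrapper" is not tied to OpenAI's pricing at all.
    from openai import OpenAI

    PROVIDERS = {
        "cheap":   {"base_url": "https://api.cheap-provider.example/v1", "model": "small-model"},
        "local":   {"base_url": "http://localhost:11434/v1",             "model": "llama3"},
        "premium": {"base_url": "https://api.openai.com/v1",             "model": "gpt-4o"},
    }

    def complete(tier: str, prompt: str, api_key: str) -> str:
        cfg = PROVIDERS[tier]
        client = OpenAI(base_url=cfg["base_url"], api_key=api_key)
        resp = client.chat.completions.create(
            model=cfg["model"],
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

Switching providers is then a one-line config change rather than a rewrite, which is exactly why the lock-in story looks weaker in practice than in the investor pitch.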
They are getting better so fast that I'm considering building a business that depends on much lower-cost LLM inference - so I'd be betting years of effort on it.
But the bet is also that the proprietary models won't run away with faster improvements that make local models uncompetitive even while they improve. Can the local models keep up? They seem to be closing the gap now. Is that the rule or an artifact of the early development phase?
The safer plan may be to pass the inference cost through to the user and let them pick premium or budget models according to their needs, almost per request, as the Zed editor does now.
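A toy illustration of that pass-through pricing; the tier names, models, and per-token rates here are invented, not Zed's actual numbers:

    # Hypothetical tiers a user could pick per request; prices are made up.
    TIERS = {
        "budget":  {"model": "small-model",    "usd_per_1k_tokens": 0.0005},
        "premium": {"model": "frontier-model", "usd_per_1k_tokens": 0.03},
    }

    def passthrough_cost(tier: str, tokens_used: int) -> float:
        # Bill the user exactly the provider cost for the model they chose.
        return TIERS[tier]["usd_per_1k_tokens"] * tokens_used / 1000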
I'm working on a solution right now that uses a local/cheap model first, does some validation, and, if that validation fails, falls back to the expensive SOTA model. This is the most reasonable approach if you have a way to verify the results somehow (which might not be easy, depending on the use case).
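A rough sketch of that escalation pattern; run_cheap_model, run_sota_model, and looks_valid are hypothetical stand-ins for the real components:

    def run_cheap_model(prompt: str) -> str: ...   # e.g. a local model behind an OpenAI-compatible API
    def run_sota_model(prompt: str) -> str: ...    # e.g. a frontier hosted model
    def looks_valid(prompt: str, draft: str) -> bool: ...  # schema checks, tests, heuristics, etc.

    def answer(prompt: str) -> str:
        # Try the local/cheap model first.
        draft = run_cheap_model(prompt)
        if looks_valid(prompt, draft):
            return draft
        # Validation failed: escalate to the expensive SOTA model.
        return run_sota_model(prompt)

The whole approach stands or falls on how cheap and reliable that validation step is for the particular use case.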
But this is the general SaaS model: wrap things that are done by lower-level software such as FFmpeg and expose them in a nice GUI, ready for use by people who are not technical.
So what can change in the example above is the markup going down, not the SaaS service going away entirely.
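For a sense of how thin that wrapper layer can be, here is a hypothetical endpoint shelling out to FFmpeg; Flask and the exact flags are just illustrative, not any particular product's implementation:

    import subprocess, tempfile
    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.post("/convert")
    def convert():
        # Save the upload, transcode it with ffmpeg, return the result.
        src = tempfile.NamedTemporaryFile(suffix=".mov", delete=False).name
        dst = src + ".mp4"
        request.files["file"].save(src)
        subprocess.run(["ffmpeg", "-y", "-i", src, dst], check=True)
        return send_file(dst, as_attachment=True)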
We'll replace bland, uninspired, rent-seeking but convenient ffmpeg-as-a-service SaaS tools at a 100x markup by automatically generated, vibe coded tools you can let an AI produce and host in some centralized cloud location at a 10x markup. Thus advancing the inexorable process of disintermediation by technology, turning everyone into a consumer and cutting out middle men everywhere.
Until it's only you and Jensen Huang sitting on top of a pile of your cash, shitting out NVIDIA cards like the sandworms in Dune shit out Spice.