Siri is a really bad product when compared to GPT. Why has it taken Apple so long to get AI right?
Comments (6)
sky2224 · 16m ago
If you look at a lot of products out there that are basically what people want Siri to be (a truly integrated AI assistant), they either fall super flat (see rabbit r1) or are prohibitively expensive (see motion ai).
Additionally, I think another thing people forget is that Apple positioned itself, through its marketing, as the last bastion of privacy (I'm emphasizing marketing, not reality here), which is at odds with AI's current public perception. There's also the issue of choosing whether to keep things in the cloud or on device.
One final major thing I think Apple is weighing: how could AI integration inadvertently cannibalize their App Store profits? If Apple can offer AI integration that other calendar, todo, etc. apps can't, how will that affect a user's purchasing decisions? If it would eat into App Store revenue, then the next logical step is to create a system that gives developers essentially an API into the user's trained data, which is likely no small feat, especially if Apple is trying to obscure the data exposed to the app developer.
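To make that last idea concrete, here's a minimal hypothetical sketch in Swift of what such a boundary could look like: the system holds the personal data and hands third-party code only a redacted, purpose-limited slice of it. None of these types are real Apple APIs; the names and the redaction step are my own assumptions.

    import Foundation

    // Hypothetical sketch only: none of these types are real Apple APIs.
    // Idea: the assistant owns the user's personal context, and a third-party
    // app only ever receives a redacted, purpose-limited slice of it.

    /// What the third-party app is allowed to see.
    struct RedactedContext {
        let intent: String              // e.g. "schedule_meeting"
        let coarseTimeWindow: DateInterval
        // No raw calendar entries, contacts, or message text cross this boundary.
    }

    /// What the app hands back for the assistant to show or speak.
    struct AssistantActionResult {
        let summary: String
        let succeeded: Bool
    }

    /// A third-party app would conform to this to plug into the assistant.
    protocol AssistantIntegration {
        func handle(_ context: RedactedContext) async -> AssistantActionResult
    }

    /// The system side: it decides what gets exposed before the developer's
    /// code ever runs.
    struct SystemAssistant {
        func dispatch(intent: String,
                      window: DateInterval,
                      to app: AssistantIntegration) async -> AssistantActionResult {
            let context = RedactedContext(intent: intent, coarseTimeWindow: window)
            return await app.handle(context)
        }
    }

The hard part, of course, is deciding how much context has to cross that boundary before the integration is actually useful.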
aristofun · 21m ago
It sounds like low-hanging fruit, but it's far from it.
You can make LLMs work _anecdotally_, but the brutal truth is they don't scale. Money-wise, but more importantly for Apple, accuracy- and reproducibility-wise.
Siri is dumb, but it's very predictably dumb, and it does the few primitive things it does reliably well, at zero cost to Apple.
al_borland · 1h ago
People expect Siri to have deep integration with and knowledge of their data, not just a generic LLM. People will also hold Apple and Siri to a higher standard to actually be correct, which is another thing LLMs don’t care about.
They could have rushed something out, but it would fall short in these areas. I’d rather they take their time and get it right. If people want a generic LLM that hallucinates a lot, there are plenty of apps for that.
PaulHoule · 3h ago
Siri has access to all your personal data and has to be right. If you ask it when you need to be at the airport for your flight and it says 7pm when it's really 7am, you are really screwed. If you're asking Copilot for an opinion about your not-so-hot take on Curtis Yarvin, it doesn't really matter if it makes a mistake of that magnitude.
Do you regularly use ChatGPT as a voice assistant?
Like, I kind of wonder what else I might want Siri to do that isn’t hard. I would like it to occasionally summarize facts, like who was the president of France in 1975, and I might use it for that, but I wonder… does Apple make any money if they add that feature? Do they _want_ people using Siri? Yes, it is actually a valuable feature for me to be able to get directions while driving or play a song while cooking. Everything else has a weird cost-benefit ratio for Apple, I imagine. The things that would make you think the iPhone is substantively more valuable because Siri can do them are the hard ones.
It probably comes down to the fact that Apple likes to make money, and OpenAI isn’t quite as concerned.
https://news.ycombinator.com/item?id=44980305
Google shares rise on report of Apple using Gemini for Siri
https://www.cnbc.com/2025/08/22/google-shares-rise-on-report... (https://news.ycombinator.com/item?id=44994585)