> But now there’s a new paradigm shift. The iPhone was perfect for the mobile era, which is why it hasn’t changed much over the last decade.
> AI unlocks what seems to be the future: dynamic, context-dependent generative UIs or something similar. Why couldn’t my watch and glasses be everything I need?
> The other problem is that at its core, AI is two things: 1) software and 2) extremely fast-moving/evolving, two things Apple is bad at.
Idk, my MacBook Pro is pretty great and runs well. "Fast moving" here implies that as soon as you release something there's a big paradigm shift that means you need to move even faster to catch up. I don't think that's the case, and where it is, the new software (LLMs) still needs to be distributed to end users and devices. A company like Apple can pay money and build functionality to be the distributor of the latest models, and then it doesn't really matter how fast they're created. Apple's real threat is a category shift in devices, which AI may or may not be part of.
I'm less certain about Amazon but unless (insert AI company) wants to take on all the business risk of hosting governments and corporations and hospitals on a cloud platform I think Amazon can just publish their own models, buy someone else's, or integrate with multiple leading AI model publishers.
Spooky23 · 3h ago
I think the bet here is that AI is like Dropbox — a feature. Operating globally, these models are going to be a regulatory tar pit. The industry hype train is 100% reliant on courts ignoring the law - that didn’t work out well for Napster.
That makes the “category shift” difficult for Apple to execute well and difficult for competitors to gun for them. Microsoft is even worse off there because the PC OEMs relied on dying companies like Intel to deliver engineering for innovative things.
AWS, Azure, and GCP are doing the same stuff in different flavors. Google and Microsoft approach human facing stuff differently because they own collaboration platforms.
Apple and Microsoft are both flailing at the device level. Apple is ahead there as at least I can tell you what they are not doing well. Microsoft’s approach is so incoherent that it struggles to tell you what they are doing, period.
walterbell · 2h ago
> at the device level. Apple is ahead
Apple could turn everything around overnight by quietly re-enabling the jailbreak community for a few years, or restoring the 2022 Hypervisor API entitlement for arbitrary VMs. Hopefully this does not have to wait for leadership changes.
Either of those actions would take the shackles off Apple's underutilized hardware and frustrated developers. The resulting innovations could be sherlocked back into new OS APIs under Apple guardrails, whence they could generate revenue via App Store software. Then retire the jailbreaks and silently thank OutsideJobs for uncredited contributions to Apple upstream.
At present, the only industry participants maximizing usage of Apple hardware are zero-day hoarders. Meanwhile, every passing day allows Qualcomm, Nvidia and Arm-generic/Mediatek to improve their nascent PC hw+OS stacks, whittling away at Apple's shrinking hardware lead.
andsoitis · 2h ago
> Apple and Microsoft are both flailing at the device level. Apple is ahead there as at least I can tell you what they are not doing well. Microsoft’s approach is so incoherent that it struggles to tell you what they are doing, period.
Can you elaborate? I don't see what you're seeing.
Spooky23 · 2h ago
There’s a whole industry around critiquing Apple and their misadventures on iOS with respect to AI. We understand what is happening - there are even podcasters castigating individual executives!
What is the story with Copilot as an on-device feature of Windows? How does that relate to an “AI PC”? In my business, what does Copilot (on the PC) do? How about Copilot Chat? How do they both relate to Copilot for Office 365?
Answer: I have no fucking idea. It’s a big soup of stuff with the same name that dumps everything the company makes into a bowl. In a business, you’re going to make product decisions within your enterprise that fundamentally change the products based on your privacy and security needs and what countries you are operating in.
Apple has articulated a vision/framework for what they are delivering on device, with outside 1st party help and with 3rd parties. They’ve laid out how they are accessing your proprietary data. They have also failed to deliver.
andsoitis · 2h ago
Ah, by "Apple and Microsoft are flailing at the device level", you are saying specifically that they are failing with respect to AI inference executing on edge devices (rather than in the cloud)?
Spooky23 · 1h ago
Yes, that and edge platform integration with cloud services.
It’s complicated and difficult - I say fail in the “fail fast” sense, not as an insult. Where are the line(s) between Excel as a component of Windows, as a web service and as a node on the office graph?
If I need AI help integrated with the product to write Excel formulas, I think the way to get that from Microsoft is with Copilot for Office 365, which also accesses all of my data on the graph and can potentially leak stuff with web grounding. (Which for companies means you need to fix SharePoint governance and do lots of risk assessment #godbless)
I just go to ChatGPT.
vonneumannstan · 3h ago
>I'm less certain about Amazon but unless (insert AI company) wants to take on all the business risk of hosting governments and corporations and hospitals on a cloud platform I think Amazon can just publish their own models, buy someone else's, or integrate with multiple leading AI model publishers.
Amazon is capturing massive amounts of the value in AI via AWS. They'll be fine. But for real I don't see a reason why Alexa is not using a good LLM now. Could just be infinitely better...
xena · 3h ago
> But for real I don't see a reason why Alexa is not using a good LLM now.
Large language models are too slow to use as real-time voice assistants. ChatGPT voice only barely works because they have to use a much worse (but faster) model to do it.
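A rough latency budget shows why. The numbers below are illustrative assumptions, not measurements of any particular product, but the shape of the math is the point: a big model's time-to-first-token alone can blow the conversational budget.

```python
# Rough latency budget for a real-time voice assistant.
# All numbers are illustrative assumptions, not benchmarks.

def response_latency_ms(asr_ms, ttft_ms, tokens_before_speech, ms_per_token, tts_ms):
    """Time from end of user speech to first audible response."""
    generation_ms = ttft_ms + tokens_before_speech * ms_per_token
    return asr_ms + generation_ms + tts_ms

# A large hosted model: high time-to-first-token, slower decoding.
big = response_latency_ms(asr_ms=300, ttft_ms=1500, tokens_before_speech=20,
                          ms_per_token=40, tts_ms=200)

# A smaller, faster model of the kind voice products actually ship.
small = response_latency_ms(asr_ms=300, ttft_ms=300, tokens_before_speech=20,
                            ms_per_token=10, tts_ms=200)

print(f"big model:   {big:.0f} ms")   # well past a ~1 s conversational budget
print(f"small model: {small:.0f} ms")
```

People tolerate roughly a second of silence in conversation; under these assumed figures only the smaller model stays near that line.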
coredog64 · 2h ago
Amazon has a commercial Speech-to-Text model (Nova Sonic) that is passable. I used it to create a post-sales call assistant and was surprised that the underlying model was able to do a bunch of stuff I thought I was going to have to use Claude for.
vonneumannstan · 2h ago
At least on paper, OpenAI claims the voice models are actually the ones you are picking, i.e. GPT-4o, 5. In any case, even a GPT-3.5 would be superior to the current Alexa...
taeric · 3h ago
I'm not clear on what would be better about my Echos having an LLM interface. I want to like the idea, but I'm also growing increasingly annoyed at how things that used to work are beginning to not.
If they could gate it behind a "start chat session" or something, I would be more excited. Doing it by cannibalizing how well basic "play radio/set timer/read from Audible" worked for the longest time, everything they do that causes friction there is frustrating, to the extreme.
vonneumannstan · 2h ago
>Doing it by cannibalizing how well basic "play radio/set timer/read from Audible" worked for the longest time, everything they do that causes friction there is frustrating, to the extreme.
There's absolutely no reason why plugging in an LLM would break any of those features, and asking generic questions would be 100x better than "Searching the web for a shitty Quora or Alexa Answers question."
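In principle the two paths can coexist: keep a deterministic fast path for the commands that already work, and only fall back to an LLM for open-ended questions. A sketch (the command phrases and handler names here are made up for illustration):

```python
# Sketch: route fixed commands to the old deterministic path and only
# fall back to an LLM for generic questions. Names are hypothetical.

def handle_fast_path(utterance: str):
    """Deterministic handlers for the commands that already work."""
    commands = {
        "play radio": "PLAY_RADIO",
        "set timer": "SET_TIMER",
        "read from audible": "RESUME_AUDIOBOOK",
    }
    for phrase, action in commands.items():
        if utterance.lower().startswith(phrase):
            return action
    return None  # not a known command

def ask_llm(utterance: str) -> str:
    """Placeholder for an LLM call; only reached for open-ended input."""
    return f"LLM_ANSWER({utterance})"

def route(utterance: str) -> str:
    return handle_fast_path(utterance) or ask_llm(utterance)

print(route("set timer for 10 minutes"))  # SET_TIMER (old behavior untouched)
print(route("why is the sky blue"))       # LLM_ANSWER(why is the sky blue)
```

The known commands never touch the LLM, so the latency and reliability of the basics are preserved by construction.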
taeric · 2h ago
In principle, I agree with you. In practice, it keeps getting worse.
I also don't typically ask generic questions. Ever, that I can remember.
Again, I don't want to dislike the idea. If people are really getting value from it, I would like them to continue to do so. But it seems to be a more expensive way to service use cases that were working just fine.
qcnguy · 2h ago
Alexa was a huge money loser for Amazon even before LLMs. They can't afford it.
aleph_minus_one · 3h ago
> But for real I don't see a reason why Alexa is not using a good LLM now. Could just be infinitely better...
Alexa would be "a higher order infinity" better if it wasn't spying on you ...
coredog64 · 3h ago
> I think Amazon can just publish their own models, buy someone else's, or integrate with multiple leading AI model publishers.
This is exactly what they've done: They offer SageMaker (and similar capabilities) for hosting smaller models that fit into a single instance GPU, and they have Bedrock that hosts a metric crap-ton of AWS and third party models. Many of the model architectures are supported for hosting fine-tuned versions.
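For the curious, invoking a Bedrock-hosted model looks roughly like this. Treat it as a sketch: the model ID below is an example, and the request schema varies by model vendor (this is the Anthropic messages shape as I understand it).

```python
import json
# boto3 is assumed to be installed; credentials and model access required
# for the real call, which is shown commented out.

def build_claude_body(prompt: str, max_tokens: int = 256) -> str:
    """Request body for an Anthropic model on Bedrock (messages schema)."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# The actual call (requires AWS credentials and model access):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(
#       modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example ID
#       body=build_claude_body("Summarize this call transcript: ..."),
#   )
#   print(json.loads(resp["body"].read())["content"][0]["text"])

print(build_claude_body("hello")[:60])
```

The same `invoke_model` entry point fronts many model families on Bedrock, which is the commoditization play the parent describes: swap the model ID and body schema, keep the infrastructure.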
madmax96 · 2h ago
Came here to say pretty much this. Hardware seems more valuable than a model.
I think AI could be commoditized. Look at DeepSeek stealing OpenAI's model. Look at the competitive performance between Claude, ChatGPT, Grok, and Gemini. Look at open weight models, like Llama.
Commoditized AI needs to be used via a device. The post argues that other devices, like watches or smart glasses, could be better positioned to use AI. But... your point stands. Given Apple's success with hardware, I wouldn't bet against them making competitive wearables.
Hardware is hard. It's expensive to get wrong. It seems like a hardware company would be better positioned to build hardware than an AI company. Especially when you can steal the AI company's model.
Supply chains, battery optimization, etc. are all hard-won battles. But AI companies have had their models stolen in months.
If OpenAI really believed models would remain differentiated then why venture into hardware at all?
rickdeckard · 2h ago
Moreover, Apple owning the access to the device-hardware AND to the data those models would need to create value for an Apple-user makes the company even more robust.
They could manage years of AI-missteps while cultivating their AI "marketplace", which allows the user to select a RevShare'd third party AI if (and only if) Apple cannot serve the request.
It would keep them afloat in the AI-space no matter how far they are behind, as long as the iPhone remains the dominant consumer mobile device.
The only risks are a paradigm shift in mobile devices, and the EU which clearly noticed that they operate multiple uneven digital markets within their ecosystem...
bigyabai · 39m ago
"The only risks" lol. Tim Cook thought those were the only risks to the App Store too, now look where he is. You lack imagination.
What if [Japan|EU|US DOJ|South Korea] passes a law preventing OEMs from claiming user data as their property? If Apple really tries to go down the road of squeezing pre-juiced lemons like this, I think they're going to be called out for stifling competition and real innovation.
saurik · 2h ago
> > Why couldn’t my watch and glasses be everything I need?
(I am mostly going to comment on the Watch issue, as I have one.)
Apple makes a watch, yes. But is it an AI watch? Will they manage to make it become one? Intel made all kinds of chips. Intel's chips even could be used for mobile devices... only, Intel never (even still, to today) made a great mobile chip.
I have an Apple Watch--and AirPods Pro, which connect directly to it--with a cellular plan. I already found how few things I can do with my Watch kind of pathetic, given that I would think the vast majority of the things I want to do could be done with a device like my watch; but, in a world with AI, where voice mode finally becomes compelling enough to be willing to use, it just feels insane.
I mean, I can't even get access to YouTube Music on just my watch. I can use Apple's Music--so you know this hardware is capable of doing it--but a lot of the content I listen to (which isn't even always "Music": you can also access podcasts) is on YouTube. Somehow, the Apple Watch version of YouTube access requires me to have my phone nearby?! I can't imagine Google wanted that: I think that's a limitation of the application model (which is notoriously limited). If I could access YouTube Music on my watch, I would've barely ever needed my iPhone around.
But like, now, I spend a lot of time using ChatGPT, and I really like its advanced voice mode... it is a new reason to use my iPhone, but is a feature that would clearly be amazing with just the watch: hell... I can even use it to browse the web? With a tiny bit of work, I could have a voice interface for everything I do (aka, the dream of Siri long gone past).
But, I can't even access the thing that already works great, today, with just my watch. What's the deal? Is it that OpenAI really doesn't want me to do that? These two companies have a partnership over a bunch of things--my ChatGPT account credentials are even something embedded into my iPhone settings--so I'd think Apple would be hungry for this to happen, and should've asked them, thrown it in as a term, or even done the work of integrating it for them (as they have in the past for Google's services).
This feels to me like Apple has a way they intend me to use the watch, and "you don't need to ever have your phone with you" is not something they want to achieve: if they add functionality that allows the Watch to replace an iPhone, they might lose some usage of iPhones, and that probably sounds terrifying (in the same way they seem adamant that an iPad can't ever truly compete with a MacBook, even if it is only like two trivial features away).
glitchc · 3h ago
I'm not sure I follow: Both Apple and Amazon are working on AI as we speak. They're just not following the popular approach of releasing a chatbot in the wild.
Apple is focusing on a privacy-first approach with smaller models that run locally. Amazon is tying its models to an AWS subscription and incentivizing use by offering discounts, making it cheaper to use their models over GPT, Opus, etc.
twobitshifter · 2h ago
Apple is not focusing on AI with any real emphasis. The engineers asked for $50B to train a model and Apple instead did stock buybacks. The stock kept underperforming, so they touted Apple Intelligence and a revamped Siri, only for it to fall flat. Siri was underinvested in for many years; it should by now be at least as good as Claude or ChatGPT. ‘They’re not investing in a chatbot’ is a huge miss by Apple, which had a chatbot on everyone’s devices and a head start on the whole concept.
andsoitis · 2h ago
> The engineers asked for $50B to train a model and Apple instead did stock buybacks.
It is probably cheaper to simply integrate with OpenAI or Anthropic or whoever might unseat them in the future, than spend $50B on training a model. Not only is it cheaper, but it also gives them the flexibility to ride the wave of popularity, without ceding hardware or software sales.
twobitshifter · 1h ago
And in that is the issue, Apple does not believe they could do better than Google, Meta, xAI, Anthropic, or OpenAI. They are paying Google rather than building out their own products. Pre-Tim, Apple was pouring profits back into R&D but now the priority is rewarding shareholders.
bigyabai · 26m ago
> They are paying Google rather than building out their own products.
This is the real death knell people should focus on. Apple buried their AI R&D to rush complete flops like Vision Pro out the door. Now that the dust has settled, those hardware ventures clearly carried a steep opportunity cost. Apple had more than a decade to sharpen their knives and prepare for war with Nvidia, and now they're missing out on Nvidia's share of the datacenter market. Adding insult to injury, they're probably also ~10 years behind the industry SOTA unless they hire veterans at great expense.
Apple's chronic disdain for unprofitable products, combined with boneheaded ambition, will be the death of them. They cannot fend off real innovation and competition while dropping nothingburger software and hardware products clearly intended to bilk an unconscious userbase.
zimpenfish · 2h ago
> Not only is it cheaper, but it also gives them the flexibility to ride the wave
And also to hop off without any penalty if/when the wave collapses.
dpoloncsak · 2h ago
Didn't Apple's research lab release some open source/weights diffusion-based LLM that was blowing away all the benchmarks?
Edit: Yes it exists, seems to be built off qwen2.5 coder. Not sure it proves the point I thought it was, but diffusion LLMs still seem neat
glitchc · 1h ago
> The engineers asked for $50B to train a model and Apple instead did stock buybacks.
Source?
gmays · 3h ago
Right, but remember Microsoft was 'working on' mobile too. The issue is that they're working on it the wrong way. Amazon is focused on price, treating it like a commodity. Apple is trying to keep the iPhone at the center of everything. Neither is fully committing to the paradigm shift: they say it's one, but they're not acting like it, because their existing strategy/culture precludes them from doing so.
9rx · 3h ago
> The issue is that they're working on it the wrong way.
So is everyone else, to be fair. Chat is a horrible way to interact with computers — and even if we accept worse is better its only viable future is to include ads in the responses. That isn't a game Apple is going to want to play. They are a hardware company.
More likely someday we'll get the "iPhone moment" when we realize all previous efforts were misguided. Can Apple rise up then? That remains to be seen, but it will likely be someone unexpected. Look at any successful business venture and the eventual "winner" is usually someone who sat back and watched all the mistakes be made first.
jpadkins · 2h ago
> Chat is a horrible way to interact with computers
Chat is like the command line, but with easier syntax. This makes it usable by an order of magnitude more people.
Entertainment tasks lend themselves well to GUI type interfaces. Information retrieval and manipulation tasks will probably be better with chat type interfaces. Command and control are also better with chat or voice (beyond the 4-6 most common controls that can be displayed on a GUI).
kemayo · 1h ago
> Chat is like the command line, but with easier syntax.
I kinda disagree with this analogy.
The command line is precise, concise, and opaque. If you know the right incantations, you can do some really powerful things really quickly. Some people understand the rules behind it, and so can be incredibly efficient with it. Most don't, though.
Chat with LLMs is fuzzy, slow-and-iterative... and differently opaque. You don't need to know how the system works, but you can probably approach something powerful if you accept a certain amount of saying "close, but don't delete files that end in y".
The "differently-opaque" for LLM chatbots comes in you needing to ultimately trust that the system is going to get it right based on what you said. The command line will do exactly what you told it to, if you know enough to understand what you told it to. The chatbot will do... something that's probably related to what you told it to, and might be what it did last time you asked for the same thing, or might not.
For a lot of people the chatbot experience is undeniably better, or at least lets them attempt things they'd never have even approached with the raw command line.
9rx · 2h ago
> Chat is like the command line
Exactly. Nobody really wants to use the command-line as the primary mode of computing; even the experts who know how to use it well. People will accept it when there is no better tool for the job, but it is not going to become the preferred way to use computers again no matter how much easier it is to use this time. We didn't move away from the command-line simply because it required some specialized knowledge to use.
Chatting with LLMs looks pretty good right now because we haven't yet figured out a better way, but there is no reason to think we won't figure out a better way. Almost certainly people will revert to chat for certain tasks, like people still use the command-line even today, but it won't be the primary mode of computing like the current crop of services are betting on. This technology is much too valuable for it to stay locked in shitty chat clients (and especially shitty chat clients serving advertisements, which is the inevitable future for these businesses betting on chat — they can't keep haemorrhaging money forever and individuals won't pay enough for a software service).
bobbylarrybobby · 2h ago
In my experience, Claude Code is a fantastic way to interact with a (limited) subset of my computer. I do not think Claude is too far off from being able to do stuff like read my texts, emails, and calendar and take actions in those apps, which is pretty much what people want Siri to (reliably) do these days.
nailer · 2h ago
> Chat is a horrible way to interact with computers
Why? We interact with people via chat when possible. It seems pretty clear that's humanity's preferred interaction model.
9rx · 2h ago
We begrudgingly accept chat as the lowest common denominator when there is no better option, but it's clear we don't prefer it when better options are available. Just look in any fast food restaurant that has adopted those ordering terminals and see how many people are still lining up at the counter to chat with the cashier... In fact, McDonalds found that their sales rose by 30% when they eliminated chatting from the process, so clearly people found it to be a hindrance.
We don't know what is better for this technology yet, so it stands to reason that we reverted to the lowest common denominator again, but there is no reason why we will or will want to stay there. Someone is bound to figure out a better way. Maybe even Apple. That business was built on being late to the party. Although, granted, it remains to be seen if that is something it can continue with absent of Jobs.
nailer · 17m ago
> In fact, McDonalds found that their sales rose by 30% when they eliminated chatting from the process, so clearly people found it to be a hindrance.
That's a good supporting argument, but I don't think McDonald's adequately represents more complex discussions.
glitchc · 3h ago
> Apple trying to keep the iPhone at the center of everything.
Mac, iPad and iPhone, eventually Watch and Vision. Which makes sense since Apple is first and foremost a hardware company.
fruitworks · 3h ago
it is a commodity. that's the paradigm shift. there is no moat
ninetyninenine · 3h ago
Well, no. Alexa+ is the first LLM to integrate with the smart home in a big way.
AWS is making strides, but in a different area.
amgreg · 3h ago
The author makes no effort to explain why AI *isn't* a commodity, as Apple and Amazon say it is. I was looking forward to that. I think the article is weak for not defending its premise. Everything else is fluff.
gmays · 2h ago
That's fair, but it wasn't the point of the article because it's messy. Many would argue that core LLMs are 'trending' toward commodity, and I'd agree.
But it's complicated because commodities don't carry brand weight, yet there's obviously a brand power law. I (like most other people) use ChatGPT. But for coding I use Claude and a bit of Gemini, etc. depending on the problem. If they were complete commodities, it wouldn't matter much what I used.
A part of the issue here is that while LLMs may be trending toward commodity, "AI" isn't. As more people use AI, they get locked into their habits, memory (customization), ecosystem, etc. And as AI improves if everything I do has less and less to do with the hardware and I care more about everything else, then the hardware (e.g. iPhone) becomes the commodity.
Similarly with AWS: if data/workflow/memory/lock-in becomes the moat, I'll want everything where the rest of my infra is.
bloggie · 3h ago
I agree - and if the article is correct and Apple and Amazon are the losers, I fail to glean who the winners will be or how their business model will be different.
mackopes · 3h ago
For some time I've had the feeling that Apple actually IS playing the hardware game in the age of AI. Even though they are not actively innovating on AI software or shipping products with AI, their hardware (especially the unified memory) is great for running large models locally.
You can't get a consumer-grade GPU with enough VRAM to run a large model, but you can do so with macbooks.
I wonder if doubling down on that and shipping devices that let you run third party AI models locally and privately will be their path.
If only they made their unified memory faster as that seems to be the biggest bottleneck regarding LLMs and their tk/s performance.
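Back-of-envelope numbers support both points. The bandwidth and model figures below are approximate assumptions, but they illustrate why unified memory capacity helps and why bandwidth caps tokens-per-second:

```python
# Rough sizing: does a model fit in unified memory, and how fast can
# decoding go if every token must stream the full weights from memory?
# All figures are approximate assumptions for illustration.

def model_size_gb(params_b: float, bits_per_weight: int) -> float:
    """1B parameters at 8 bits per weight is roughly 1 GB."""
    return params_b * bits_per_weight / 8

def max_tokens_per_sec(bandwidth_gb_s: float, size_gb: float) -> float:
    """Upper bound: decoding is memory-bound, one full weight pass per token."""
    return bandwidth_gb_s / size_gb

size = model_size_gb(params_b=70, bits_per_weight=4)  # 70B at 4-bit -> 35 GB
fits_in_128gb = size < 128                            # fits with room to spare

# M4 Max unified memory bandwidth is on the order of ~550 GB/s.
print(f"{size:.0f} GB model, ~{max_tokens_per_sec(550, size):.0f} tok/s ceiling")
```

So a 70B model quantized to 4-bit fits comfortably where no consumer GPU's VRAM can hold it, but the ~550 GB/s bandwidth caps single-stream decoding around the mid-teens of tokens per second, which is exactly the tk/s bottleneck the parent describes.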
ChocolateGod · 3h ago
> You can't get a consumer-grade GPU with enough VRAM to run a large model, but you can do so with macbooks.
You can if you're willing to trust a modded GPU with leaked firmware from a Chinese backshop
Firerouge · 1h ago
Short of flying to China and buying in person, how can an American find/get one of these?
gmays · 3h ago
True, but Apple is a consumer hardware company, which requires billions of users at their scale.
We may care about running LLMs locally, but 99% of consumers don't. They want the easiest/cheapest path, which will always be the cloud models. Spending ~$6k (what my M4 Max cost) every N years since models/HW keep improving to be able to run a somewhat decent model locally just isn't a consumer thing. Nonviable for a consumer hardware business at Apple's scale.
No comments yet
orbifold · 3h ago
I think it is a given that they are aiming for a fully custom training cluster with custom training chips and inference hardware. That would align well with their abilities and actually isn't too hard to pull off for them given that they have very decent processors, GPUs and NPUs already.
billbrown · 1h ago
They're working—almost done—on a CUDA backend for their Apple Silicon framework.
>I think it is a given that they are aiming for a fully custom training cluster with custom training chips and inference hardware.
It is? I haven't seen anything about this.
csomar · 2h ago
This. If we plateau around current SOTA LLM performance and 192/384 GB of memory can run a competitive model, Apple computers could become the new iPhone. They have a unique and unmatched product because of their hardware investment.
Of course nobody knows how this will eventually play out. But people without inside information on what these big organizations possess can't make such predictions.
stefan_ · 3h ago
Memory is not in any way or shape some sort of crucial advantage; you were just tricked into thinking that because it's used for market segmentation, and nobody would slaughter their datacenter-profits cash cow. Inference, and god forbid training, on consumer Apple hardware is terrible and behind.
mackopes · 3h ago
Show me another consumer hardware that handles inference and/or training better. How many RTX5090s would you need?
NitpickLawyer · 3h ago
For local inference macs have indeed shined through this whole LLM thing, and came out as the preferred device. They are great, the dev experience is good, speeds are ok-ish (a bit slower w/ the new "thinking" models / agentic use with lots of context, but still manageable).
But Nvidia isn't that far behind, and has already moved to regain some space with their PRO 6000 "workstation" GPUs. You get 96GB of VRAM for ~$7.5k, which costs more than a Mac with comparable RAM, but not the $30k you previously had to shell out for top-of-the-line GPUs. So you get a "prosumer" 5090 with a bit more compute and 3x the VRAM, in a computer that can sell for <$10k and beat any Mac at both inference and training, for things that fit in that VRAM.
Macs still have the advantage for larger models, though. The new DGX Spark should join that market soon(tm), but they allegedly ran into problems on several fronts. We'll have to wait and see.
Looks like there will be several good options "soon"?
owebmaster · 1h ago
this is cool! Nvidia should sell notebooks, too.
spogbiper · 33m ago
i think Nvidia is trying to create a sort of reference platform and have other OEMs produce mass market products, so a laptop might happen even if nvidia doesn't make one themselves
joshstrange · 3h ago
> Why couldn’t my watch and glasses be everything I need?
People like screens. They like seeing IG pictures, they like scrolling through TikTok, they like seeing the pictures/videos their friends/family send/post. I doubt many people will want to see pictures/videos on a watch screen or in glasses (which still have a ways to go).
Also I don't buy the premise of this article that Apple is deciding to take a backseat in AI, they were late to the party but they are trying (and failing it seems) to build foundational models. Reaching for OpenAI/Anthropic/etc while they continue to work on their internal models makes a lot of sense to me. It acknowledges they are behind and need to rely on a third-party but doesn't mean they won't ever use their own models.
Unless something changes (which is absolutely possible), it does seem we are headed towards LLMs being commodities. We will see what OpenAI/Ive end up releasing, but I don't see a near future where we don't have screens in our pockets, and for that Google and Apple are best placed. With the GPT-5 flop (it's 4.6 at best IMHO), I have fewer concerns about LLMs growing as quickly as predicted.
jbverschoor · 3h ago
Yet I still don’t get an external monitor with a proper desktop experience on my phone
lelanthran · 3h ago
AI tokens are a commodity. They're the bottom of the value chain, and they're all converging on the same performance.
What matters for the future is what killer apps can be built on this commodity (tokens)?
Right now we've got content/text generation and ... nothing else?
owebmaster · 1h ago
> Right now we've got content/text generation and ... nothing else?
Software operators. LLMs can operate any kind of software (check MCP). If you reduce this to just "text generation", what else is left?
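The "software operator" pattern is just structured tool calling. Stripped of any real protocol, it reduces to something like this (the tools and dispatcher here are invented for illustration; MCP standardizes the discovery and wire format, not this toy logic):

```python
# Minimal tool-calling loop: the model emits a structured call, the host
# executes it against registered tools and returns the result.
# Tool names and schemas here are hypothetical.

TOOLS = {
    "create_event": lambda args: f"event '{args['title']}' on {args['date']}",
    "search_files": lambda args: [f"match for {args['query']}"],
}

def dispatch(tool_call: dict):
    """Execute one model-requested tool call against registered tools."""
    name, args = tool_call["name"], tool_call["arguments"]
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](args)

# In a real system this dict would be parsed from the model's response
# (e.g. an MCP tool call); here it is hardcoded.
call = {"name": "create_event",
        "arguments": {"title": "dentist", "date": "2025-06-01"}}
print(dispatch(call))  # event 'dentist' on 2025-06-01
```

Everything beyond "text generation" in these products is some variant of this loop: the model decides *what* to do, deterministic host code actually does it.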
ajsnigrutin · 3h ago
> What matters for the future is what killer apps can be built on this commodity (tokens)?
AI chatbots in pdf viewers!
Oh wait.. we already have that and it's useless.
blitzar · 2h ago
Hi, I'm Clippy! It looks like you are trying to read a document. Do you need assistance?
qcnguy · 3h ago
Basic point seems sound, analysis slightly off.
> Interestingly, Intel still reached new highs for a decade after missing mobile before it all collapsed
That's because their problems weren't due to missing mobile but rather taking too much risk on a fab tech transition that they then fumbled. This put them permanently behind and they were unable to catch up.
> Amazon’s AWS is predicated on the idea of commoditized infrastructure at scale where price is the priority
Since when does AWS compete on price? AWS is predicated on the idea of many proprietary services running on commodity hardware, and charging high prices for the privilege of not spending time on sysadmin work.
gmays · 2h ago
OP here, good points.
Your comment on Intel is correct, but it's also true that TSMC could invest billions into advanced fabs because Apple gave them a huge guaranteed demand base. Intel didn’t have the same economic flywheel since PCs/servers were flat or declining.
That's a good clarification on Amazon, running on commodity hardware with competitive pricing != competing on price alone. It would have been better to clarify this difference when pointing out that they're trying the same commodity approach in AI.
trenchpilgrim · 3h ago
> But with AI, quality/performance is the priority. That’s why we see so many new AI startups and use cases lighting up as AI improves and finally becomes ‘good enough’ for new use cases.
But also, we are seeing models leapfrog each other; the best model this week is often not the best model next month and certainly not next quarter. I still see merit to the idea that cloud providers should focus on being the place where companies put their data, and make it easy for companies to bring models and tools to that data.
I agree with the article that Apple is probably cooked if they continue on their current path for another couple of years.
resters · 3h ago
AI will become a commodity -- see DeepSeek. The writing is on the wall, and the open-source models and practices needed to get to Claude 4.1 or GPT-5 levels are all in the public domain.
Apple could simply sell AirPods with an AI voice interface; the UI/UX and solution space are evolving too quickly for a company like Apple to sensibly try to define how it will use the technology yet.
See Google and Microsoft's failed integrations of AI with search for an example of how it should not be done.
The more I use AI the more I think I don't need a laptop and would settle for a comfortable VR setup as I do far less typing of symbols.
No comments yet
stego-tech · 3h ago
Again, this sort of booster content speaks of Generative AI dominance as an inevitability despite mounting data that, and I really hate typing these words, Amazon and Apple are correct in their strategy.
This blog post only really makes sense if you wholesale buy into the idea that Generative AI is going to do everything its most ardent boosters claim it will, on the timeline they say it will, none of which has really borne out as true thus far. For anything less, Apple and Amazon's strategy of commoditizing the models themselves makes sense.
That being said, do I have nitpicks over their respective strategies? You betcha. Amazon is so focused on an Apple-like walled-garden approach for enterprise compute that they run the risk of being caught up in shifting tides of geopolitics and nationalism along with increased attention on cost reductions. Apple, on the other hand, is far too exposed with the iPhone being the center of their ecosystem that a fundamentally new device could easily render them the next Nokia.
Between the two, Apple at least seems keen on innovating at its own pace and hoovering up competitors that hit the true moonshots - not that I expect that strategy to keep working as antitrust scrutiny mounts against Big Tech. AWS, by comparison, is seemingly taking the Microsoft and VMware approach of just adding yet another service icon to the catalog and letting customers figure it out for themselves, which, well, just go ask the old guard of Big Tech how that strategy worked out when everyone ran into the public cloud.
Neither strategy has long (or even mid) term viability, but AI almost certainly won’t be the tech that pierces their moat.
malfist · 2h ago
You want to hear something crazy? Amazon is the biggest capex spender on AI in the world. Amazon is on track to spend over $155B this year alone on AI build-out. They've already spent $100B.
stego-tech · 2h ago
Yup, all in support of others’ models and the commoditization model of AI deployment. The benefit of this approach is that if AI is a bubble, they have a glut of hardware they can rent at discounts to other customers and startups who can reach for moonshots.
It’s a smart approach. Get the CAPEX done while there’s appetite and political will for it.
blitzar · 3h ago
100% disagree - my biggest bet for a long time has been that Apple and Google will win by far the most on AI.
The phone is where the data is, and AI will find its usefulness in that data (and interface).
gmays · 3h ago
I'm somewhat bullish on Google as well, they have the opportunity if they can figure out the product (which they are bad at) and they have the edge in cloud with their models + TPUs.
But your comment about the phone could have been about horses, or the notepad or any other technology paradigm we were used to in the past. Maybe it'll take a decade for the 'perfect' AI form factor to emerge, but it's unlikely to remain unchanged.
blitzar · 3h ago
Yes brain chips and implants will be the next form factor. Until then the slab of battery and screen in your pocket is going to be present (and probably remain for a while even after we get brain implants).
mg · 3h ago
Apple does not need to develop AI software itself in order to remain successful.
We more and more turn into cyborgs, wearing all kinds of processors and sensors on our bodies. And we need this hardware and the software that runs on it to be trustworthy.
Being the hardware producer and gatekeeper for trustworthy software to run on it is probably big enough of a market for Apple.
Even more so if their business of getting 15% to 30% of the revenue apps generate on the iPhone continues.
It has yet to be seen what type of company becomes most valuable in the AI stack. It could very well be that it does not operate LLMs. Similar to how the most valuable company in the hospitality industry does not operate hotels. It just operates a website on which you book hotels.
reactordev · 3h ago
Not to mention developer hardware. As AI eats more of the world, more and more people will become developers, and they'll need machines capable of at least running quantized models. While it's not as good as having your very own A100 or H100, the M4 Max is above everything else on the desktop save an RTX 5090 with a beastly Ryzen.
It's also an opportunity to disrupt: build hardware specifically for AI tasks and reduce it down to just an ASIC.
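To put a rough number on "capable of at least running quantized models": weight-only memory is just parameter count times bits per weight. This back-of-the-envelope sketch ignores KV-cache and activation overhead, which real runtimes add on top:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight-only memory footprint of a quantized model."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

# A 70B model at 4-bit needs ~35 GB just for weights, which fits in a
# 64-128 GB unified-memory laptop but not a 24 GB discrete GPU; an 8B
# model at 4-bit (~4 GB) runs almost anywhere.
print(round(model_memory_gb(70, 4)))  # -> 35
print(round(model_memory_gb(8, 4)))   # -> 4
```

This is why unified memory is the selling point: the model has to fit in whatever the GPU can address.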
kcb · 3h ago
The vast majority by an order of magnitude of those developers will be using AI models on a server somewhere.
reactordev · 1h ago
Maybe, maybe not. I don't want to live in a world where I can't develop software without the support of a massive third-party corporation. I think local model inference is going to pick up. Will it ever compete with the clusters? No, but it's good enough for a solo developer to get work done.
I could be wrong and we could be seeing the ladders being pulled up now, moats being filled with alligators and sharks, and soon we'll have no choice but to choose between providers. I hope we can keep the hacking/tinkering culture alive when we can no longer run things ourselves.
drob518 · 3h ago
Apple and Amazon have huge resources and a large installed base to which they can readily sell almost anything. Couple that with a narrow or even non-existent moat for AI and I’m thinking Apple and Amazon will do just fine. In fact, I think the reason that Apple is delaying is that they are finding that AI just isn’t what it’s being sold as by the market and they’re afraid of getting a black eye like OpenAI just did with GPT5. Worst case, they spend a few billion dollars and buy someone.
dfedbeef · 3h ago
I think Apple will come out looking great because they are less likely to release things that suck. It seems smarter to hoard cash right now rather than spending it on hardware for an unproven workload.
garbthetill · 2h ago
Yep, Apple is great at this. There have been times when random Android phones were ahead feature-wise, but this doesn't cause Apple to lose its customer base, even when Apple releases the same feature 2-4 years later. The same thing happened with smartwatches: Apple entered that industry late, yet the Apple Watch now dominates the market against well-established players.
9rx · 1h ago
Even the iPhone itself. They played around a bit with the Newton and Rokr E1, sure, but for the most part they sat back and let everyone else make all the mistakes first.
cameldrv · 2h ago
We will see. One thing with AI is that access to data/APIs is key. There’s been a long term trend where services are locking down their APIs so that you can only interact with them through their native UI, either phone or web, sometimes just phone. Many of the most promising uses of AI are to automate around various annoying things service providers do to drive their own revenue, and the service providers aren’t taking this lying down.
Apple is in a very favorable position with its control of arguably the most important platform of all. They can force app developers to allow Apple AI to automate their apps, and prevent other AI providers from doing the same, and they make a strong privacy argument about why this is necessary.
hacker_yacker · 3h ago
Just wrong. Apple just acquired 7 AI companies (talent pool), and Amazon's Bedrock has been solid and pushing the envelope for years.
The problem is with your expectations. Apple is no longer winning all the time, so relatively, it feels like it's losing. And Amazon is the quiet beast lurking, it's just doing a poor job at marketing.
juujian · 2h ago
Apple made the mistake of making a grand announcement and then not following through. They may have the talent, and even the hardware, but the product is still missing. OpenAI's open-weight model runs great on my iMac; when is Apple going to make a move?
kemayo · 3h ago
> AI unlocks what seems to be the future: dynamic, context-dependent generative UIs or something similar. Why couldn’t my watch and glasses be everything I need?
Voice input isn't suitable for many cases, and physical input seems generally superior to AR -- I've used a Vision Pro, and it's very impressive, but it's nowhere near the input-performance of a touchscreen or a mouse and keyboard. (To its credit: it's not aiming for that.)
Unless the argument is that you will never have to be precise, or do something that you don't want everyone within earshot to know about?
Also, a "dynamic, context-dependent generative UI" sounds like another way to describe a UI that changes every time you use it depending on subtle qualities of exactly how you reached it this time, preventing you from ever building up any kind of muscle-memory around using it.
clickety_clack · 2h ago
This article doesn’t really seem to have any insight at all into Apple or Amazon’s AI strategies. The result that “Apple and Amazon will miss AI” seems to be a premise of the argument, and the rest of the article doesn’t really seem to even corroborate that.
Are phones and AI on the opposite ends of some axis where success in one precludes success in the other? Does the use of AI reduce the use of compute and data? I have my own opinions on the topic, but beyond the eye-catching title this article didn’t inform one way or the other.
rickdeckard · 2h ago
I don't see how Apple is that much at risk here.
They continue building a distributed data-acquisition and edge data-processing network with their iPhone sales, where the customer keeps paying for both hardware and traffic.
They constantly take measures to ensure that the data their customers generate is only available to Apple and is not siphoned away by the users themselves.
The moment they finish the orchestration of this distributed worldwide iOS AI-cluster, they will suddenly have both the data and the processing at comparatively low operational cost.
The biggest risk? A paradigm-shift where the Smartphone is no longer the main device (smart glasses being a major risk here), and some other player takes the lead here.
mizzao · 2h ago
> Nvidia owns the platform and OpenAI owns the general consumer interface.
I'd have to disagree with this. There are a number of general consumer interfaces and none of them have much of a moat or brand loyalty.
Also, Apple is in the best position by far to make consumer hardware that can do inference on-device. No other company even comes close.
malshe · 3h ago
> Regardless of what the final form factor ends up being, there’s no doubt spending our lives hunched over staring at a piece of glass in our hand is suboptimal.
As opposed to what exactly? If the author really believes that a watch and glasses are preferred over a smartphone, I don't know what to say. Also, note that the author doesn't know what this new form factor will look like. In that case, what's the point in declaring that Apple and Amazon will miss the boat?
I think a much bigger threat to Apple is a smartphone with an AI driven operating system. But what do I know?
anon191928 · 3h ago
This is mostly true. Sure, Apple can make great CPUs and GPUs and innovate on AirPods, etc. But they are literally so behind on AI. This time they can't make the innovation jump they did with the iPhone, because there is no Steve Jobs as a tech leader.
Apple now only cares about revenue and the retirement of all those who made the iPhone and Macs great. They are rich, so they don't need to innovate big until they end up like Intel is now. But they try creating toys like the Vision Pro, and the self-driving car that was coming for a decade. Just all for the fun of it.
An old company with an old leader and zero hunger for success. The opposite of all the big AI startups today.
parineum · 3h ago
> But they are literally so behind on AI
Which AI products are Apple lacking that put them so far behind?
anon191928 · 2h ago
Coding, on-device photo editing vs. Samsung, and voice from any AI company. Siri is 10 years behind them. Do you want more?
bayindirh · 3h ago
Apple will integrate AI in a non-user-facing fashion. They'll weave it underneath the OS and the ecosystem, like how they locally update Memories and do other stuff on-device.
What we're gonna see from Apple, IMHO, is a horde of smaller models working on different tasks on the device itself, plus heavier stuff running on "Private Cloud Compute" instances.
We'll not see Apple AI beyond a sticker; it'll just try to do its work in the background.
Oh, people balk at auto-word-complete in the latest macOS, but I find it very useful. It saves a little time here and there, and it adds up.
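The split being described, small on-device models for routine tasks with heavier work escalated to a cloud instance, could be sketched like this (task names, model names, and the routing rule are all invented for illustration; this is not any real Apple API):

```python
# Hypothetical mapping of routine tasks to small on-device models.
ON_DEVICE_MODELS = {
    "summarize_notification": "small-summarizer",
    "autocomplete": "tiny-lm",
    "classify_photo": "vision-mini",
}

def route(task: str) -> str:
    """Return where a task should run: on-device if a small model
    covers it, otherwise a (hypothetical) private cloud compute node."""
    if task in ON_DEVICE_MODELS:
        return f"on-device:{ON_DEVICE_MODELS[task]}"
    return "private-cloud-compute"

print(route("autocomplete"))      # -> on-device:tiny-lm
print(route("long_form_answer"))  # -> private-cloud-compute
```

The appeal of this design is that the common, latency-sensitive tasks never leave the device, while only the rare heavy requests pay the network round-trip.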
nolok · 3h ago
Amazon is the only company I regularly use that has implemented AI as a front that users very often interact with, and it's not a total disaster. Their customer service AI (when you have an issue) actually works well, and in many cases I didn't need to escalate to a human to get something sorted. YMMV, of course, but I hate about 99.9999999% of "AI" customer service, and the recent "smart" ones are somehow way worse than the tree-based ones of old.
ninetyninenine · 3h ago
Have you tried the new Alexa?
An LLM controls all my house lights right now.
nolok · 1h ago
It's not available here in Europe, but I have homeassistant coupled with Gemini and it works rather well.
taneq · 2h ago
Huh, interesting. I periodically ask Siri "hey siri, are you actually smart yet?" (or variants on this depending on how I feel). So far I've always received a polite "I'm sorry, I don't know how to do that". No worries, you'll get there.
It's funny how the unexpected is way more impressive. I tried out the voice commands on my new car (BYD) and after it correctly interpreted and implemented my request, I politely said "thank you!" (as I always do, it's important to set a good example) and the car responded "you're welcome!". 100% could just have been a lucky setup for a scripted response... but...
OtherShrezzing · 2h ago
> AI unlocks what seems to be the future: dynamic, context-dependent generative UIs or something similar. Why couldn’t my watch and glasses be everything I need?
If you earnestly believed that dynamic generative UIs are the future, surely you'd be betting on Apple.. The company with fully integrated hardware already capable of cost effectively running near-frontier generative models on the end user hardware.
grishka · 3h ago
Except it was always clear that smartphones are here to stay, while AI is an unsustainable fad that will go out of fashion, the sooner the better.
dcre · 3h ago
Totally unwarranted level of confidence here about both claims.
jpalomaki · 3h ago
The most useful AI feature for iOS I can imagine is an actually intelligent Siri, which could use the other applications installed on the phone as tools to accomplish tasks.
Apps would expose their functionality to this Smart Siri. Maybe there's already something like this with Shortcuts. Maybe I could give Claude Code-style blanket permissions for some actions; others I would like to review.
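That tool-exposure-plus-permissions idea could look roughly like this (a pure sketch; every name here is invented, and a real implementation would go through something like Apple's App Intents rather than a Python registry):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AppIntent:
    name: str
    handler: Callable[..., str]
    permission: str  # "auto" = blanket-approved, "review" = confirm each call

REGISTRY: dict[str, AppIntent] = {}

def expose(name: str, permission: str = "review"):
    """Decorator an app uses to register a function as an assistant tool."""
    def wrap(fn):
        REGISTRY[name] = AppIntent(name, fn, permission)
        return fn
    return wrap

@expose("timer.start", permission="auto")  # blanket permission granted
def start_timer(minutes: int) -> str:
    return f"timer set for {minutes} min"

@expose("messages.send")  # defaults to "review": user confirms each call
def send_message(to: str, body: str) -> str:
    return f"sent to {to}"

def assistant_call(name: str, confirm: Callable[[str], bool], **kwargs) -> str:
    """The assistant invokes a tool, asking the user first when required."""
    intent = REGISTRY[name]
    if intent.permission == "review" and not confirm(name):
        return "blocked by user"
    return intent.handler(**kwargs)

print(assistant_call("timer.start", confirm=lambda n: False, minutes=5))
# -> timer set for 5 min (auto-permitted, so confirm is never consulted)
```

The Claude Code comparison holds up: low-risk actions get blanket approval, anything with side effects on other people (sending a message) stays behind a confirmation prompt.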
eviks · 3h ago
> AI unlocks what seems to be the future: dynamic, context-dependent generative UIs or something similar. Why couldn’t my watch and glasses be everything I need?
Because this is not a science fiction future, but a corporate one, where neither do those magical UIs exist, nor do you have enough power in wearables for them to be everything you need?
an0malous · 3h ago
If Elon/xAI can catch up within a year, I doubt it would take Apple and Amazon longer
picafrost · 3h ago
The gold rush metaphor loves to highlight the shovel sellers, but the actually valuable commodity (gold) still needs a broker. I cannot see any other hardware vendor being better positioned than Apple for consumer facing AI.
mandeepj · 3h ago
> Apple and Amazon will miss AI like Intel missed mobile.
I think they'll be fine. Amazon is an investor in Anthropic and Apple has an agreement with OpenAI. I'd consider that inorganic growth.
camgunz · 3h ago
People are gonna pretty quickly quit paying for AI--we're well into the "let me see what everyone's talking about" phase and that'll wear off soon. The price is already skyrocketing way ahead of quality or utility, so that'll accelerate the decline. Businesses incorporating AI into their products will scale that back as costs increase, or as they replace the most commonly used functionality with purpose-built code.
The real question is how do we continue the grift? AI's a huge, economy-sustaining bubble, and there's currently no off-ramp. My guess is we'll rebrand ML: it's basically AI, it actually works, and it can use video cards.
AI is a great feature funnel in terms of like, "what workflows are people dumping into AI that we can write purpose-built code for", but it has to transition from revenue generator to loss leader. The enormity of the bubble has made this very difficult, but I have faith in us.
hbarka · 3h ago
Apple’s AI opportunity is in AirPods if they can be the one to deliver perfect full-duplex or half-duplex voice communication choices with an LLM.
qzw · 3h ago
Or will they miss it like they missed self-driving cars? It remains to be seen.
dfedbeef · 3h ago
I think both realize that nobody wants to buy an Apple or Amazon car. They want a Mercedes that drives itself. Mercedes is going to release one and Tesla is going to go bankrupt because of their ongoing negligence and disregard for safety. There's going to be a tidal wave of civil litigation at some point in the next 10 years.
Apple has "missed" multiple trends according to bloggers, and a few years later they end up releasing something that reaches the top of the market. Remember that they didn't make the first MP3 players, tablets, headphones, smartphones, or smartwatches... they lagged in those markets for years and have been at the top of them for decades since.
atonse · 1h ago
I think their huge mistake was announcing a bunch of stuff (Apple Intelligence) that ended up being vaporware.
My guess is that they're also adapting to the changing ecosystem, and since they move very slowly, the trends seem archaic (like Apple Intelligence featuring image and summary generation 12-18 months after it was found to be novel by anyone).
I'm hoping that they lean in hard on their intents API forming a sort of "MCP" like ecosystem. That will be very powerful, and worth spending extra time perfecting.
micromacrofoot · 1h ago
Yeah they announced way too early, they probably didn't expect so much movement in the market (and to be fair it doesn't seem like it needs to be so volatile, GPT5's lackluster release feels like evidence of this)
leaks seem to indicate they're working on some home AI assistant, which kind of lends to their typical lagging timeline with the goal of quality
dfedbeef · 3h ago
1000%
They make the good versions of things when the technology gets to a point that it is genuinely useful for users.
hbarka · 3h ago
AWS Kiro came out with a bang and now it’s a waitlisted whimper.
qwertytyyuu · 2h ago
Amazon servers run LLMs, they’ll be fine
jdefr89 · 3h ago
It's sad that content like this makes it to the front page. AI had been developed and used at both companies long before LLM transformers became such a hit. The author clearly doesn't have even a basic understanding of what AI is... AI is a giant umbrella term. The hype is essentially all about Transformer-based models, and I think time will show it's a hype bubble they don't need to be part of. I will never understand how people with so little technical knowledge of a subject can make grandiose claims like "Apple is losing out on AI!" and be taken seriously. Every company is high on their own damn supply, pushing this AI nonsense. I don't need language models shoved in my face by yet another company. I respect Apple's approach far more than everyone else trying to ride the LLM bandwagon.
DSingularity · 3h ago
I don't think anybody other than Google and Apple has a chance to win AI, because they are the only companies with both the user base and the capacity to design and produce hardware for models.
Facebook has a data advantage and a user base, but at the end of the day they will always need to make Nvidia or a cloud provider rich to run/train their models.
OrvalWintermute · 2h ago
> Amazon’s AWS is predicated on the idea of commoditized infrastructure at scale where price is the priority. But with AI, quality/performance is the priority.
Cost is not the priority with AWS. To quote my collaborator: "I just scaled up to 600 PB for our event."
When I think of AWS, I think speed, scale-up, scale-out, capacity, and dynamic flexibility are the key things.
AWS is not the "cheapo commodity option", nor is Azure
bgwalter · 3h ago
You could also say "like they missed virtual reality". Apple is selling the shovels with hardware that can be used to run home models.
I don't think they are missing out on anything. Everyone wants products from Apple or Amazon, only some power users and managers want "AI".
ryandrake · 3h ago
I'm sure they'll be fine. Apple and Amazon also "missed" blockchain, and nobody cares. You don't have to bet the house on every technology that comes along in order to be successful.
mrcwinn · 2h ago
I think this analysis is directionally right but the supportive arguments are quite weak. Maybe the iPhone is the wrong form factor, but do we really doubt Apple's ability to create a new hardware device and tightly integrate software?
The deeper issue, in my view, is that Apple is violating its own philosophical belief, articulated by Steve Jobs himself: that Apple should always own the primary technology behind its products (i.e., multi-touch, click-wheel). This is central to Apple's "we need to make it ourselves" approach.
Camera lenses are commodities. AI models are foundational. Apple's own executive leadership likened models to the internet, and said, well surely we wouldn't try to build and own the internet! This misplaces AI as infrastructure when in fact it's foundational to building and serving useful applications.
They further see "chat" (and by extension, voice) as an app, but I think it's more like a foundational user interface. And Apple's certainly always owned the user interface.
When Siri was first announced, there was excitement that voice might be the next paradigm shift in user interface. Partly because Siri has been so bad for a decade now, and partly because people didn't feel like talking to their screens, Apple may have learned a very unhelpful lesson: that Siri is just a feature, not a user interface. In this age, though, chat and voice are more than features, and yet Apple doesn't own this either.
Apple should not buy Perplexity. Perplexity is smoke and mirrors, and there's nothing defensible about its business. Apple cannot get the talent and the infrastructure to catch up on models.
So what then?
OpenAI is not for sale. Anthropic is likely not for sale, but even if it were, Apple wouldn't buy it: Anthropic is very risky to Apple's profit margin profile and Apple can't unlock Anthropic's bottleneck (capacity).
In fact, to Apple's advantage, why not let the VCs and competitors like Microsoft and Amazon and Google light their money on fire while this industry takes shape, provided you have an option in the end to sell a product to the consumer?
The best option, in my view, is Search Deal Redux: partner deeply with Google and Gemini. Google very obviously wants as much distribution as possible, and Apple has more hardware distribution than any company in history.
Partnership is the only path because, yes, Apple missed this one badly. There is one area where I agree with Tim Cook, though: being late doesn't guarantee you lose, even in AI.
9rx · 1h ago
> I think it's more like a foundational user interface.
I don't. The foundational interface hasn't been created yet. Let's be honest, chat isn't great. It is the best we have right now to leverage the technology, yes, but if it is our future — that we cannot conceive of anything better, we've failed miserably. We're still in the "command-line age". No doubt many computing pioneers thought that the command-line too was foundational, but nowadays most users will never touch it. In hindsight, it really wasn't important — at best a niche product for power users.
> Apple may have learned a very unhelpful lesson: that Siri is just a feature, not a user interface.
That is the right lesson, though. Chat sucks; be that through voice, typing, or anything else. People will absolutely put up with it absent better options, but as soon as there is a better option nobody prefers chat. This is where Apple has a huge opportunity to deliver the foundational interface. Metaphorically speaking, they don't need to deliver the internet at all, they just need to deliver the web browser. If they miss that boat, then perhaps there is room for concern, but who knows what they have going on in secret?
dvfjsdhgfv · 2h ago
A lot of assumptions and confident statements with nothing to back it up.
ngcc_hk · 2h ago
Did Intel lose mobile? Or, a better question: has Arm lost mobile? The key is that today the most profitable part is not Arm, even though it has the ISA.
Intel lost the hardware fight. But it was losing all along, because the most profitable part is software, or whoever controls the ecosystem: Microsoft and Apple, not Intel.
Likewise, it's not just that AMD cannot beat Nvidia; the AI ecosystem (not the gaming one) is what was lost.
Now, what ecosystem have Apple and Amazon lost... can they avoid Nvidia controlling their AI ecosystem? I doubt it very much.
rossdavidh · 3h ago
Or, they will miss AI like Toyota missed electric. Oh, wait...
cft · 3h ago
One way out for Apple is to facilitate running small, local, specialized models on its hardware, like MacBooks - with unlimited free use for things like basic writing, translation, spreadsheets, and maybe coding.
The MacBook could come with some models, and the brew ecosystem would supply others.
j45 · 3h ago
Apple has performant/power efficient hardware for AI few others do. Can't totally count them out for that reason.
mihaaly · 3h ago
So what? Like there is no computing beyond gigantic screens held with both hands with weird look and 'features', or there will be no services to provide but AI?
Maybe they won't be so big in monetary value anymore? So what? There are endless states between huge and nothing, actually most organizations live there and most serve people well.
> AI unlocks what seems to be the future: dynamic, context-dependent generative UIs or something similar. Why couldn’t my watch and glasses be everything I need?
> The other problem is that at its core, AI is two things: 1) software and 2) extremely fast-moving/evolving, two things Apple is bad at.
Idk, my MacBook Pro is pretty great and runs well. "Fast moving" here implies that as soon as you release something there's a big paradigm shift or change that means you need to move even faster to catch up, but I don't think that's the case. And where it is the case, the new software (LLMs) still needs to be distributed to end users and devices, so a company like Apple can pay money and build functionality to be the distributor of the latest models, and it doesn't really matter how fast they're created. Apple's real threat is a category shift in devices, which AI may or may not necessarily be part of.
I'm less certain about Amazon but unless (insert AI company) wants to take on all the business risk of hosting governments and corporations and hospitals on a cloud platform I think Amazon can just publish their own models, buy someone else's, or integrate with multiple leading AI model publishers.
Spooky23 · 3h ago
I think the bet here is that AI is like Dropbox — a feature. Operating globally, these models are going to be a regulatory tar pit. The industry hype train is 100% reliant on courts ignoring the law - that didn't work out well for Napster.
That makes the "category shift" difficult for Apple to execute well and difficult for competitors to gun for them. Microsoft is even worse off there because the PC OEMs relied on dying companies like Intel to deliver engineering for innovative things.
AWS, Azure, and GCP are doing the same stuff in different flavors. Google and Microsoft approach human facing stuff differently because they own collaboration platforms.
Apple and Microsoft are both flailing at the device level. Apple is ahead there as at least I can tell you what they are not doing well. Microsoft’s approach is so incoherent that it struggles to tell you what they are doing, period.
Apple could turn everything around overnight by quietly re-enabling the jailbreak community for a few years, or restoring the 2022 Hypervisor API entitlement for arbitrary VMs. Hopefully this does not have to wait for leadership changes.
Either of those actions would take the shackles off Apple's underutilized hardware and frustrated developers. The resulting innovations could be sherlocked back into new OS APIs under Apple guardrails, whence they could generate revenue via App Store software. Then retire the jailbreaks and silently thank OutsideJobs for uncredited contributions to Apple upstream.
At present, the only industry participants maximizing usage of Apple hardware are zero-day hoarders. Meanwhile, every passing day allows Qualcomm, Nvidia and Arm-generic/Mediatek to improve their nascent PC hw+OS stacks, whittling away at Apple's shrinking hardware lead.
Can you elaborate? I don't see what you're seeing.
What is the story with Copilot as an on-device feature of Windows? How does that relate to an "AI PC"? In my business, what does Copilot (on the PC) do? How about Copilot Chat? How do they both relate to Copilot for Office 365?
Answer: I have no fucking idea. It's a big soup of stuff with the same name that dumps everything in a bowl that the company makes. In a business, you're going to make product decisions within your enterprise that fundamentally change the products, based on your privacy and security needs and what countries you are operating in.
Apple has articulated a vision/framework for what they are delivering on device, with outside 1st party help and with 3rd parties. They’ve laid out how they are accessing your proprietary data. They have also failed to deliver.
It’s complicated and difficult - I say fail in the “fail fast” sense, not as an insult. Where are the line(s) between Excel as a component of Windows, as a web service and as a node on the office graph?
If I need AI help integrated with the product to write Excel formulas, I think the way to get that from Microsoft is with Copilot for Office 365, which also accesses all of my data on the graph and can potentially leak stuff with web grounding. (Which for companies means you need to fix SharePoint governance and do lots of risk assessment #godbless)
I just go to ChatGPT.
Amazon is capturing massive amounts of the value in AI via AWS. They'll be fine. But for real I don't see a reason why Alexa is not using a good LLM now. Could just be infinitely better...
Large language models are too slow to use as real-time voice assistants. ChatGPT voice only barely works because they have to use a much worse (but faster) model to do it.
If they could gate it behind a "start chat session" or something, I would be more excited. Instead they're cannibalizing how well the basics ("play radio / start timer / read from Audible") worked for the longest time; everything they do that causes friction there is frustrating in the extreme.
There's absolutely no reason why plugging in an LLM would break any of those features, and asking generic questions would be 100x better than "searching the web" for a shitty Quora or Alexa Answers result.
I also don't typically ask generic questions. Ever, that I can remember.
Again, I don't want to dislike the idea. If people are really getting value from it, I would like them to continue to do so. But it seems to be a more expensive way to service use cases that were working just fine.
Alexa would be "a higher order infinity" better if it wasn't spying on you ...
This is exactly what they've done: They offer SageMaker (and similar capabilities) for hosting smaller models that fit into a single instance GPU, and they have Bedrock that hosts a metric crap-ton of AWS and third party models. Many of the model architectures are supported for hosting fine-tuned versions.
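The one-invoke-path-across-many-model-families idea is the Bedrock pitch, and it can be sketched from the caller's side. The model IDs below are illustrative placeholders (not a live catalog), and the actual network call via boto3 is only shown in a comment:

```python
import json

# Sketch: one request-builder covering two hosted model families behind a
# Bedrock-style InvokeModel interface. Model IDs here are illustrative.
def build_invoke_request(model_id: str, prompt: str) -> tuple[str, str]:
    """Return (modelId, JSON body) for a Bedrock-style InvokeModel call."""
    if model_id.startswith("anthropic."):
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 256,
            "messages": [{"role": "user", "content": prompt}],
        }
    else:  # assume a Titan-style text model body
        body = {
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": 256},
        }
    return model_id, json.dumps(body)

# The real call would go through boto3's bedrock-runtime client, e.g.:
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(modelId=model_id, body=body)
model_id, body = build_invoke_request("anthropic.claude-example", "Hello")
```

Swapping providers then mostly means changing the `model_id` string, which is roughly the commoditization argument in code form.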
I think AI could be commoditized. Look at DeepSeek stealing OpenAI's model. Look at the competitive performance between Claude, ChatGPT, Grok, and Gemini. Look at open weight models, like Llama.
Commoditized AI needs to be used via a device. The post argues that other devices, like watches or smart glasses, could be better positioned to use AI. But...your point stands. Given Apple's success with hardware, I wouldn't bet against them making competitive wearables.
Hardware is hard. It's expensive to get wrong. It seems like a hardware company would be better positioned to build hardware than an AI company. Especially when you can steal the AI company's model.
Supply chains, battery optimization, etc. are all hard-won battles. But AI companies have had their models stolen in months.
If OpenAI really believed models would remain differentiated then why venture into hardware at all?
They could manage years of AI-missteps while cultivating their AI "marketplace", which allows the user to select a RevShare'd third party AI if (and only if) Apple cannot serve the request.
It would keep them afloat in the AI-space no matter how far they are behind, as long as the iPhone remains the dominant consumer mobile device.
The only risks are a paradigm shift in mobile devices, and the EU which clearly noticed that they operate multiple uneven digital markets within their ecosystem...
What if [Japan|EU|US DOJ|South Korea] passes a law preventing OEMs from claiming user data as their property? If Apple really tries to go down the road of squeezing pre-juiced lemons like this, I think they're going to be called out for stifling competition and real innovation.
> https://www.apple.com/watch/
(I am mostly going to comment on the Watch issue, as I have one.)
Apple makes a watch, yes. But is it an AI watch? Will they manage to make it become one? Intel made all kinds of chips. Intel's chips even could be used for mobile devices... only, Intel never (even still, to today) made a great mobile chip.
I have an Apple Watch--and AirPods Pro, which connect directly to it--with a cellular plan. I already found how few things I can do with my Watch kind of pathetic, given that I would think the vast majority of the things I want to do could be done with a device like my watch; but, in a world with AI, where voice mode finally becomes compelling enough to be willing to use, it just feels insane.
I mean, I can't even get access to YouTube Music on just my watch. I can use Apple's Music--so you know this hardware is capable of doing it--but a lot of the content I listen to (which isn't even always "Music": you can also access podcasts) is on YouTube. Somehow, the Apple Watch version of YouTube access requires me to have my phone nearby?! I can't imagine Google wanted that: I think that's a limitation of the application model (which is notoriously limited). If I could access YouTube Music on my watch, I would've barely ever needed my iPhone around.
But like, now, I spend a lot of time using ChatGPT, and I really like its advanced voice mode... it is a new reason to use my iPhone, but is a feature that would clearly be amazing with just the watch: hell... I can even use it to browse the web? With a tiny bit of work, I could have a voice interface for everything I do (aka, the dream of Siri long gone past).
But, I can't even access the thing that already works great, today, with just my watch. What's the deal? Is it that OpenAI really doesn't want me to do that? These two companies have a partnership over a bunch of things--my ChatGPT account credentials are even something embedded into my iPhone settings--so I'd think Apple would be hungry for this to happen, and should've asked them, thrown it in as a term, or even done the work of integrating it for them (as they have in the past for Google's services).
This feels to me like Apple has a way they intend me to use the watch, and "you don't need to ever have your phone with you" is not something they want to achieve: if they add functionality that allows the Watch to replace an iPhone, they might lose some usage of iPhones, and that probably sounds terrifying (in the same way they seem adamant that an iPad can't ever truly compete with a MacBook, even if it is only like two trivial features away).
Apple is focusing on a privacy-first approach with smaller models that run locally. Amazon is tying its models to an AWS subscription and incentivizing use by offering discounts, making it cheaper to use their models over GPT, Opus, etc.
It is probably cheaper to simply integrate with OpenAI or Anthropic or whoever might unseat them in the future, than spend $50B on training a model. Not only is it cheaper, but it also gives them the flexibility to ride the wave of popularity, without ceding hardware or software sales.
This is the real death knell people should focus on. Apple buried their AI R&D to rush complete flops like Vision Pro out the door. Now that the dust has settled, the opportunity cost of these hardware ventures was clearly a mistake. Apple had more than a decade to sharpen their knives and prepare for war with Nvidia, and now they're missing out on Nvidia's share of the datacenter market. Adding insult to injury, they're probably also ~10 years behind SOTA in the industry unless they hire well-paid veterans at great expense.
Apple's chronic disdain for unprofitable products, combined with boneheaded ambition, will be the death of them. They cannot obviate real innovation and competition while dropping nothingburger software and hardware products clearly intended to bilk an unconscious userbase.
And also to hop off without any penalty if/when the wave collapses.
Edit: Yes it exists, seems to be built off qwen2.5 coder. Not sure it proves the point I thought it was, but diffusion LLMs still seem neat
Source?
So is everyone else, to be fair. Chat is a horrible way to interact with computers — and even if we accept worse is better its only viable future is to include ads in the responses. That isn't a game Apple is going to want to play. They are a hardware company.
More likely someday we'll get the "iPhone moment" when we realize all previous efforts were misguided. Can Apple rise up then? That remains to be seen, but it will likely be someone unexpected. Look at any successful business venture and the eventual "winner" is usually someone who sat back and watched all the mistakes be made first.
Chat is like the command line, but with easier syntax. This makes it usable by an order of magnitude more people.
Entertainment tasks lend themselves well to GUI type interfaces. Information retrieval and manipulation tasks will probably be better with chat type interfaces. Command and control are also better with chat or voice (beyond the 4-6 most common controls that can be displayed on a GUI).
I kinda disagree with this analogy.
The command line is precise, concise, and opaque. If you know the right incantations, you can do some really powerful things really quickly. Some people understand the rules behind it, and so can be incredibly efficient with it. Most don't, though.
Chat with LLMs is fuzzy, slow-and-iterative... and differently opaque. You don't need to know how the system works, but you can probably approach something powerful if you accept a certain amount of saying "close, but don't delete files that end in y".
The "differently-opaque" for LLM chatbots comes in you needing to ultimately trust that the system is going to get it right based on what you said. The command line will do exactly what you told it to, if you know enough to understand what you told it to. The chatbot will do... something that's probably related to what you told it to, and might be what it did last time you asked for the same thing, or might not.
For a lot of people the chatbot experience is undeniably better, or at least lets them attempt things they'd never have even approached with the raw command line.
Exactly. Nobody really wants to use the command-line as the primary mode of computing; even the experts who know how to use it well. People will accept it when there is no better tool for the job, but it is not going to become the preferred way to use computers again no matter how much easier it is to use this time. We didn't move away from the command-line simply because it required some specialized knowledge to use.
Chatting with LLMs looks pretty good right now because we haven't yet figured out a better way, but there is no reason to think we won't figure out a better way. Almost certainly people will revert to chat for certain tasks, like people still use the command-line even today, but it won't be the primary mode of computing like the current crop of services are betting on. This technology is much too valuable for it to stay locked in shitty chat clients (and especially shitty chat clients serving advertisements, which is the inevitable future for these businesses betting on chat — they can't keep haemorrhaging money forever and individuals won't pay enough for a software service).
Why? We interact with people via chat when possible. It seems pretty clear that's humanity's preferred interaction model.
We don't know what is better for this technology yet, so it stands to reason that we reverted to the lowest common denominator again, but there is no reason why we will or will want to stay there. Someone is bound to figure out a better way. Maybe even Apple. That business was built on being late to the party. Although, granted, it remains to be seen if that is something it can continue with absent of Jobs.
That's a good supporting argument, but I don't think McDonald's adequately represents more complex discussions.
Mac, iPad and iPhone, eventually Watch and Vision. Which makes sense since Apple is first and foremost a hardware company.
Aws is making strides but in a different area.
But it's complicated because commodities don't carry brand weight, yet there's obviously a brand power law. I (like most other people) use ChatGPT. But for coding I use Claude and a bit of Gemini, etc. depending on the problem. If they were complete commodities, it wouldn't matter much what I used.
A part of the issue here is that while LLMs may be trending toward commodity, "AI" isn't. As more people use AI, they get locked into their habits, memory (customization), ecosystem, etc. And as AI improves if everything I do has less and less to do with the hardware and I care more about everything else, then the hardware (e.g. iPhone) becomes the commodity.
Similar with AWS if data/workflow/memory/lock-in becomes the moat I'll want everything where the rest of my infra is.
You can't get a consumer-grade GPU with enough VRAM to run a large model, but you can do so with MacBooks.
I wonder if doubling down on that and shipping devices that let you run third party AI models locally and privately will be their path.
If only they made their unified memory faster as that seems to be the biggest bottleneck regarding LLMs and their tk/s performance.
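For what it's worth, there's a simple back-of-envelope for why bandwidth is the bottleneck: single-stream decoding streams essentially every weight through memory once per token, so tokens/sec is roughly bandwidth divided by model size. The figures below are ballpark assumptions, not measurements:

```python
# Rough decode-speed estimate, assuming generation is memory-bandwidth-bound:
# each generated token reads (approximately) the whole model from memory.
def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    return bandwidth_gb_s / model_size_gb

# e.g. ~546 GB/s (an assumed Max-class unified-memory figure) against a
# ~40 GB 4-bit-quantized 70B-class model:
tps = est_tokens_per_sec(546, 40)  # roughly 13-14 tokens/sec
```

Doubling bandwidth roughly doubles single-stream tok/s on the same model, which is why the memory system matters more than raw compute here.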
You can if you're willing to trust a modded GPU with leaked firmware from a Chinese backshop
We may care about running LLMs locally, but 99% of consumers don't. They want the easiest/cheapest path, which will always be the cloud models. Spending ~$6k (what my M4 Max cost) every N years since models/HW keep improving to be able to run a somewhat decent model locally just isn't a consumer thing. Nonviable for a consumer hardware business at Apple's scale.
https://github.com/ml-explore/mlx/pull/1983
It is? I haven't seen anything about this.
Of course nobody knows how this will eventually play out. But people without inside information on what these big organizations possess cannot make such predictions.
But Nvidia isn't that far behind, and has already moved to regain some space with their PRO 6000 "workstation" GPUs. You get 96GB of VRAM for ~$7.5k, which is more than a Mac with comparable RAM, but not the $30k you previously had to shell out for top-of-the-line GPUs. So you get a "prosumer" 5090 with a bit more compute and 3x the VRAM, in a computer that can sell for <$10k and beat any Mac at both inference and training, for things that "fit" in that VRAM.
Macs still have the advantage for larger models, tho. The new DGX spark should join that market soon(tm). But they allegedly ran into problems on several fronts. We'll have to wait and see.
looks like there will be several good options "soon"?
People like screens. They like seeing IG pictures, they like scrolling through TikTok, they like seeing pictures/videos their friends/family send/post. I doubt many people will want to see pictures/videos on a watch screen or in glasses (which still have a ways to go).
Also I don't buy the premise of this article that Apple is deciding to take a backseat in AI, they were late to the party but they are trying (and failing it seems) to build foundational models. Reaching for OpenAI/Anthropic/etc while they continue to work on their internal models makes a lot of sense to me. It acknowledges they are behind and need to rely on a third-party but doesn't mean they won't ever use their own models.
Unless something changes (which is absolutely possible) it does seem we are headed towards LLMs being commodities. We will see what OpenAI/Ive end up releasing, but I don't see a near-future where we don't have screens in our pockets, and for that Google and Apple are best placed. With the GPT-5 flop (it's 4.6 at best IMHO) I have fewer concerns about LLMs growing as quickly as predicted.
What matters for the future is what killer apps can be built on this commodity (tokens)?
Right now we've got content/text generation and ... nothing else?
Software operators. LLMs can operate any kind of software (check MCP). If you reduce this to just "text generation", what else is left?
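The "software operators" point can be sketched as a minimal tool-dispatch loop, which is the pattern MCP standardizes. Everything here (the tool, the fake model output) is made up for illustration and is not the actual MCP SDK:

```python
# Software exposes named tools; the model emits structured calls against them.
TOOLS = {
    "set_light": lambda room, on: f"{room} light {'on' if on else 'off'}",
}

def dispatch(call: dict) -> str:
    """Route one structured tool call from the model to real software."""
    return TOOLS[call["name"]](**call["arguments"])

# A model deciding to operate the lights would emit something like:
result = dispatch({"name": "set_light",
                   "arguments": {"room": "kitchen", "on": True}})
```

The point is that the same loop operates any software that registers a tool, which is a much bigger surface than "text generation."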
AI chatbots in pdf viewers!
Oh wait.. we already have that and it's useless.
> Interestingly, Intel still reached new highs for a decade after missing mobile before it all collapsed
That's because their problems weren't due to missing mobile but rather taking too much risk on a fab tech transition that they then fumbled. This put them permanently behind and they were unable to catch up.
> Amazon’s AWS is predicated on the idea of commoditized infrastructure at scale where price is the priority
Since when does AWS compete on price? AWS is predicated on the idea of many proprietary services running on commodity hardware, and charging high prices for the privilege of not spending time on sysadmin work.
Your comment on Intel is correct, but it's also true that TSMC could invest billions into advanced fabs because Apple gave them a huge guaranteed demand base. Intel didn’t have the same economic flywheel since PCs/servers were flat or declining.
That's a good clarification on Amazon, running on commodity hardware with competitive pricing != competing on price alone. It would have been better to clarify this difference when pointing out that they're trying the same commodity approach in AI.
But also, we are seeing models leapfrog each other; the best model this week is often not the best model next month and certainly not next quarter. I still see merit to the idea that cloud providers should focus on being the place where companies put their data, and make it easy for companies to bring models and tools to that data.
I agree with the article that Apple is probably cooked if they continue on their current path for another couple of years.
Apple could simply sell earpods with an AI voice interface; the UI/UX and solution space is evolving too quickly for a company like Apple to sensibly try to define how it will use the technology yet.
See Google and Microsoft's failed integrations of AI with search for an example of how it should not be done.
The more I use AI the more I think I don't need a laptop and would settle for a comfortable VR setup as I do far less typing of symbols.
This blog post only really makes sense if you wholesale buy into the concept that Generative AI is going to do everything its most ardent boosters claim it will, on the timeline they say it will, none of which has really borne out as true thus far. For anything less, Apple and Amazon’s strategy of commoditization of models themselves makes sense.
That being said, do I have nitpicks over their respective strategies? You betcha. Amazon is so focused on an Apple-like walled-garden approach for enterprise compute that they run the risk of being caught up in shifting tides of geopolitics and nationalism along with increased attention on cost reductions. Apple, on the other hand, is far too exposed with the iPhone being the center of their ecosystem that a fundamentally new device could easily render them the next Nokia.
Between the two, Apple at least seems to be keen on innovating at its own pace and hoovering up competition that hits the true moonshots - not that I expect that strategy to keep working as antitrust scrutiny mounts against Big Tech. AWS, by comparison, is seemingly taking the Microsoft and VMware approach of just adding yet another service icon to the catalog and letting customers figure it out for themselves, which, well, just go ask the old guard of Big Tech how that strategy worked out for them when everyone ran into the public cloud.
Neither strategy has long (or even mid) term viability, but AI almost certainly won’t be the tech that pierces their moat.
It’s a smart approach. Get the CAPEX done while there’s appetite and political will for it.
The phone is where the data is, and AI will find its usefulness in that data (and interface).
But your comment about the phone could have been about horses, or the notepad or any other technology paradigm we were used to in the past. Maybe it'll take a decade for the 'perfect' AI form factor to emerge, but it's unlikely to remain unchanged.
We more and more turn into cyborgs, wearing all kinds of processors and sensors on our bodies. And we need this hardware and the software that runs on it to be trustworthy.
Being the hardware producer and gatekeeper for trustworthy software to run on it is probably big enough of a market for Apple.
Even more so if their business of getting 15% to 30% of the revenue apps generate on the iPhone continues.
It has yet to be seen what type of company becomes most valuable in the AI stack. It could very well be that it does not operate LLMs. Similar to how the most valuable company in the hospitality industry does not operate hotels. It just operates a website on which you book hotels.
It’s also an opportunity to disrupt… build hardware specifically for AI tasks and reduce it down to just an ASIC.
I could be wrong and we could be seeing the ladders being pulled up now, moats being filled with alligators and sharks, and soon we'll have no choice but to choose between providers. I hope we can keep the hacking/tinkering culture alive when we can no longer run things ourselves.
Apple is in a very favorable position with its control of arguably the most important platform of all. They can force app developers to allow Apple AI to automate their apps, and prevent other AI providers from doing the same, and they make a strong privacy argument about why this is necessary.
The problem is with your expectations. Apple is no longer winning all the time, so relatively, it feels like it's losing. And Amazon is the quiet beast lurking, it's just doing a poor job at marketing.
Voice input isn't suitable for many cases, and physical input seems generally superior to AR -- I've used a Vision Pro, and it's very impressive, but it's nowhere near the input-performance of a touchscreen or a mouse and keyboard. (To its credit: it's not aiming for that.)
Unless the argument is that you will never have to be precise, or do something that you don't want everyone within earshot to know about?
Also, a "dynamic, context-dependent generative UI" sounds like another way to describe a UI that changes every time you use it depending on subtle qualities of exactly how you reached it this time, preventing you from ever building up any kind of muscle-memory around using it.
Are phones and AI on the opposite ends of some axis where success in one precludes success in the other? Does the use of AI reduce the use of compute and data? I have my own opinions on the topic, but beyond the eye-catching title this article didn’t inform one way or the other.
They continue building a distributed data-acquisition and edge data-processing network with their iPhone sales, where the customer keeps paying for both hardware and traffic.
They constantly take measures to ensure that the data their customers generate is only available to Apple and is not siphoned away by the users themselves.
The moment they finish the orchestration of this distributed worldwide iOS AI-cluster, they will suddenly have both the data and the processing at comparatively low operational cost.
The biggest risk? A paradigm-shift where the Smartphone is no longer the main device (smart glasses being a major risk here), and some other player takes the lead here.
I'd have to disagree with this. There are a number of general consumer interfaces and none of them have much of a moat or brand loyalty.
Also, Apple is in the best position by far to make consumer hardware that can do inference on-device. No other company even comes close.
As opposed to what exactly? If the author really believes that a watch and glasses are preferred over a smartphone, I don't know what to say. Also, note that the author doesn't know what this new form factor will look like. In that case, what's the point in declaring that Apple and Amazon will miss the boat?
I think a much bigger threat to Apple is a smartphone with an AI driven operating system. But what do I know?
Apple now only cares about revenue and the retirement of all those who made the iPhone and Macs great. They are rich, so they don't need to innovate big until they are like Intel is now. But they try creating toys like Vision Pro, and the self-driving car that was coming for a decade. Just all for the fun of it.
Old company with an old leader and zero hunger for success. The opposite of all the big AI startups today.
Which AI products are Apple lacking that put them so far behind?
What we're gonna see from Apple, IMHO, is a horde of smaller models working on different tasks on the devices themselves, plus heavier stuff running on Private Cloud Compute instances.
We'll not see "Apple AI" as anything besides a sticker; it'll just quietly do its work in the background.
Oh, people balk at auto-word-complete in the latest macOS, but I find it very useful. It saves a little time here and there, and it adds up.
An LLM controls all my house lights right now.
It's funny how the unexpected is way more impressive. I tried out the voice commands on my new car (BYD) and after it correctly interpreted and implemented my request, I politely said "thank you!" (as I always do, it's important to set a good example) and the car responded "you're welcome!". 100% could just have been a lucky setup for a scripted response... but...
If you earnestly believed that dynamic generative UIs are the future, surely you'd be betting on Apple.. The company with fully integrated hardware already capable of cost effectively running near-frontier generative models on the end user hardware.
Apps would expose their functionality to the Smart Siri. Maybe there's already something like this with the shortcuts. Maybe I could give Claude Code style blanket permissions to some actions, some ones I would like to review.
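That blanket-vs-review permission split could be as simple as an allow-list plus per-call approval. A toy sketch, with hypothetical action names (this is not a real Apple or Shortcuts API):

```python
# Actions the agent may run unattended vs. ones that need user review.
AUTO_ALLOWED = {"play_music", "set_timer"}

def run_action(name: str, approved: bool = False) -> str:
    """Run an action if it has blanket permission or was explicitly reviewed."""
    if name in AUTO_ALLOWED or approved:
        return f"ran {name}"
    return f"{name} needs review"

run_action("play_music")                   # runs without asking
run_action("send_payment")                 # held for review
run_action("send_payment", approved=True)  # runs after explicit approval
```

Claude Code's permission prompts work on roughly this shape, so it's not hard to imagine a Smart Siri doing the same over App Intents.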
Because this is not a science fiction future, but a corporate one, where neither do those magical UIs exist, nor do you have enough power in wearables for them to be everything you need?
I think they’ll be fine. Amazon is an investor in Anthropic and Apple has an agreement with OpenAI. I’d consider that inorganic growth.
The real question is how do we continue the grift? AI's a huge, economy-sustaining bubble, and there's currently no off-ramp. My guess is we'll rebrand ML: it's basically AI, it actually works, and it can use video cards.
AI is a great feature funnel in terms of like, "what workflows are people dumping into AI that we can write purpose-built code for", but it has to transition from revenue generator to loss leader. The enormity of the bubble has made this very difficult, but I have faith in us.
My guess is that they're also adapting to the changing ecosystem, and since they move very slowly, the trends seem archaic (like Apple Intelligence featuring image and summary generation 12-18 months after it was found to be novel by anyone).
I'm hoping that they lean in hard on their intents API forming a sort of "MCP" like ecosystem. That will be very powerful, and worth spending extra time perfecting.
leaks seem to indicate they're working on some home AI assistant, which kind of lends to their typical lagging timeline with the goal of quality
Facebook has a data advantage and a user base, but at the end of the day they will always need to make Nvidia or a cloud provider rich to run/train their models.
cost is not the priority with AWS. To quote my collaborator, "I just scaled up to 600 PB for our event"
When I think AWS I think speed, scale-up, scale-out, capacity & dynamic flexibility are more key.
AWS is not the "cheapo commodity option", nor is Azure
I don't think they are missing out on anything. Everyone wants products from Apple or Amazon, only some power users and managers want "AI".
The deeper issue, in my view, is that Apple is violating its own philosophical belief, articulated by Steve Jobs himself: that Apple should always own the primary technology behind its products (i.e., multi-touch, click-wheel). This is central to Apple's "we need to make it ourselves" approach.
Camera lenses are commodities. AI models are foundational. Apple's own executive leadership likened models to the internet, and said, well surely we wouldn't try to build and own the internet! This misplaces AI as infrastructure when in fact it's foundational to building and serving useful applications.
They further see "chat" (and by extension, voice) as an app, but I think it's more like a foundational user interface. And Apple's certainly always owned the user interface.
When Siri was first announced, there was excitement that voice might be the next paradigm shift in user interface. Partly because Siri has been so bad for a decade now, and partly because people didn't feel like talking to their screens, Apple may have learned a very unhelpful lesson: that Siri is just a feature, not a user interface. In this age, though, chat and voice are more than features, and yet Apple doesn't own this either.
Apple should not buy Perplexity. Perplexity is smoke and mirrors, and there's nothing defensible about its business. Apple cannot get the talent and the infrastructure to catch up on models.
So what then?
OpenAI is not for sale. Anthropic is likely not for sale, but even if it were, Apple wouldn't buy it: Anthropic is very risky to Apple's profit margin profile and Apple can't unlock Anthropic's bottleneck (capacity).
In fact, to Apple's advantage, why not let the VCs and competitors like Microsoft and Amazon and Google light their money on fire while this industry takes shape, provided you have an option in the end to sell a product to the consumer?
The best option, in my view, is Search Deal Redux: partner deeply with Google and Gemini. Google very obviously wants as much distribution as possible, and Apple has more hardware distribution than any company in history.
Partnership is the only path because, yes, Apple missed this one badly. There is one area where I agree with Tim Cook, though: being late doesn't guarantee you lose, even in AI.
I don't. The foundational interface hasn't been created yet. Let's be honest, chat isn't great. It is the best we have right now to leverage the technology, yes, but if it is our future — that we cannot conceive of anything better, we've failed miserably. We're still in the "command-line age". No doubt many computing pioneers thought that the command-line too was foundational, but nowadays most users will never touch it. In hindsight, it really wasn't important — at best a niche product for power users.
> Apple may have learned a very unhelpful lesson: that Siri is just a feature, not a user interface.
That is the right lesson, though. Chat sucks; be that through voice, typing, or anything else. People will absolutely put up with it absent of better options, but as soon as there is a better option nobody is preferring chat. This is where Apple has a huge opportunity to deliver the foundational interface. Metaphorically speaking, they don't need to deliver the internet at all, they just need to deliver the web browser. If they miss that boat, then perhaps there is room for concern, but who knows what they have going on in secret?
Intel lost the hardware fight, but really it lost all along: the most profitable part is software, or rather whoever controls the ecosystem. Microsoft and Apple, not Intel.
It isn't that AMD cannot beat Nvidia on hardware; what's lost is the AI ecosystem (though not gaming).
Now, what ecosystem have Apple and Amazon lost? Can they avoid Nvidia controlling their AI ecosystem? I doubt it very much.
The MacBook could come with some models, and the Homebrew ecosystem would supply others.
Maybe they won't be so big in monetary value anymore? So what? There are endless states between huge and nothing, actually most organizations live there and most serve people well.