"The Liquid Glass effects are not expensive and anyone claiming they are has no idea how modern GPUs and animation work. Anyone saying it is is either just parroting or is an idiot."
I'm inclined to believe what I have experienced. I have never before experienced my 2020 iPad Pro to be remotely slow. I use it for some web browsing and YouTube viewing, so I really don't need a lot of computing power.
Now that I'm running the iOS 26 beta, I frequently feel animations going slowly or hitching. It's not terrible, but for the first time, I have the feeling that my experience using my iPad would be noticeably improved if I bought a new and more powerful one.
But I guess this makes me an idiot according to Mitchell?
zuhsetaqi · 1d ago
Beta versions are always slow and sluggish. Just install the latest beta of iPadOS 18; it will be sluggish too. The reason is that beta versions run a lot of logging and reporting in the background which cannot be disabled.
mort96 · 1d ago
We will see. It feels worse than earlier betas; I have always put the public betas on my iPad, and this is the first time we're this late in the cycle and my iPad feels too slow. But nothing would make me happier than if this all just goes away when iOS 26 is released properly and all animations run at a smooth 120 FPS again.
hulitu · 5h ago
> Beta versions are always slow and sluggish. Just install the latest beta
I never thought HN would be a source of paradoxes.
andai · 1d ago
I experience this basically any time I upgrade my phone OS. There's never anything new that makes me happy; it's always that they removed something I used or made something uglier, and it's invariably 2-3x slower than it used to be.
Same thing with Windows. If they just stopped touching it 20 years ago, it would be 50x more responsive now.
Melatonic · 1d ago
Just turn them all off along with transparency and whatnot in the vision impaired setting. I believe there's also a setting for scrolling or how pages move back in forth (seems to be faster to me)
I always do this with all phones, as it saves battery life and feels way snappier to me than some random animation between windows.
Voultapher · 1d ago
> I believe there's also a setting for scrolling or how pages move back in fort
I'm struggling to parse that sentence, please elaborate.
ikidd · 22h ago
"move back and forth" I think is what they were going for, but proofreading isn't a thing anymore.
mananaysiempre · 20h ago
Ehh, more like spellcheckers aren’t something you only get in a word processor anymore, and autocorrect doesn’t help either. I’m getting the impression that there are many more malapropisms on the Internet (and far fewer outright typos and spelling errors) than there were, say, a decade ago, and I strongly suspect spellcheckers are to blame.
(Proofreading in professional publishing is, indeed and to that industry’s great shame, much less of a thing than it used to be, but that’s a different story.)
alluro2 · 1d ago
Sounds like it's working exactly as intended.
NaomiLehman · 1d ago
That 2020 iPad didn't have an M-series SoC. That was a massive milestone upgrade in 2021.
depr · 1d ago
Yes. The beta is discussed in the replies.
mort96 · 1d ago
X isn't showing me any replies, so I can't read them.
fainpul · 1d ago
If that's so, can't he explain it ELI5 style instead of calling people idiots?
I have a hard time believing that the GPU is somehow magically energy efficient, so that computing this glass stuff uses barely any energy (talking about battery drain here, not "unused cycles").
DougBTX · 1d ago
Here's an attempt at that: The GPU is responsible for blending layers of the interface together. Liquid glass adds a distortion effect on top of the effects currently used, so that when the GPU combines layers, it takes values from (x + n, y + m) rather than just (x, y). Energy efficiency depends on how much data is read and written, but a distortion only changes _which_ values are read, not how many.
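To illustrate the claim, here's a toy sketch in plain Python (the texture, coordinates, and "lens" function are all made up): both a plain copy and a distorted copy perform exactly one read per output pixel, so a pure distortion doesn't change bandwidth, only which texels are read.

```python
# Toy "GPU sampling" sketch: a plain blit and a distorted blit both
# perform exactly one texture read per output pixel; the distortion
# only changes *which* texel is read, not how many reads happen.

def sample(tex, x, y):
    h, w = len(tex), len(tex[0])
    # clamp-to-edge addressing, like a GPU sampler
    return tex[max(0, min(h - 1, y))][max(0, min(w - 1, x))]

def composite(tex, offset):
    """offset(x, y) -> (n, m); (0, 0) everywhere is a plain blit."""
    reads = 0
    out = []
    for y in range(len(tex)):
        row = []
        for x in range(len(tex[0])):
            n, m = offset(x, y)
            row.append(sample(tex, x + n, y + m))
            reads += 1
        out.append(row)
    return out, reads

tex = [[x + 10 * y for x in range(4)] for y in range(4)]
_, plain_reads = composite(tex, lambda x, y: (0, 0))
_, warped_reads = composite(tex, lambda x, y: (1, 0))  # a crude "lens" shift
assert plain_reads == warped_reads  # same read count either way
```

(As the replies below note, this ignores blur, which does need multiple reads per output pixel.)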
vrighter · 1d ago
It needs to read more than one value, otherwise blurring cannot happen. That's automatically more work. And considering the effect is physics-based, even in your example, were it correct, calculating what n and m are is not trivial.
DougBTX · 1d ago
These UI elements (including the keyboard!) already blur their background, so that’s not a new cost. My 5 year old phone handles those fine. The distortion looks fancy, but since the shape of the UI elements is mostly-constant I’d expect them to be able to pre-calculate a lot of that. We’ll see when it ships!
1718627440 · 1d ago
But you only get redraw requests for what's actually visible. When the upper layers are transparent, you constantly need to redraw everything underneath.
JKCalhoun · 1d ago
My generous interpretation is that he means the GPU is magically energy efficient compared to the CPU. I wouldn't dispute that.
But Apple went down that xPU-taxing path a long time ago when they added the blur to views beneath other views (I don't remember what that was called).
sillywalk · 1d ago
The translucency goes all the way back to the original Aqua interface in Mac OS X. I believe the compositing started getting some GPU acceleration (Quartz Extreme) in Mac OS X 10.2 Jaguar all the way back in 2002.
GCUMstlyHarmls · 1d ago
Translucency.
conradev · 1d ago
I agree with Mitchell.
Gaussian blurs are some of the most expensive operations you can run, and Apple has been using them for a long time. They’re almost always downscaled because of this.
The first retina iPad (the iPad 3 if I recall) had an awfully underpowered GPU relative to the number of pixels it had to push. Since then, the processors have consistently outpaced pixels.
Your device is easily wasting more time on redundant layout or some other inefficiency than on Liquid Glass. Software does get slower and more bloated over time, often faster than the hardware improves, and not in the ways you might expect.
The GPU has so much headroom that they fit language models in there!
arghwhat · 1d ago
The problem with these kinds of blur effects is not the cost of a gaussian blur (this isn't gaussian blur anyway as it has a lens effect near the edges). It's damage propagation and pipeline stalls.
When you have a frosted glass overlay, any pixel change anywhere near the overlay (not just directly underneath) requires the whole overlay to be redrawn, and this is stalled waiting for the entire previous render pass to complete first for the pixels to be valid to read.
The GPU isn't busy in any of this. But it has to stay awake notably longer, which is the worst possible sin when it comes to power efficiency and heat management.
conradev · 1d ago
Yes, that all makes sense! My understanding is that the damage propagation gets worse with depth (no limit) in addition to breadth (screen size). If the compositor has N layers, a blur layer, N more layers, another blur layer, etc. then there are a lot of "offscreen render passes" where you have to composite arbitrary sets of layers exclusively for the purpose of blurring them.
It's true that GPU is itself not busy during a lot of this because it's waiting on pixels, but whatever is preparing the pixels (copying memory) is super busy.
Downscaling is a win not just for the blurring, but primarily the compositing. KDE describes the primary constraint as the number of windows and how many of them need to be blended:
> The performance impact of the blur effect depends on the number of open and translucent windows
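A very rough toy model of the pass-count blowup (the "+2 per blur" accounting is my assumption, not Apple's or KDE's actual numbers): each interleaved blur layer forces an offscreen composite of everything beneath it, plus the blur pass itself, before the final on-screen composite can happen.

```python
# Toy pass counter: each blur layer in the stack costs (roughly) one
# offscreen composite of everything below it plus one blur pass, on
# top of the single final on-screen composite.

def render_passes(layers):
    """layers: list of 'normal' or 'blur', bottom to top."""
    passes = 1  # the final on-screen composite
    for kind in layers:
        if kind == "blur":
            # composite everything below offscreen, then blur it
            passes += 2
    return passes

# Opaque-only UIs stay at one pass no matter how many layers:
assert render_passes(["normal"] * 10) == 1
# Interleaved blurs stack up, and each pass depends on the previous:
assert render_passes(["normal", "blur", "normal", "blur"]) == 5
```

The key point isn't the exact count; it's that the passes are sequential, since each blur consumes the output of the composite below it.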
As long as the lower blur layers are not fully occluded by opaque content, then yes - they all need to be evaluated, and sequentially due to their dependency. This is also true if there is transparency without blur for that matter, but then you're "just" blending.
Note that there are some differences when it's the display server that has to blur general output content on behalf of an app not allowed to see the result, vs. an app that is just blurring its own otherwise opaque content, but it's costly regardless.
(There isn't really anything like on-screen vs. off-screen, just buffers you render to and consume. Updating window content is a matter of submitting a new buffer to show, updating screen content is a matter of giving the display device a new buffer to show. For mostly hysterical raisins, these APIs tend to still have platform abstractions for window/surface management, but underneath these are just mini-toolkits that manage the buffers and hand them off for you.)
The buffers are not that different, it really just means “extra allocation”
Melatonic · 1d ago
Thankfully you can probably turn it off as usual
ccapitalK · 1d ago
Gaussian Blur isn't the most efficient way of doing a frosted glass blur effect though. IIRC the current state of the art is the Dual Kawase blur, which is what KDE uses for its blurred transparency effect, I've never observed performance issues having it running on my machine.
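A very loose 1D sketch of the downsample/upsample idea behind this family of blurs (this is not the real Kawase sample pattern, just the shape of the trick): each pass only touches a couple of neighbours, but because it runs at half resolution, the blur's reach grows geometrically while per-pass work shrinks.

```python
# 1D sketch of why down/upsample blurs (e.g. Dual Kawase) are cheap:
# each pass is a tiny 2-tap average, but every level halves the
# resolution, so the effective blur radius doubles per level while
# the per-pass work halves.

def downsample(row):
    # average adjacent pairs -> half-resolution signal
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row) - 1, 2)]

def upsample(row):
    # nearest-neighbour duplication back up to double resolution
    out = []
    for v in row:
        out += [v, v]
    return out

signal = [0.0] * 8 + [8.0] + [0.0] * 7  # an impulse, length 16
levels = 2
blurred = signal
for _ in range(levels):
    blurred = downsample(blurred)
for _ in range(levels):
    blurred = upsample(blurred)

# The impulse is now spread over a 4-pixel footprint using only
# 2-tap passes; more levels widen the blur geometrically.
assert len(blurred) == len(signal)
assert sum(blurred) == sum(signal)  # total "energy" preserved
```

Real implementations use bilinear filtering in the upsample so the result looks smooth rather than blocky, but the cost structure is the same.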
thfuran · 1d ago
A Gaussian blur is separable, making it far more efficient than many other convolutional filters, and convolutions are hardly the most expensive sorts of operations you could run.
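To make the separability point concrete, here's a small sketch (tiny 3-tap binomial kernel as a stand-in for a Gaussian, toy 4×4 image): a horizontal pass followed by a vertical pass gives the same result as the full 2D convolution, at 2k reads per pixel instead of k².

```python
# A Gaussian kernel is the outer product of two 1D kernels, so the 2D
# convolution factors into a horizontal pass then a vertical pass.
# Both routes below produce the same image; the separable route needs
# 2*k reads per pixel instead of k*k.

k1d = [0.25, 0.5, 0.25]  # 3-tap binomial approximation to a Gaussian

def conv_h(img, k):
    # horizontal 1D convolution with clamp-to-edge addressing
    r = len(k) // 2
    return [[sum(k[i + r] * row[min(max(x + i, 0), len(row) - 1)]
                 for i in range(-r, r + 1))
             for x in range(len(row))] for row in img]

def transpose(img):
    return [list(col) for col in zip(*img)]

img = [[0, 0, 0, 0],
       [0, 16, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]

# Separable: horizontal pass, then vertical pass (via transpose).
sep = transpose(conv_h(transpose(conv_h(img, k1d)), k1d))

# Full 2D convolution with the outer-product kernel (3x3, clamped).
k2d = [[a * b for b in k1d] for a in k1d]
full = [[sum(k2d[dy + 1][dx + 1]
             * img[min(max(y + dy, 0), 3)][min(max(x + dx, 0), 3)]
             for dy in (-1, 0, 1) for dx in (-1, 0, 1))
         for x in range(4)] for y in range(4)]

assert sep == full  # identical result: 6 reads/pixel vs 9 here
```

For a realistic 33-tap Gaussian, that's 66 reads per pixel instead of 1089, which is why it's one of the cheaper large-radius convolutions.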
latexr · 1d ago
Alan Dye introduced the design by stating:
> Now, with the powerful advances in our hardware, silicon, and graphics technologies, we have the opportunity (…)
Coupled with the reports of sluggish performance from the early betas, it’s understandable people would reach the conclusion that the new design pushes the hardware significantly more than before.
rickdeckard · 1d ago
Depends on whether "expensive" refers to
a.) Compute-cycles: Some added passes to apply additional shading on top of the completed render, or
b.) Power-consumption: Some added delay in putting components to sleep (reducing CPU/GPU-clock) on every screen update.
Deferred sleep for a portable, battery-powered device because of a longer UI-rendering pipeline can easily add up over time.
--
I'd be quite interested to see some technical analysis on this (although more out of technical curiosity than the assumption that there is something huge to be uncovered here...).
There's also the aspect of iOS prioritizing GUI-rendering over other processing to maintain touch-responsiveness, fluidity, etc. Spending more xPU-time on the GUI potentially means less/later availability for other processes.
For sure, non-native apps trying to emulate this look (e.g. Flutter) will create a significantly higher impact on the power-profile of a device than a native app.
It would be hard to find a more "ad hominem and not ad rem" quote.
ohdeargodno · 1d ago
No, Hashimoto is a dumbass and definitely has no idea how GPUs work either, or how compositors work.
Sampling has a cost. Doing it with the previous frame makes it cheaper, but when half your products are running at 60FPS you can't afford to do that, so they're sampling the current frame.
Additional composition work has a cost, additional pixel shader work has a cost, additional video memory bandwidth has a cost. Glass effects that might sample pixels way outside just neighboring pixels have a cost.
Is liquid glass as expensive as some heavy duty visual effects like HBAO+, or lighting a scene with a hundred light sources? No. But pretending it's "free" is a dumbass take, and Hashimoto writing half baked shaders once for ghostty does not make him an expert in GPU performance. GPU running for longer and having dependencies on other passes is, fundamentally, more work.
(Additionally, the top answers to that tweet are a who's who of dogshit tech influencers and wannabe rendering dumbasses regularly having the worst opinions one can imagine. Quite telling)
theshrike79 · 1d ago
IMO the actual reason for Liquid Glass is that they can do it natively. 3rd party UI frameworks can't copy it without a massive performance hit.
This draws a clearer divide between fully native iOS applications and React Native-style "build once and cross-compile" platforms.
I wrote about this topic earlier, I believe there is one main reason for Apple to go in this direction: universal design language (spatial computing, Vision Pro). Of course, there can be many side-effects (or reasons) that can be the actual main reason instead, such as 'innovation' or like you mention, guiding people to use the native systems, we'll never know. In my eyes, the universal design language is the main reason. Apple is and always has been design-focused.
rickdeckard · 1d ago
While this may be a nice side-effect of that direction, I am quite sure that Apple was not even considering such 3rd party impacts when they decided.
Looking back at the announcement, it's more likely that the Management had to decide on the key message for the unveiling prior to the event, and there wasn't much "media-disruptive" to choose from.
So Liquid Glass got elevated to top priority and then all teams got the order to ensure it is present in as many apps as possible.
aatd86 · 1d ago
Any third-party UI framework which does NOT reuse the platform.
There are different views on this.
Perhaps, in the long run, that may make more sense, since you may want to facilitate the same user experience across devices anyway.
Especially for branded apps.
So Liquid glass or not would not matter that much, especially if this is simply a design fad like the numerous ones before it.
Now my personal inclination is to integrate with the platform still, as someone currently in the process of building such 3rd party frameworks.
Less work for me as the platform handles rendering, keyboard events etc.
The only issue is that on iOS, UIKit is far easier to interact with than SwiftUI.
And Swift is too large a language and SwiftUI not simple enough for my taste. Just too bad it is difficult to piggy back on what exists in SwiftUI and is not available in UIKit (yet). There are ways to bridge in a tripartite way perhaps.. But I digress...
zarzavat · 1d ago
If the native apps end up looking worse than the web apps then they played themselves...
ohdeargodno · 1d ago
>3rd party UI frameworks can't copy it without a massive performance hit.
Do you believe pixel shaders are some unique magic that only Apple has the secret sauce to and noone else can use? There were some efficient implementations of liquid glass for flutter before the iOS beta was even released. Glass effects are dime a dozen on shadertoy, they're one of the most basic effects you could learn to do when learning about texture sampling.
The platforms that will take a large performance hit are the ones that can't drop down to the native platforms, i.e. the web.
theshrike79 · 1d ago
But will someone spend the time to recreate a pixel-perfect Liquid Glass effect?
(Actually, someone probably will. People are weird that way.)
I'm pretty sure any recreation will still be just that little bit off compared to the native one, which is their intention IMO.
bradhe · 1d ago
Anyone else get strong GPT5 vibes from this article?
firefoxd · 1d ago
Author here. No, the article is not generated; it's dictated. This was more of a shower thought (car thought I guess, since I was driving) that I dictated and fed to DeepSeek to punctuate and structure for me. When it's a quick, short, unrefined article, I post it in the byte size section of my blog. So no, these are my words, and you can find old comments of mine where I mention how I've been testing AI editing and it's hit or miss.
yellow_lead · 1d ago
Yes, either the author writes like AI, or this was written by AI. Neither of which is great.
Telemakhos · 1d ago
Not really: as much as I despise AI slop, LLMs do a much better job than this author of making sure I have the information needed to understand a point. For example:
> I have this little web app I built for my kids to help them manage their day. It has those tiles that animate when you hover on them.
I have absolutely no idea what “those tiles” are. They are familiar to the author, but he has not bothered to explain them well enough to deserve that familiar “those.” AI would have explained them better.
This isn’t AI slop, just sloppy writing.
azangru · 1d ago
No. Could you try to put your finger on what exactly in this text gives you the vibes?
Also, a sibling comment suggests that "those tiles" is some sort of slop; but I find it no more sloppy than "this little web app" in the preceding sentence. Both are handwavy markers of imprecision common in oral speech. A comment on English Stack Exchange points out that this feature is referred to as the "indefinite this" [0].
Yea. I was not going to say that because apparently it's bad etiquette - and we're supposed to look at the substance. It happened to my writing once where someone said it was AI written and it did hurt me. :'(
But at this point I'm seeing it everywhere... the "Trash question? Trash answer" format posed as poetry EVERYWHERE, and it is correlated with slop and I'm finding it very annoying to read. I might (to my own detriment perhaps) start factoring that into what I'm gonna continue reading.
Examples of that question format in this article:
- Liquid Glass? That's what your M4 CPU is for
- That glassy transparency and window animations? A notorious resource hog that brought mid-2000s hardware to its knees
- The moment a single tile wiggles? The entire UI crawls
- Checking mail? Browsing? Streaming? Your M4 is bored out of its silicon mind
- That whole section on: Battery life? ... Thermals? ... Future-proofing? ... Real workloads? ... ugh.
- Is it worth it? For Apple’s vibe? Probably. But next time your fan whispers or your battery dips faster than expected… maybe blame the glass.
And apart from that question format, there's another thing; I can't quite figure out the pattern behind it, but it is slop. (I would really love to figure out what my brain flags as "off" in these - maybe sentence length variability, maybe colons, or maybe trying to be poetic or dramatic or too self-assured with basically no data/substance underneath it, idk):
- Let’s be real: eye candy always comes at a price.
- When the system’s stressed, the pretty things break first.
- They chew through GPU/CPU time. Always have.
- Here’s my hot take: Apple knows exactly what they’re doing
- It's stealth bloat.
- You might not feel the drag today. That’s the point! The M4’s raw power is the perfect smokescreen. But those cycles aren’t free
- TL;DR: Liquid Glass is gorgeous tech debt. Your M4 can afford it… for now. But never forget: fancy pixels demand fancy math.
Basically the entire article at this point. There was one place which was a bit personalized about the web app he built for his kids where I was like OK at least something seems OK, but as another user pointed out "It has those tiles that animate when you hover on them" doesn't make any sense. What tiles. How are we supposed to know.
artursapek · 1d ago
the TLDR especially
SoKamil · 1d ago
Has anybody tested it yet? It is computed as a GPU shader. I wonder if it is that much less efficient compared to the frosted glass blur effect that has been there for at least a decade.
arghwhat · 1d ago
Blur is expensive because it propagates damage and has to wait for all previous rendering to be complete.
When one pixel changes underneath a blur, the entire blurred area needs to be redrawn, meaning that all elements on top need to be redrawn. As the blur cannot render before the underlay is finished, the graphics pipeline is stalled. Fancy blurs look past the area immediately underneath to more accurately render lens effects, meaning each output pixel reads a lot of input pixels.
When it comes to power and heat management, the goal is to be able to power the GPU down again as fast as possible, and this kind of thing prolongs it quite a bit. There may be a point where efficiency makes the result acceptable, but it's always going to be much worse than not doing that.
artemisart · 1d ago
No, you never compute individual pixels because you never need to, and it's always faster to do it in bulk (vectorization, memory access...), so over an area you take the same number of pixels as input (or a little more with padding) and the blur will only significantly increase the compute.
arghwhat · 1d ago
You misunderstood; this is not about computing individual pixels but about only selectively rerendering graphical elements which have changed, and in turn figuring out the total area of change. This propagates through the entire stack to let the GPU scanout hardware know which tiles have changed, and to allow partial panel self-refresh updates (depending on hardware).
Rendering is still done in bulk for the changed areas, avoiding rendering expensive elements (e.g., transformed video buffers, deeply layered effects, expensive shaders). It's a fundamental part of most UI frameworks.
aeonfox · 1d ago
Are windowed GUIs still doing diffed screen updates? I would have assumed that GPUs make this kind of thing very unrewarding to implement as an optimisation. I'd imagine every window is being redrawn every frame as a 2D billboard with textures and shaders.
The Gaussian blur and lensing effects would still slow things down by needing to fetch pixels from the render target to compute the fragment, vs painting opaque pixels.
arghwhat · 1d ago
The usual mechanism is to mark widgets that changed dirty, accumulate the bounding boxes of such dirty areas, take the next swapchain buffer and get its invalid regions, iterate through the widget tree and render anything that intersects with the bounding box or invalid regions, and submit the buffer + the dirty areas to the display server/driver.
And yeah, having a render step depend on the output of a previous non-trivial render step is Bad™.
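The dirty-region bookkeeping described above, as a rough sketch (widget names and rectangles are made up): accumulate bounding boxes of changed widgets, then re-render only what intersects the damaged area.

```python
# Rough sketch of dirty-region damage tracking: union the bounding
# boxes of changed widgets, then re-render only widgets intersecting
# the damaged area.

def union(a, b):
    # bounding box of two (x0, y0, x1, y1) rects
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

def intersects(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

widgets = {
    "clock":   (0, 0, 100, 40),
    "sidebar": (0, 40, 100, 600),
    "canvas":  (100, 0, 800, 600),
}
dirty = ["clock"]  # only the clock changed this frame

damage = None
for name in dirty:
    box = widgets[name]
    damage = box if damage is None else union(damage, box)

to_render = [n for n, box in widgets.items() if intersects(box, damage)]
assert to_render == ["clock"]  # sidebar and canvas are left untouched
```

This is where full-screen blur layers hurt: a blur layer's rectangle intersects essentially any damage rect, so it (and everything stacked above it) is pulled into every re-render.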
aeonfox · 21h ago
I was under the impression that for GPU accelerated GUIs, all windows are rendered to a render target. It might be that windows underneath have gone to sleep and aren't updating, but they would have their last state rendered to a texture. This permits things like roll-over previews and layered effects to have a more trivial overhead.
Software renderers typically do the optimisation you're suggesting to reduce on memory and CPU consumption, and this was a bigger deal back in the day when they were the only option. I think some VNC-like protocols benefit from this kind of lazy rendering, but the actual VNC protocol just diffs the entire frame.
On the GPU, the penalty for uploading textures via the bus negate the benefit, and the memory and processing burden is minimal relative to AAA games which are pushing trillions of pixel computations, and using GBs of compressed textures. GPUs are built more like signal processors and have quite large bus sizes, with memory arranged to make adjacent pixels more local to each other. Their very nature makes the kinds of graphics demands of a 2D GUI very negligible.
ohdeargodno · 1d ago
>you never compute individual pixels because you never need to
Pixel shaders are looking at this laughing at you. PS_OUTPUT is a single pixel whether you want it or not. PS wavefronts are usually very small, so you're still going to be doing a lot of sampling.
SoKamil · 1d ago
I suppose we have to wait for official release. Beta builds have some debug stuff which makes OS slower. This has always been the case with all previous macOS beta versions that felt slower.
andai · 1d ago
The article says "M4 isn't using much CPU so let's add fancy effects."
There's some truth to that, in the sense of "the hardware can handle it now" (but he also mentions Vista, which came out like 20 years ago...)
If it's actually resource intensive, then the logic would probably be "let's make all last-gen devices a bit slower to encourage people to upgrade..."
At least, that's been my experience when upgrading iOS. It's basically the same thing except mysteriously way slower. (I wonder if the CPU throttling thing was part of the iOS upgrade that made my phone slow to a crawl a few years ago.)
mmcconnell1618 · 1d ago
Doesn't MacOS already render as 2x resolution and downsize in order to do font smoothing? Looks are important to Apple, and I think they are willing to add custom hardware capable of handling these types of effects without killing battery life and CPU cycles.
jsheard · 1d ago
> Doesn't MacOS already render as 2x resolution and downsize in order to do font smoothing?
It's not for font smoothing, their fonts are rendered with anti-aliasing in the first place. The 2x scaling thing is due to how they handle non-integer scaling factors, or rather how they don't, instead they render at a higher resolution with an integer factor and then downscale the result to fit the display.
e.g. if you set a 3024x1964 Macbook to a virtual resolution of 1680x1050 (1.87x scale), it'll actually render internally at 3360x2100 (2x scale) and then squish that down to 3024x1964.
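The scheme in numbers, as a small sketch (the rounding rule is my assumption of how the integer factor is picked): render at the next integer scale up, then downscale to the physical panel.

```python
# Non-integer scaling as described above: pick the next integer scale
# up from the requested factor, render the virtual resolution at that
# scale, then downscale the result to fit the physical panel.

import math

panel = (3024, 1964)    # physical pixels (e.g. a 14" MacBook panel)
virtual = (1680, 1050)  # user-chosen "looks like" resolution

scale = math.ceil(panel[0] / virtual[0])           # 3024/1680 -> 2
backing = (virtual[0] * scale, virtual[1] * scale)
assert backing == (3360, 2100)                     # internal render size

downscale = panel[0] / backing[0]                  # squish factor, 0.9
print(f"render at {backing}, then scale by {downscale:.3f}")
```

So the GPU is always pushing more pixels than the panel has whenever a non-native "looks like" resolution is selected.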
MatthewWilkes · 1d ago
It's certainly not my experience that the M3 is overpowered for browsing. With the proliferation of SPAs for everything from messaging to word processing, my Macbook Air reminds me of a Chromebook in more ways than one.
Tiberium · 1d ago
It's sad that HN top nowadays has articles that are completely AI-edited or AI-generated, with the default style that's obvious to anyone who has used them for enough time.
Luker88 · 1d ago
I was under the impression that the iPhone had dedicated hw to render the screen while in idle/locked or something like that, in order to avoid waking the main cores.
I wonder if it had enough capabilities to run the extra passes needed for all that blur.
So the problem might not be that it makes things slow, but that it prevents low-power modes. AKA: slightly less battery life, possibly unused hardware
lupajz · 1d ago
I haven't tried any of the beta releases, but would like to hear an opinion by somebody who did.
Is it possible to turn the effects off? I mean something like accessibility settings to tone down the transparency and "glassiness"?
maxvij · 1d ago
Yes, I've been on all the public betas so far, and yes, you can turn it down. You can turn on 'reduce transparency' in the accessibility settings. It breaks most of the effects, but text is definitely more legible. I haven't noticed anything performance-wise (with Liquid Glass turned on), but then again I might not be the best test candidate as I just got a new phone.
delta_p_delta_x · 1d ago
That's what your M4 *GPU is for
The CPU processes draw calls and runs the compositor, and most compositors are fairly straightforward. The GPU runs the rasterisation, shaders, culling, occlusion, etc to achieve the effect.
ErneX · 1d ago
If anything it’d be more taxing on the iPhone because some of the effects are tied to the gyroscope of the device. We’ll see.
I wonder if OP at least tested macOS Tahoe to base his impressions upon.
rsynnott · 1d ago
While you could certainly question the _taste_ of the 'liquid glass' stuff, I doubt it's at all expensive. Modern GPUs are good at this sort of thing.
weikju · 1d ago
Written before any beta showed up.
Works fine on my M1 Macs sooooo whatever.
adamors · 1d ago
I don’t see the point in posting a 3-month-old article either, especially without any updates.
nunez · 1d ago
It'll probably be something you can disable in Accessibility Settings, so there's that at least!
arnaudsm · 1d ago
We need numbers. Can someone benchmark the wattage difference of using Liquid glass vs disabled ?
hulitu · 8h ago
> Liquid Glass? That's what your M4 CPU is for
This is like saying: Wallpaper? That's what your (Win 95 SD)RAM is for.
The OS should not consume all resources in a system. But tell that to Apple, Google or Microsoft.
Because they switched from release number to release year.
mcphage · 1d ago
To match the year.
znpy · 1d ago
Compiz was doing similar things (actually more and better things) in 2011 on integrated Intel GPUs… I don’t think it’s much of a burden on modern GPUs.
api · 1d ago
Isn’t this stuff mostly rendered by the GPU, and pretty efficiently?
I remember eye candy Linux desktop stuff like Enlightenment doing stuff like this back in the 90s and early 2000s. Not as pervasive or flashy but similar: lots of translucency, skins, themes, etc. Ran fine on a 400mhz machine, though at a much lower resolution.
It’s not free but I’m curious to see how significant it is. I also wonder if you can turn off the animated stuff.
The biggest gripe I have with the Tahoe shots I’ve seen is that they seem to show it wasting more space with larger margins, etc. I hate that trend. I have a huge ultrawide monitor and often find myself wanting even more screen real estate while I’m working. Stop wasting my pixels.
echelon_musk · 1d ago
As long as it can be disabled it is a complete nothing burger.
If it can't be disabled and ends up crippling my otherwise perfectly adequate M1 it will accelerate me to switch to daily use of Asahi Linux.
Edit: That's assuming the effects are even expensive for the GPU.
maxvij · 1d ago
It can be disabled (or at least tuned down) via accessibility settings.
robin_reala · 1d ago
It can be disabled: reduce transparency in the accessibility settings.
latexr · 1d ago
> As long as it can be disabled it is a complete nothing burger.
Defaults matter. Most people never change them.
1oooqooq · 1d ago
they already got caught artificially slowing older iphones on ios updates, remember that?
this is a way to slow older models and get away with it.
theshrike79 · 1d ago
It was a choice between two options in phones with bad batteries:
- Slow down the phone so that it still works
- Crash phone immediately
Seems like the majority of people would've wanted option 2 for some strange reason?
They did fix it by giving people the option, but I wonder how many still opted for "yes, I want my phone to suddenly just die instead of slowing down".
And the fix for all this was a 50€ battery replacement.
ACCount37 · 1d ago
The issue was always that of communication and trust.
Communication: at no point has Apple tried to communicate "your phone's battery has gone bad, the device now suffers for it" to the user. Even though they obviously knew that this was what was happening.
Trust: with how openly anti-repair Apple was at the time, how can you trust that this was an honest oversight, and not another malicious action designed to prevent people from repairing their devices?
Apple has improved upon both since. But they're still not anywhere near perfect - and it took a lot of getting their shit kicked in by the media, the public and the regulators for Apple to get even this far.
Zambyte · 1d ago
How has Apple improved on repairability?
ACCount37 · 1d ago
They now actually sell some parts online, and offer official repair manuals and tools.
That only covers a few basic repair types, the tooling is clunky (for a reason - those tools are designed to allow cheap-and-replaceable official employees to perform basic repairs to an Apple-acceptable quality) and the parts are hilariously overpriced. But it's considerably better than nothing. They also enabled a few repairs that were previously hard blocked by software, and required some incredibly complex hardware workarounds because of it - like FaceID replacement.
Make no mistake - I have zero faith in this being Apple actually trying to be better. It's much more likely that this was them walking back their "no third party repairs never ever" stance under pressure from consumer rights activists and regulators in places like EU. But it's a change for the better either way - and it would be good to see more of that in the future.
foldr · 1d ago
The issue has always been that people who don’t trust Apple interpret everything that Apple does in the worst possible light.
I make no comment here on whether such suspicion is justified. But let’s imagine that the current behavior (showing the battery warning) had been the original behavior. Then people would no doubt have complained that Apple was trying to pressure people into unnecessary battery replacements.
theshrike79 · 1d ago
Ahh, the Misunderstanding Olympics and Apple's local CSAM scan implementation <3
thefz · 1d ago
> It was a choice between two options in phones with bad batteries:
> - Slow down the phone so that it still works
> - Crash phone immediately
Three, actually:
- Inform the user, and provide decent repair options and parts.
nani8ot · 1d ago
> Seems like the majority of people would've wanted option 2 for some strange reason?
People want to know this is what happens. If a slowdown is not communicated, the average person doesn't think "Oh, I need a new battery." They're going to buy a new phone because their old one is slow.
1oooqooq · 18h ago
Do you work with technology, and you swallowed that crazy excuse?
"oh we're protecting you. it's not like we have a dozen voltage and temperature sensors, and not like we already throttle when those sensors detect any of the things we're claiming in the excuse" ... and everyone falls for it. geez.
phoronixrly · 1d ago
Remind me, was the slowdown communicated in any meaningful way, and were the affected users directed to service their phones at the nearest apple store?
'Hey, there's an issue with your battery so we've slowed down your phone to prevent it going off. Please visit your local apple store to address this.' As opposed to misleading them that their phones are old and slow/crashing and they need to buy new ones?
can16358p · 1d ago
Great. Let's not slow down CPUs coupled with aging batteries and instead have a full system crash suddenly at the most demanding moment. Sure that would be much better.
mschuster91 · 1d ago
On top of that, hard crashes and partial brownouts due to undervolting can and do lead to data corruption on all sorts of storage, no matter if eMMC, straight NAND or microSD cards.
Most phone PCB designs do not account properly for the scenario "battery gone bad leads to undervolting of components" at all. The best you're gonna get is the BMC cutting off everything when the voltage at the battery drops way too low - but that is a failsafe mechanism; the flash, memory and processor chips will have undergone brownout events before the BMC emergency shutdown hits.
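The failure ordering described above can be sketched as a toy model - the voltage thresholds here are made-up assumptions for illustration, not real part specs:

```python
# Illustrative only: why a BMC cutoff alone doesn't protect components.
# Both thresholds below are assumed values, not taken from any datasheet.
BMC_CUTOFF_V = 3.0     # battery management cuts power below this
COMPONENT_MIN_V = 3.2  # assumed: flash/SoC rails brown out below this

def failure_mode(battery_v):
    """Return what happens at a given sagging battery voltage."""
    if battery_v < BMC_CUTOFF_V:
        return "hard shutdown"
    if battery_v < COMPONENT_MIN_V:
        # The dangerous window: components misbehave, BMC hasn't reacted yet.
        return "brownout before BMC reacts"
    return "ok"

print(failure_mode(3.1))  # prints "brownout before BMC reacts"
```

The point is the gap between the two thresholds: a peaky load that sags the battery into that window corrupts state before the failsafe ever fires.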
hulitu · 4h ago
> Most phone PCB designs do not account properly for the scenario "battery gone bad leads to undervolting of components"
We were talking about Apple here. If they don't know that a battery can get discharged, maybe they should do something else.
For a SWE that might be OK. For a HWE, no.
hu3 · 1d ago
None of that was communicated previously which made many customers buy a new iPhone instead of replacing the battery.
https://x.com/mitchellh/status/1933314816472723728
I always do this with all phones, as it saves battery life and feels way snappier to me than some random animation between windows.
I'm struggling to parse that sentence, please elaborate.
(Proofreading in professional publishing is, indeed and to that industry’s great shame, much less of a thing than it used to be, but that’s a different story.)
I have a hard time believing that the GPU is somehow magically energy efficient, so that computing this glass stuff uses barely any energy (talking about battery drain here, not "unused cycles").
No comments yet
But Apple went down that xPU-taxing path a long time ago when they added the blur to views beneath other views (I don't remember what that was called).
Gaussian blurs are some of the most expensive operations you can run, and Apple has been using them for a long time. They’re almost always downscaled because of this.
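A rough cost model of why the downscaling mentioned above helps so much, assuming a two-pass separable Gaussian kernel whose radius shrinks with the image (a simplification; real implementations vary):

```python
def separable_blur_reads(width, height, radius, scale=1):
    """Texture reads for a two-pass separable Gaussian blur,
    optionally run on an image downscaled by `scale`."""
    w, h = width // scale, height // scale
    r = max(1, radius // scale)  # the kernel shrinks with the image
    taps = 2 * r + 1
    # Horizontal pass + vertical pass; each reads `taps` pixels per output pixel.
    return 2 * w * h * taps

full = separable_blur_reads(2048, 1536, radius=24)
quarter = separable_blur_reads(2048, 1536, radius=24, scale=4)
print(full // quarter)  # prints 60
```

Downscaling by 4x cuts reads by far more than 16x, because the kernel shrinks along with the pixel count - which is exactly why blurs are "almost always downscaled".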
The first retina iPad (the iPad 3 if I recall) had an awfully underpowered GPU relative to the number of pixels it had to push. Since then, the processors have consistently outpaced pixels.
Your device is easily wasting more time on redundant layout or some other inefficiency than on Liquid Glass. Software does get slower and more bloated over time, often faster than the hardware improves, and not in the ways you might expect.
The GPU has so much headroom that they fit language models in there!
When you have a frosted glass overlay, any pixel change anywhere near the overlay (not just directly underneath) requires the whole overlay to be redrawn, and that redraw is stalled waiting for the entire previous render pass to complete so the pixels are valid to read.
The GPU isn't busy in any of this. But it has to stay awake notably longer, which is the worst possible sin when it comes to power efficiency and heat management.
It's true that GPU is itself not busy during a lot of this because it's waiting on pixels, but whatever is preparing the pixels (copying memory) is super busy.
Downscaling is a win not just for the blurring, but primarily the compositing. KDE describes the primary constraint as the number of windows and how many of them need to be blended:
https://userbase.kde.org/Desktop_Effects_Performance#Blur_Ef...
Note that there are some differences when it's the display server that has to blur general output content on behalf of an app not allowed to see the result, vs. an app that is just blurring its own otherwise opaque content, but it's costly regardless.
(There isn't really anything like on-screen vs. off-screen, just buffers you render to and consume. Updating window content is a matter of submitting a new buffer to show, updating screen content is a matter of giving the display device a new buffer to show. For mostly hysterical raisins, these APIs tend to still have platform abstractions for window/surface management, but underneath these are just mini-toolkits that manage the buffers and hand them off for you.)
https://developer.apple.com/documentation/Metal/customizing-...
The buffers are not that different, it really just means “extra allocation”
> Now, with the powerful advances in our hardware, silicon, and graphics technologies, we have the opportunity (…)
https://www.youtube.com/watch?v=jGztGfRujSE&t=42s
Coupled with the reports of sluggish performance from the early betas, it’s understandable people would reach the conclusion that the new design pushes the hardware significantly more than before.
a.) Compute-cycles: Some added passes to apply additional shading on top of the completed render, or
b.) Power-consumption: Some added delay in putting components to sleep (reducing CPU/GPU-clock) on every screen update.
Deferred sleep for a portable, battery-powered device because of a longer UI-rendering pipeline can easily add up over time.
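A back-of-envelope sketch of (b), with illustrative numbers only - the extra awake time per frame and the GPU power draw below are assumptions, not measurements:

```python
def extra_energy_mwh_per_hour(extra_ms_per_frame, fps, gpu_active_mw):
    """Energy cost of keeping the GPU awake slightly longer on every frame."""
    extra_s_per_hour = extra_ms_per_frame / 1000 * fps * 3600
    # mW * seconds / 3600 = mWh
    return gpu_active_mw * extra_s_per_hour / 3600

# Assumed: 2 ms extra awake time per frame at 120 Hz,
# GPU drawing ~1.5 W while active.
print(extra_energy_mwh_per_hour(2, 120, 1500))  # prints 360.0
```

Under those (made-up) numbers that's 360 mWh per hour of continuous animation - a percent or so of a tablet-sized battery per hour, small per frame but exactly the kind of thing that "adds up over time".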
--
I'd be quite interested to see some technical analysis on this (although more out of technical curiosity than the assumption that there is something huge to be uncovered here...).
There's also the aspect of iOS prioritizing GUI-rendering over other processing to maintain touch-responsiveness, fluidity, etc. Spending more xPU-time on the GUI potentially means less/later availability for other processes.
For sure, non-native apps trying to emulate this look (e.g. Flutter) will create a significantly higher impact on the power profile of a device than a native app.
Sampling has a cost. Doing it with the previous frame makes it cheaper, but when half your products are running at 60FPS you can't afford to do that, so they're sampling the current frame.
Additional composition work has a cost, additional pixel shader work has a cost, additional video memory bandwidth has a cost. Glass effects that might sample pixels way outside just neighboring pixels have a cost.
Is liquid glass as expensive as some heavy-duty visual effects like HBAO+, or lighting a scene with a hundred light sources? No. But pretending it's "free" is a dumbass take, and Hashimoto writing half-baked shaders once for ghostty does not make him an expert in GPU performance. A GPU running for longer and having dependencies between passes is, fundamentally, more work.
(Additionally, the top answers to that tweet are a who's who of dogshit tech influencers and wannabe rendering dumbasses regularly having the worst opinions one can imagine. Quite telling)
This creates a clearer divide between fully native iOS applications and React Native-style "build once and cross-compile" platforms.
Looking back at the announcement, it's more likely that management had to decide on the key message for the unveiling prior to the event, and there wasn't much "media-disruptive" material to choose from.
So Liquid Glass got elevated to top priority and then all teams got the order to ensure it is present in as many apps as possible.
As someone currently in the process of building such 3rd-party frameworks, my personal inclination is still to integrate with the platform. Less work for me, as the platform handles rendering, keyboard events, etc.
The only issue is that on iOS, UIKit is far easier to interact with than SwiftUI. And Swift is too large a language, and SwiftUI not simple enough, for my taste. It's just too bad it is difficult to piggyback on what exists in SwiftUI and is not available in UIKit (yet). There are ways to bridge in a tripartite way, perhaps... But I digress...
Do you believe pixel shaders are some unique magic that only Apple has the secret sauce to and no one else can use? There were some efficient implementations of liquid glass for Flutter before the iOS beta was even released. Glass effects are a dime a dozen on Shadertoy; they're one of the most basic effects you learn when first studying texture sampling.
The platforms that will take a large performance hit are the ones that can't drop down to the native platforms, i.e. the web.
(Actually, someone probably will. People are weird that way.)
I'm pretty sure any recreation will still be just that little bit off compared to the native one, which is their intention IMO.
> I have this little web app I built for my kids to help them manage their day. It has those tiles that animate when you hover on them.
I have absolutely no idea what “those tiles” are. They are familiar to the author, but he has not bothered to explain them well enough to deserve that familiar “those.” AI would have explained them better.
This isn’t AI slop, just sloppy writing.
Also, a sibling comment suggests that "those tiles" is some sort of slop; but I find it no more sloppy than "this little web app" in the preceding sentence. Both are handwavy markers of imprecision common in oral speech. A comment on English Stack Exchange points out that this feature is referred to as the "indefinite this" [0].
[0] - https://english.stackexchange.com/questions/389637/using-thi...
But at this point I'm seeing it everywhere... the "Trash question? Trash answer" format posed as poetry EVERYWHERE, and it is correlated with slop, and I'm finding it very annoying to read. I might (to my own detriment perhaps) start factoring that into what I'm gonna continue reading.
Examples of that question format in this article:
- Liquid Glass? That's what your M4 CPU is for
- That glassy transparency and window animations? A notorious resource hog that brought mid-2000s hardware to its knees
- The moment a single tile wiggles? The entire UI crawls
- Checking mail? Browsing? Streaming? Your M4 is bored out of its silicon mind
- That whole section on: Battery life? ... Thermals? ... Future-proofing? ... Real workloads? ... ugh.
- Is it worth it? For Apple’s vibe? Probably. But next time your fan whispers or your battery dips faster than expected… maybe blame the glass.
And apart from that question format, there's another pattern I can't quite pin down, but it reads as slop. (I would really love to figure out what my brain flags as "off" in these - maybe sentence length variability, maybe colons, or maybe trying to be poetic or dramatic or too self-assured with basically no data/substance underneath, idk):
- Let’s be real: eye candy always comes at a price.
- When the system’s stressed, the pretty things break first.
- They chew through GPU/CPU time. Always have.
- Here’s my hot take: Apple knows exactly what they’re doing
- It's stealth bloat.
- You might not feel the drag today. That’s the point! The M4’s raw power is the perfect smokescreen. But those cycles aren’t free
- TL;DR: Liquid Glass is gorgeous tech debt. Your M4 can afford it… for now. But never forget: fancy pixels demand fancy math.
Basically the entire article at this point. There was one place which was a bit personalized about the web app he built for his kids where I was like OK at least something seems OK, but as another user pointed out "It has those tiles that animate when you hover on them" doesn't make any sense. What tiles. How are we supposed to know.
When one pixel changes underneath a blur, the entire blurred area needs to be redrawn, meaning that all elements on top need to be redrawn too. As the blur cannot render before the underlay is finished, the graphics pipeline is stalled. Fancy blurs look past the area immediately underneath to more accurately render lensing effects, meaning each output pixel reads a lot of input pixels.
When it comes to power and heat management, the goal is to be able to power the GPU down again as fast as possible, and this kind of thing prolongs it quite a bit. There may be a point where efficiency improvements make the result acceptable, but it's always going to be much worse than not doing it at all.
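A toy calculation of the invalidation blow-up described above: a tiny change under a blur dirties every output pixel whose kernel footprint overlaps it (assuming a square kernel for simplicity):

```python
def dirty_area_after_blur(dirty_w, dirty_h, radius):
    """Output pixels that must be recomputed when a dirty_w x dirty_h
    region changes underneath a blur of the given radius: the dirty
    rect grows by the kernel radius on every side."""
    return (dirty_w + 2 * radius) * (dirty_h + 2 * radius)

# A single changed pixel under a radius-24 blur:
print(dirty_area_after_blur(1, 1, 24))  # prints 2401
```

One pixel of change fans out to thousands of blurred pixels to recompute - and all of that work is serialized behind the pass that produced the underlay.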
Rendering is still done in bulk for the changed areas, avoiding rendering expensive elements (e.g., transformed video buffers, deeply layered effects, expensive shaders). It's a fundamental part of most UI frameworks.
The Gaussian blur and lensing effects would still slow things down by needing to fetch pixels from the render target to compute each fragment, versus painting opaque pixels.
And yeah, having a render step depend on the output of a previous non-trivial render step is Bad™.
Software renderers typically do the optimisation you're suggesting to reduce memory and CPU consumption, and this was a bigger deal back in the day when they were the only option. I think some VNC-like protocols benefit from this kind of lazy rendering, but the actual VNC protocol just diffs the entire frame.
On the GPU, the penalty for uploading textures via the bus negates the benefit, and the memory and processing burden is minimal relative to AAA games, which push trillions of pixel computations and use GBs of compressed textures. GPUs are built more like signal processors and have quite large bus sizes, with memory arranged to make adjacent pixels more local to each other. Their very nature makes the kinds of graphics demands of a 2D GUI very negligible.
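A minimal sketch of the frame-diffing idea mentioned above, at row granularity (real protocols work with rectangles and tiles, but the principle is the same):

```python
def dirty_rows(prev, curr):
    """Minimal frame diff: indices of rows that changed between frames.
    Frames are lists of row tuples; a VNC-like protocol would transmit
    only the changed rows rather than the whole framebuffer."""
    return [i for i, (a, b) in enumerate(zip(prev, curr)) if a != b]

prev = [(0, 0, 0)] * 4
curr = [(0, 0, 0), (1, 0, 0), (0, 0, 0), (0, 0, 0)]
print(dirty_rows(prev, curr))  # prints [1]
```

Note the trade-off the comment describes: the diff itself costs CPU time and memory traffic, which is why it pays off for software renderers and remote protocols but not for a GPU that can simply redraw.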
Pixel shaders are looking at this laughing at you. PS_OUTPUT is a single pixel whether you want it or not. PS wavefronts are usually very small, so you're still going to be doing a lot of sampling.
There's some truth to that, in the sense of "the hardware can handle it now" (but he also mentions Vista, which came out like 20 years ago...)
If it's actually resource intensive, then the logic would probably be "let's make all last-gen devices a bit slower to encourage people to upgrade..."
At least, that's been my experience when upgrading iOS. It's basically the same thing except mysteriously way slower. (I wonder if the CPU throttling thing was part of the iOS upgrade that made my phone slow to a crawl a few years ago.)
It's not for font smoothing; their fonts are rendered with anti-aliasing in the first place. The 2x scaling thing is due to how they handle non-integer scaling factors - or rather, how they don't: instead, they render at a higher resolution with an integer factor and then downscale the result to fit the display.
e.g. if you set a 3024x1964 Macbook to a virtual resolution of 1680x1050 (1.87x scale), it'll actually render internally at 3360x2100 (2x scale) and then squish that down to 3024x1964.
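The arithmetic from that example, as a small helper (the render-at-2x-then-downscale behavior is as described above; the function itself is just illustrative):

```python
def scaled_mode(native_w, native_h, virtual_w, virtual_h):
    """Non-integer display scaling as described: render internally at 2x
    the virtual resolution, then downscale to the native panel.
    Returns the internal resolution and the resulting downscale factor."""
    internal = (virtual_w * 2, virtual_h * 2)
    downscale = internal[0] / native_w
    return internal, round(downscale, 3)

# 3024x1964 panel set to a 1680x1050 virtual resolution:
print(scaled_mode(3024, 1964, 1680, 1050))  # prints ((3360, 2100), 1.111)
```

So the GPU is pushing more pixels than the panel has, plus a resampling pass, whenever you pick a non-default scaled mode.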
I wonder if it had enough capabilities to run the extra passes needed for all that blur.
So the problem might not be that it makes things slow, but that it prevents low-power modes. AKA: slightly less battery life, possibly unused hardware
Is it possible to turn the effects off? I mean something like accessibility settings to tone down the transparency and "glassiness"?
The CPU processes draw calls and runs the compositor, and most compositors are fairly straightforward. The GPU runs the rasterisation, shaders, culling, occlusion, etc to achieve the effect.
I wonder if OP at least tested macOS Tahoe to base his impressions upon.
Works fine on my M1 Macs sooooo whatever.
This is like saying: Wallpaper ? That's what your (Win 95 SD)RAM is for.
The OS should not consume all resources in a system. But tell that to Apple, Google or Microsoft.
Wasn't that even before the beta was released...?
I remember eye candy Linux desktop stuff like Enlightenment doing stuff like this back in the 90s and early 2000s. Not as pervasive or flashy, but similar: lots of translucency, skins, themes, etc. Ran fine on a 400 MHz machine, though at a much lower resolution.
It’s not free but I’m curious to see how significant it is. I also wonder if you can turn off the animated stuff.
The biggest gripe I have with the Tahoe shots I’ve seen is that they seem to show it wasting more space with larger margins, etc. I hate that trend. I have a huge ultrawide monitor and often find myself wanting even more screen real estate while I’m working. Stop wasting my pixels.
If it can't be disabled and ends up crippling my otherwise perfectly adequate M1 it will accelerate me to switch to daily use of Asahi Linux.
Edit: That's assuming the effects are even expensive for the GPU.
Defaults matter. Most people never change them.
this is a way to slow older models and get away with it.
They did fix it by giving people the option, but I wonder how many still opted for "yes, I want my phone to suddenly just die instead of slowing down".
And the fix for all this was a 50€ battery replacement.
I wonder why... /s