What Is HDR, Anyway?

213 points | _kush | 5/14/2025, 12:46:39 PM | lux.camera

Comments (98)

mxfh · 2h ago
Does anyone else find the hubris in the first paragraph's writing as off-putting as I do?

"we finally explain what HDR actually means"

Then it spends two-thirds of the article on a tone mapping expedition, only to never address the elephant in the room: the almost complete absence of predictable color management in consumer-grade digital environments.

UIs are hardly ever tested in HDR: I don't want my subtitles to burn out my eyes on an actual HDR display.

This is where you, the consumer, sitting in a properly dark environment for movie watching, are as vulnerable to light as when the window curtains are raised on a bright summer morning. (That brightness abuse by content is actually discussed here.)

Dolby Vision and Apple have the lead here as closed platforms; on the web it's simply not predictably possible yet.

My impression is that the best hope is the efforts of the Color on the Web Community Group.

https://github.com/w3c/ColorWeb-CG

sandofsky · 1h ago
> Does anyone else find the hubris in the first paragraph's writing as off-putting as I do?

> "we finally explain what HDR actually means"

No. Because it's written for the many casual photographers we've spoken with who are confused and asked for an explainer.

> Then it spends two-thirds of the article on a tone mapping expedition, only to never address the elephant in the room: the almost complete absence of predictable color management in consumer-grade digital environments.

That's because this post is about HDR and not color management, which is a different topic.

klausa · 44m ago
>No. Because it's written for the many casual photographers we've spoken with who are confused and asked for an explainer.

To be fair, it would be pretty weird if you found your own post off-putting :P

willquack · 38m ago
> That brightness abuse by content

I predict that HDR content on the web will eventually be disabled or mitigated in popular browsers, similar to how auto-playing audio content is no longer allowed [1].

Spammers and advertisers haven't caught on yet to how abusively attention-grabbing eye-searingly bright HDR content can be, but any day now they will, and then it'll be everywhere.

1. https://hacks.mozilla.org/2019/02/firefox-66-to-block-automa...

Diti · 1h ago
They also make no mention of transfer functions, which are the main mechanism explaining why the images "burn your eyes": content creators should use HLG (which has relative luminance) and not PQ (which has absolute luminance) when they create HDR content for the web.
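
For readers who want to see that difference concretely, here is a minimal Python sketch of the two curves (constants as published for BT.2100/ST 2084; illustrative only, not production code):

    import math

    # PQ (SMPTE ST 2084) EOTF: maps a 0..1 signal to *absolute* luminance in cd/m^2.
    def pq_eotf(signal: float) -> float:
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        p = signal ** (1 / m2)
        return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    # HLG inverse OETF: maps a 0..1 signal to *relative* scene light (0..1).
    # The display then scales this to its own peak brightness via the HLG OOTF.
    def hlg_inverse_oetf(signal: float) -> float:
        a, b, c = 0.17883277, 0.28466892, 0.55991073
        if signal <= 0.5:
            return signal * signal / 3
        return (math.exp((signal - c) / a) + b) / 12

    print(pq_eotf(0.75))           # ~1000 cd/m^2, regardless of the display
    print(hlg_inverse_oetf(0.75))  # ~0.27 of whatever peak the display chooses

In other words, a PQ code value asks for a specific number of nits no matter where it is shown, while an HLG code value only asks for a fraction of the display's peak, which degrades more gracefully on dimmer screens.
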
klausa · 1h ago
It's a blog for a (fancy) iPhone camera app.

Color management and handling HDR in UIs is probably a bit out of scope.

srameshc · 20m ago
I, on the other hand, never thought or cared much about HDR before, though I remember seeing it everywhere. But I feel the article explains it well and clearly, with examples, for someone like me who isn't very camera literate.
lordleft · 2h ago
Isn't that the point of the article? That the colloquial meaning of HDR is quite overloaded, and when people complain about HDR, they mean bad tone-mapping? I say this as someone as close to totally ignorant about photography as you can get; I personally thought the article was pretty spectacular.
mort96 · 1h ago
When I complain about HDR it's because I've intentionally set the brightness of pure white to a comfortable level, and then suddenly parts of my screen are brighter than that. You fundamentally can't solve that problem with just better tone mapping, can you?
Retr0id · 1h ago
You can for some definition of "solve", by tone-mapping all HDR content back down into an SDR range for display.
mort96 · 1h ago
Well yeah. I considered adding that caveat but elected not to because it's obvious and doesn't add anything to the conversation, since that's obviously not what's meant when the industry talks about "HDR". Should've remembered this is HN.
puzzlingcaptcha · 1h ago
But it's not the colloquial meaning; HDR is fairly well defined by, e.g., ITU-R BT.2100, which addresses colorimetry, luminance, and the corresponding transfer functions.
sandofsky · 1h ago
I don't think that's the colloquial meaning. If you asked 100 people on the street to describe HDR, I doubt a single person would bring up ITU-R BT.2100.
redczar · 11m ago
The colloquial meaning and the well-defined meaning are two different things in most cases, right?
roywiggins · 24m ago
I think you may be operating with an idiosyncratic definition of "colloquial"
PaulHoule · 1h ago
The bit about "confused" turns me off right away. The kind of high-pressure stereo salesman who hopes I am the kind of 'audiophile' who prevents me from calling myself an 'audiophile' (wants mercury-filled cables for a more 'fluid' sound) always presupposes the reader/listener is "confused".
gyomu · 2h ago
As a photographer, I get the appeal of (this new incarnation of) HDR content, but the practical reality is that photos posted in my feeds go from looking normal on my display to searing my retinas, while other content that was uniform white a second prior now looks dull gray.

It's late night here so I was reading this article in dark mode, at a low display brightness - and when I got to the HDR photos I had to turn down my display even more to not strain my eyes, then back up again when I scrolled to the text.

For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.

sandofsky · 2h ago
While it isn't touched on in the post, I think the issue with feeds is that platforms like Instagram have no interest in moderating HDR.

For context: YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold. I think the solution for HDR is a similar penalization based on log luminance or some other reasonable metric.

I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.

As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
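
A rough sketch of the log-luminance idea mentioned above (the function names and the 300-nit threshold here are hypothetical, purely to illustrate the shape of such a policy):

    import numpy as np

    def log_average_luminance(luminance_nits: np.ndarray) -> float:
        """Geometric mean ('log average') of per-pixel luminance, in nits."""
        eps = 1e-6  # avoid log(0) on pure-black pixels
        return float(np.exp(np.mean(np.log(luminance_nits + eps))))

    def should_penalize(luminance_nits: np.ndarray, limit_nits: float = 300.0) -> bool:
        # Hypothetical platform policy: demote content whose log-average
        # luminance exceeds a chosen limit, analogous to loudness normalization.
        return log_average_luminance(luminance_nits) > limit_nits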

tart-lemonade · 6m ago
> YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold.

For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume".

https://support.google.com/youtube/answer/14106294

I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...

SquareWheel · 33m ago
FYI: You wrote Chrome 14 in the post, but I believe you meant Android 14.
sandofsky · 30m ago
Thanks. Updated.
corndoge · 2h ago
The effect of HDR increasing views is explicitly mentioned in the article
nightpool · 1h ago
You are replying to the article's author.
dheera · 1h ago
> because bad HDR likely makes view counts go up

Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography because the platforms cease to show them to anyone, even my own followers.

beachwood23 · 2h ago
Completely agree. To me, HDR feels like the system is ignoring my screen brightness settings.

I set my screen brightness to a certain level for a reason. Please don’t just arbitrarily turn up the brightness!

There is no good way to disable HDR on photos for iPhone, either. Sure, you can turn off HDR on photos on your iPhone. But then, when you cast to a different display, the TV tries to display the photos in HDR, and it won't look half as good.

repelsteeltje · 2h ago
> To me, HDR feels like the system is ignoring my screen brightness settings.

You might be on to something there. Technically, HDR is mostly about profile signaling and therefore about interop. To support it in MPEG-DASH or HLS media you need to make sure certain codec attributes are mentioned in the XML or m3u8 manifest, but the actual media payload stays the same.

Any bit or bob being misconfigured or misinterpreted in the streaming pipeline will result in problems ranging from a slightly suboptimal experience to nothing working at all.

Besides HDR, "spatial audio" formats like Dolby Atmos are notorious for interop issues.

kllrnohj · 1h ago
> To me, HDR feels like the system is ignoring my screen brightness settings.

On both Android and iOS/macOS it's not that HDR is ignoring your screen brightness; rather, the brightness slider controls the SDR range, and yes, HDR can exceed that. That's the singular purpose of HDR, to be honest. All the other purported benefits of HDR are at best just about HDR video profiles and at worst just nonsense bullshit. The only thing HDR actually does is allow for brighter colors vs. SDR. When used selectively this really enhances a scene. But restraint is hard, and most forms of HDR content production are shit. The HDR images that newer iPhones and Pixel phones are capturing are generally quite good because they are actually restrained, but then ironically both of them have horrible HDR video that's just obnoxiously bright.

agos · 1m ago
You are right, but at least in my experience it's very easy for a modern iPhone to capture a bad HDR photo, usually because some small, strong highlight (often a specular reflection off a metallic object) causes the whole photo to be treated as HDR when its content doesn't need it.
dmos62 · 40m ago
That's not inherent to HDR, though. BFV (unless I'm confusing it with something else) has an HDR adjustment routine where you push a slider until the HDR white and the SDR white are identical. The same could be done for desktop environments. In my experience, HDR support is very lacking on PCs at the moment. You can't even play Dolby Vision on Windows, which is the only widely used HDR format with dynamic metadata.
Suppafly · 23m ago
>HDR support is very lacking in PCs atm.

I think it's because no one wants it.

zamadatix · 23m ago
On the browser spec side this is just starting to get implemented as a CSS property https://caniuse.com/mdn-css_properties_dynamic-range-limit so I expect it might start to be a more common thing in web tech based feeds given time.
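
If that property does ship broadly, feed-level mitigation might look something like this (a sketch based on the draft spec linked above; the `.feed` selector is made up, and `constrained-high` is the keyword aimed at mixed SDR/HDR content):

    /* Sketch per the css-color-hdr draft: let HDR images exceed SDR white a
       little, but keep them well below the display's full HDR peak. */
    .feed img,
    .feed video {
      dynamic-range-limit: constrained-high;
    }
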
skhameneh · 2h ago
I’m under the impression this is caused by the use of “HDR mode”(s) and poor adaptive brightness implementations on devices. Displays such as the iPad Pro w/ OLED are phenomenal and don’t seem to implement an overactive adaptive brightness. HDR content has more depth without causing brightness distortion.

In contrast, my TV will change brightness modes to display HDR content and disables some of the brightness adjustments when displaying HDR content. It can be very uncomfortably bright in a dark room while being excessively dim in a bright room. It requires adjusting settings to a middle ground resulting in a mixed/mediocre experience overall. My wife’s laptop is the worst of all our devices, while reviews seem to praise the display, it has an overreactive adaptive brightness that cannot be disabled (along with decent G2G response but awful B2W/W2B response that causes ghosting).

hypeatei · 2h ago
This happens on Snapchat too with HDR videos. Brightness increases while everything else dims... including the buttons.
gwbas1c · 32m ago
> AI cannot read your mind, so it cannot honor your intent.

This. I can always tell whether someone "gets" software development by whether they understand that computers can't read minds or infer intent the way a person can.

echo_time · 2h ago
Note for Firefox users - view the page in Chrome to see more of what they are talking about. I was very confused by some of the images, and it was a world of difference when I tried again in Chrome. Things began to make a lot more sense - is there a flag I am missing in Firefox on the Mac?
viraptor · 2h ago
Here's the tracking issue for HDR support: https://bugzilla.mozilla.org/show_bug.cgi?id=hdr
gwbas1c · 25m ago
I wasn't using Firefox, but I had the page open on an old monitor. I dragged the page to an HDR display and the images pop.
mikepurvis · 2h ago
Can confirm on Windows 11 with HDR enabled on my display— I see the photos in the article correctly on Chrome and they're a grey mess on Firefox.
cubefox · 1h ago
HDR support in Chrome (Android) still looks broken for me. For one, some of the images on the blog have a posterization effect, which is clearly wrong.

Second, the HDR effect seems to be implemented in a very crude way, which causes the whole Android UI (including the Android status bar at the top) to become brighter when HDR content is on screen. That's clearly not right. Though, of course, this might also be some issue of Android rather than Chrome, or perhaps of the Qualcomm graphics driver for my Adreno GPU, etc.

throwaway314155 · 1h ago
For what it's worth, your comment has me convinced I just "can't see" HDR properly because I have the same page side-by-side on Firefox and Chrome on my M4 MBP and honestly? Can't see the difference.

edit: Ah, nevermind. It seems Firefox is doing some sort of post-processing (maybe bad tonemapping?) on-the-fly as the pictures start out similar but degrade to washed out after some time. In particular, the "OVERTHROW BOXING CLUB" photo makes this quite apparent.

That's a damn shame Firefox. C'mon, HDR support feels like table stakes at this point.

edit2: Apparently it's not table stakes.

> Browser support is halfway there. Google beat Apple to the punch with their own version of Adaptive HDR they call Ultra HDR, which Chrome 14 now supports. Safari has added HDR support into its developer preview, then it disabled it, due to bugs within iOS.

at which point I would just say to `lux.camera` authors - why not put a big fat warning at the top for users with a Firefox or Safari (stable) browser? With all the emphasis on supposedly simplifying a difficult standard, the article has fallen for one of its most famous pitfalls.

"It's not you. HDR confuses tons of people."

Yep, and you've made it even worse for a huge chunk of people. :shrug: Great article n' all just saying.

aidenn0 · 1h ago
> A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.

This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.

gwbas1c · 29m ago
> I don't own a single 4k or HDR display

Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hifi-audio. I don't miss it when I watch the same show on one of my older TVs.

If you ever do get it, I suggest going for a TV that you watch with your full attention, and watching TV/movies in the dark. It's not very useful on a TV that you might turn on while doing housework, but it is very useful when you are actively watching with your full attention.

alerighi · 18m ago
I don't see the point of a 4K TV vs. a 1080p TV either. To me it's just marketing. I have both a 4K and a 1080p set at my house, and from a normal viewing distance (3-4 meters) you don't see a difference.

Also, in my country (Italy) TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K, if you have enough bandwidth to stream it at that resolution, which I don't have at my house. Sure, if you download pirated movies you can find them in 4K, if you have the bandwidth for it.

But even there, a well-done 1080p movie is sometimes better than a hyper-compressed 4K one, where you can see the compression artifacts.

To me, 1080p, and maybe even 720p, is enough for TV viewing. Sometimes I miss CRT TVs: they were low resolution, but they had much better picture quality than most modern 4K LCD TVs, where black scenes are gray (I know OLED exists, but it's too expensive and has other issues).

dmitshur · 1h ago
To demonstrate some contrast (heh) with another data point from someone closer to the other extreme, I’ve owned a very HDR-capable monitor (the Apple Pro Display XDR) since 2020, so that’s 5 years now. Content that takes full advantage of it is still rare, but it’s getting better slowly over time.
colechristensen · 1h ago
I have a screen which is "HDR", but what that means is that when you turn the feature on it just makes everything more muted; it doesn't actually have any more dynamic range. When you turn HDR on for a game, it basically just makes most things a muddier grey.

I also have a screen which has a huge gamut and blows out colors in a really nice way (a bit like the aftereffects of hallucinogens, it has colors other screens just don't) and you don't have to touch any settings.

My OLED TV has HDR and it actually seems like HDR content makes a difference while regular content is still "correct".

reaperducer · 1h ago
> This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.

People in the HN echo chamber over-estimate hardware adoption rates. For example, there are millions of people who went straight from CDs to streaming, without hitting the iPod era.

A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, and those who did had only one for the whole family.

Normal people aren't magpies who trash their kit every time something shiny comes along.

colechristensen · 1h ago
>there are millions of people who went straight from CDs to streaming, without hitting the iPod era

Who?

There was about a decade there where everyone who had the slightest interest in music had an mp3 player of some kind, at least in the 15-30 age bracket.

aidenn0 · 45m ago
I don't know if I count, but I never owned a dedicated MP3 player[1]. I listened to MP3s on my computer but used CDs and cassettes while on the move, until I got an Android phone with enough storage to hold my music collection.

1: Well my car would play MP3s burned to CDs in its CD player; not sure if that counts.

asafira · 1h ago
I did my PhD in Atomic, Molecular, and Optical (AMO) physics, and despite "optical" being part of that I realized midway that I didn't know enough about how regular cameras worked!

It didn't take very long to learn, and it turned out to be extremely important in the work I did during the early days at Waymo and later at Motional.

I wanted to pass along this fun video from several years ago that discusses HDR: https://www.youtube.com/watch?v=bkQJdaGGVM8 . It's short and fun, I recommend it to all HN readers.

Separately, if you want a more serious introduction to digital photography, I recommend the lectures by Marc Levoy from his Stanford course: https://www.youtube.com/watch?v=y7HrM-fk_Rc&list=PL8ungNrvUY... . I believe he runs his own group at Adobe now after leading a successful effort at Google making their pixel cameras the best in the industry for a couple of years. (And then everyone more-or-less caught up, just like with most tech improvements in the history of smartphones).

CarVac · 2h ago
HDR on displays is actually largely uncomfortable for me. They should reserve the brightest HDR whites for things like the sun itself and caustics, not white walls in indoor photos.

As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.

NBJack · 1h ago
HDR is really hard to get right apparently. It seems to get worse in video games too.

I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of weapons at high RPM on a screen that goes to 240Hz is basically a continuous flashbang for my eyes.

For a while, No Man's Sky in HDR mode was basically the color saturation of every planet dialed up to 11.

The only game I've enjoyed in HDR was a port from a console, Returnal. The use of HDR brights was minimalistic and tasteful, often reserved for certain particle effects.

the_af · 1h ago
There's a pretty good video on YouTube (more than one, actually) that explains how careless use of HDR in modern cinema is destroying the look and feel of cinema we used to like.

Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are brought back to "reasonable" brightness with HDR, really deep blacks are turned into flat greys, etc. The end result is the flat and washed out look of movies like Wicked. It's often correlated to CGI-heavy movies, but in reality it's starting to affect every movie.

NelsonMinar · 24m ago
Is there a consensus definition of what counts as "HDR" in a display? What is the "standard dynamic range" of a typical TV or computer monitor? Is it roughly the same for devices of the same age?

My understanding is that most SDR TVs and computer screens are around 200-300 nits (aka cd/m²). Is that the correct measure of the range of the display? The brightest white is 300 nits brighter than the darkest black?

sergioisidoro · 2h ago
Just one other thing. In analog you also have compensating developers, which exhaust faster in darker areas (or lighter areas, if you think in the negative) and allow the lighter areas more time to develop and show, and hence give some more control over the range. The same applies, to a lesser degree, with stand development, which uses very low dilutions of the developer and no agitation. So dodging and burning is not the only way to achieve higher dynamic range in analog photos.

About HDR on phones, I think they are the blight of photography. No more shadows and highlights. I find they are good at capturing family moments, but not as a creative tool.

CarVac · 1h ago
I wrote a raw processing app Filmulator that simulates stand development/compensating developers to give such an effect to digital photography.

I still use it myself but I need to redo the build system and release it with an updated LibRaw... not looking forward to that.

the__alchemist · 2h ago
So, HN, are HDR monitors worth it? I remember ~10 years ago delaying my monitor purchase for the HDR one that was right around the corner, but never (in my purchasing scope) became available. Time for another look?

The utility of HDR (as described in the article) is without question. It's amazing looking at an outdoors (or indoors with windows) scene with your Mk-1 eyeballs, then taking a photo and looking at it on a phone or PC screen. The pic fails to capture what your eyes see for lighting range.

kllrnohj · 2h ago
HDR gaming: Yes.

HDR full screen content: Yes.

HDR general desktop usage: No. In fact you'll probably actively dislike it to the point of just turning it off entirely. The ecosystem just isn't ready for this yet, although with things like the "constrained-high" concepts ( https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-lim... ) this might, and hopefully does, change & improve to a more pleasing result

Also, this is assuming an HDR monitor that's also a good match for your ambient environment. The big thing nobody really talks about with HDR is that it's really dominated by how dark you're able to get your surrounding environment, such that you can push your display "brightness" (read: SDR whitepoint) lower and lower. OLED HDR monitors, for example, look fantastic in SDR and fantastic in HDR in a dark room, but if you have typical office lighting and so want an SDR whitepoint of around 200-300 nits? Yeah, they basically don't do HDR at all anymore at that point.

wirybeige · 32m ago
I use HDR for general usage. Windows ruins non-HDR content when HDR is enabled due to its choice of the sRGB transfer function. Luckily, every Linux DE has chosen the gamma 2.2 transfer function, which looks fine for general usage.

I use a mini-LED monitor and it's quite decent, except for starfields; it's very usable even in bright conditions, and HDR video still looks better in bright conditions than the equivalent SDR video.

https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm

arduinomancer · 24m ago
HDR on the desktop in Windows looks straight-up broken on some HDR monitors I've tried.

Like totally washed out.

eschatology · 2h ago
Yes but with asterisks; Best way I can describe it:

You know the 0-10 brightness slider you have to pick at the start of a game? Imagine setting it to 0 and still being able to spot the faint dark spot. The dynamic range of things you can see is so much expanded.

Early HDR screens were very limited (limited dimming zones, buggy implementations), but if you get one post-2024 (especially the OLED ones) they are quite decent. However, it needs to be supported at many layers: not just the monitor, but also the operating system and the content. There are not many games with a proper HDR implementation, and even when there is one, it may be bad and look worse. The OS can hijack the rendering pipeline and provide an HDR map for you (Nvidia RTX HDR), which is a gamble: it may look bleh, but sometimes also better than the native HDR implementation the game has.

But when everything works properly, wow it looks amazing.

kllrnohj · 1h ago
> You know the 0-10 brightness slider you have to pick at the start of a game? Imagine setting it to 0 and still being able to spot the faint dark spot. The dynamic range of things you can see is so much expanded.

Note that HDR only actually changes how bright things can get. There's zero difference in the dark regions. This is made confusing because HDR video marketing often claims it does, but it doesn't actually. HDR monitors do not, in general, have any advantage over SDR monitors in terms of the darks. Local dimming zones improve dark contrast. OLED improves dark contrast. Dynamic contrast improves dark contrast. But HDR doesn't.

eschatology · 1h ago
My understanding is that on the darker scenes (say, 0 to 5 in the brightness slider example), there is difference in luminance value with HDR but not SDR, so there is increased contrast and detail.

This matches my experience; 0 to 5 look identically black if I turn off HDR

kllrnohj · 56m ago
You may have a monitor that only enables local dimming zones when fed an HDR signal but not when fed an SDR one, but that would be unusual and certainly not required, and likely something you could change in your monitor's controls. On things like an OLED, though, there's no difference in the darks. You'd potentially see a difference between 8-bit and 10-bit, depending on what "0 to 5" means, but 10-bit SDR is absolutely a thing (it predates HDR, even).

But like if you can't see a difference between 0 to 5 in a test pattern like this https://images.app.goo.gl/WY3FhCB1okaRANc28 in SDR but you can in HDR then that just means your SDR factory calibration is bad, or you've fiddled with settings that broke it.

nfriedly · 1h ago
A lot of monitors that advertise HDR support really shouldn't. Many of them can decode the signal but don't have the hardware to accurately reproduce it, so you just end up with a washed out muddy looking mess where you're better off disabling HDR entirely.

As others here have said, OLED monitors are generally excellent at reproducing an HDR signal, especially in a darker space. But they're terrible for productivity work because they'll get burn-in from images that don't change a lot. They're fantastic for movies and gaming, though.

There are a few good non-OLED HDR monitors, but not many. I have an AOC Q27G3XMN; it's a 27" 1440p 180Hz monitor that is good for entry-level HDR, especially in brighter rooms. It has over 1000 nits of brightness and no major flaws. It only has 336 backlight zones, though, so you might notice some blooming around subtitles or other fine details where dark and light content sit close together. (VA panels are better than IPS at suppressing that, though.) It's also around half the price of a comparable OLED.

Most of the other non-OLED monitors with good HDR support have some other deal-breaking flaws, or at least major annoyances, like latency, screwing up SDR content, buggy controls, etc. The Monitors Unboxed channel on YouTube and rtings.com are both good places to check.

Jhsto · 2h ago
I've been thinking of moving out of the Apple ecosystem, but after seeing Severance on my iPhone Pro screen I feel like I want to keep the option to have the same HDR experience, for movies specifically. With HDR support landing in Linux just a month ago, I'm inclined to spend on a good monitor. However, I have an IPS HDR 600 monitor and I never felt that its screen was as glorious as the iPhone's.

I'd also be interested in hearing whether it makes sense to look into OLED HDR 400 screens (Samsung, LG), or whether it's really necessary to get an Asus ProArt that can push the same 1000-nit average as the Apple XDR display (which, mind you, is IPS).

SomeoneOnTheWeb · 2h ago
If you have a display that can hit roughly 1000 nits, then for movies and games, yes, the difference from SDR is definitely pretty huge.

If you have, say, a 400-nit display, HDR may actually look worse than SDR. So it really depends on your screen.

baq · 53m ago
I've had an OLED TV since 2017 and the answer is a resounding yes... if you get an OLED and use it for movies or full screen gaming. Anything else is basically pointless.

For desktop work, don't bother unless your work involves HDR content.

SebastianKra · 1h ago
Apple's displays, yes. But I got a Philips 4K OLED recently, and I'm already regretting that decision. I need to turn it off every 4 hours to refresh the pixels. Sometimes an entire line of pixels is brighter than the rest. I wiped it with a cloth while the pixel refresh was running, and then saw burned-in streaks in the direction of the wipe.

And that's now, while all the LEDs are still fresh. I can't imagine how bad it will be in a few years.

Also, a lot of software doesn't expect the subpixel arrangement, so text will often look terrible.

esperent · 2h ago
I think it depends on the screen and also what you use it for. My OLED is unusable for normal work in HDR because it's designed around only a small portion of the screen being at max brightness - reasonable for a game or movie, but the result is that a small window with white background will look really bright, but if I maximize it, it'll look washed out, grey not white.

Also the maximum brightness isn't even that bright at 800 nits, so no HDR content really looks that different. I think newer OLEDs are brighter though. I'm still happy with the screen in general, even in SDR the OLED really shines. But it made me aware not all HDR screens are equal.

Also, in my very short experiment using HDR for daily work I ran into several problems, the most serious of which was the discovery that you can no longer just screenshot something and expect it to look the same on someone else's computer.

aethrum · 2h ago
For gaming, definitely. An HDR OLED monitor is so immersive.
whywhywhywhy · 2h ago
Dunno if it's just my screen or setup, but on Windows I have a Dell U4025QW and HDR on the desktop just looks strange, overly dull. It looks good in games, but I have to manually turn it on and off on the screen each time.

On my MacBook Pro it only activates when it needs to, but honestly I've only seen one video [1] that impressed me with it; the rest was completely meh. Not sure if that's because it's mostly iPhone photography you see in HDR, which is overall pretty meh looking anyway.

[1] https://www.youtube.com/watch?v=UwCFY6pmaYY I understand this isn't a true HDR process but someone messing with it in post, but it's the only video I've seen that noticeably shows colors you can't otherwise see on a screen.

tomatotomato37 · 1h ago
If anyone was hoping for a more technical explanation, I find these pages do a good job explaining the inner workings behind the format

https://docs.krita.org/en/general_concepts/colors/bit_depth....

https://docs.krita.org/en/general_concepts/colors/color_spac...

https://docs.krita.org/en/general_concepts/colors/scene_line...

perching_aix · 1h ago
I'm not entirely convinced that greedy influencers are to blame for people hating on overly bright content. Instead, I think something is different about how displays produce brightness compared to nature outside. Light outside can reach tens of thousands of nits, yet even 1000 nits is searing on a display. Is it that displays output polarized light? Is it the spectral distribution of the better displays being three really tight peaks? I can't tell you, but I suspect something isn't quite right.

All this aside, HDR and high brightness are different things; HDR is just a representational thing. You can go full send on your SDR monitor as well; you'll just see more banding. The majority of the article is just content marketing about how they perform automatic tone mapping anyway.

layer8 · 28m ago
> Light outside is supposed to reach up to tens of thousands of nits, yet even 1000 nits is searing on a display.

That’s a consequence of https://en.wikipedia.org/wiki/Adaptation_(eye). If you look at 1000 nits on a display in bright sunlight, with your eyes adapted to the bright surroundings, the display would look rather dim.

ziml77 · 1h ago
It's all down to the ambient light. That's why bias lighting is now a thing. Try putting a light behind your screen to massively brighten the wall behind it, the 1000 nit peaks will be far less harsh. And if you bring the device out into sunlight I suspect you will wish for everything about its output to be quite a bit brighter.
hatsunearu · 1h ago
Is there any workflow that can output HDR photos (like the real HDR kind, with metadata to tell the display to go into HDR mode) for photos shot with a mirrorless and not an iPhone?
sebstefan · 2h ago
For Halide's updated Image Lab demo about two-thirds of the way down the page (https://www.lux.camera/content/media/2025/05/skyline-edit-tr...), you made the demo so tall that desktop users can't see both the sky and the controls at the same time.

A lot of these design flaws are fixed by Firefox's picture-in-picture option, but for some reason, with the way you coded it, the prompt to pop it out as PiP doesn't show up.

bookofjoe · 1h ago
As a non-techie I represent the 99.9% of the population who haven't a clue what tone mapping etc. is: NO WAY would we ever touch the various settings possible as opposed to watching the TV/computer screen/etc. as it came out of the box.
dsego · 2h ago
Isn't the result of their tone mapping algo similar to adjusting shadow and highlight sliders in other software?
sandofsky · 1h ago
No. When you simply adjust shadows and highlights, you lose local contrast. In an early draft of the post there was an example, but it was cut for pacing.
randall · 1h ago
This was super awesome. Thanks for this! Especially the HDR photo reveal felt really awesome.
c-fe · 1h ago
I am still skeptical about HDR, as pretty much all the HDR content I see online is awful. But this post makes me believe that Lux/Halide can pull off HDR in a way that I will like. I am looking forward to Halide Mk3.
fogleman · 1h ago
So, if cameras have poor dynamic range, how are they getting away with a single exposure? They didn't explain that at all...
sandofsky · 1h ago
Human vision has around 20 stops of static dynamic range. Modern digital cameras can't match human vision (a $90,000 Arri Alexa boasts 17 stops), but they're way better than SDR screens.
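
Since "stops" trip people up: each stop is a doubling of light, so those figures correspond to roughly the following contrast ratios (a quick sanity-check sketch):

    # Each photographic "stop" doubles the light, so n stops spans a 2**n : 1 ratio.
    for label, stops in [("human vision (static)", 20), ("Arri Alexa", 17)]:
        print(f"{label}: ~{2 ** stops:,}:1")
    # human vision (static): ~1,048,576:1
    # Arri Alexa: ~131,072:1
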
cainxinth · 1h ago
I chuckled at "The Ed Hardy t-shirt of photography" for the early, overdone "HDR-mode" images.
alistairSH · 1h ago
"The Ed Hardy T-Shirt of Photography"

Literal snort.

esperent · 2h ago
This page crashed Brave on Android three times before I gave up.
mcjiggerlog · 26m ago
For me this crashed in Android WebView, Android Chrome, and Android Firefox. Impressive.
therealmarv · 1h ago
... something Linux desktops don't understand, and Macs only do well with their own displays, for videos. Guess who the winner is on the desktop: Windows oO
4ad · 2h ago
HDR is just a scene-referred image using absolute luminance.
kllrnohj · 2h ago
No, it isn't. Absolute luminance is a "feature" of PQ specifically used by HDR10(+) and most DolbyVision content (notably the DolbyVision as produced by an iPhone is not PQ, it's not "real" DolbyVision). But this is not the only form of HDR and it's not even the common form for phone cameras. HLG is a lot more popular for cameras and it is not in absolute luminance. The gainmap-based approach that Google, Apple, and Adobe are all using is also very much not absolute luminance, either. In fact that flips it entirely and it's SDR relative instead, which is a much better approach to HDR than what video initially went with.
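
For the curious, the gain-map idea is roughly the following (a deliberately simplified sketch of the concept, not the actual Ultra HDR or Adobe specification, which add metadata, offsets, and per-channel handling):

    def apply_gain_map(sdr_linear: float, gain_stops: float, display_headroom_stops: float) -> float:
        """Boost an SDR-relative linear pixel by its per-pixel gain, clamped to
        the headroom the current display actually has above SDR white."""
        boost = min(gain_stops, display_headroom_stops)
        return sdr_linear * (2.0 ** boost)

    # On an SDR display (no headroom), the image is just the SDR base, unchanged:
    print(apply_gain_map(0.8, gain_stops=2.0, display_headroom_stops=0.0))  # 0.8
    # On a display with 2 stops of headroom, the same pixel renders 4x brighter:
    print(apply_gain_map(0.8, gain_stops=2.0, display_headroom_stops=2.0))  # ~3.2

Because the baseline is the SDR rendition, the same file degrades gracefully to a plain SDR image anywhere HDR isn't available, which is the "SDR relative" property being described.
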
pavlov · 2h ago
Ideally in the abstract it could be just that, but in practice it's an umbrella name for many different techniques that provide some aspect of that goal.
the__alchemist · 2h ago
Not in the more general sense! It can refer to what its acronym spells out directly: a bigger range between the dimmest and brightest capabilities of a display, imaging technique, etc.
4ad · 2h ago
No. HDR can encode high dynamic range because (typically) it uses floating point encoding.

From a technical point of view, HDR is just a set of standards and formats for encoding absolute-luminance scene-referred images and video, along with a set of standards for reproduction.

cornstalks · 2h ago
No. HDR video (and images) don't use floating point encoding. They generally use a higher bit depth (10 bits or more vs 8 bits) to reduce banding and different transfer characteristics (i.e. PQ or HLG vs sRGB or BT.709), in addition to different YCbCr matrices and mastering metadata.

And no, it's not necessarily absolute luminance. PQ is absolute, HLG is not.

skhameneh · 1h ago
Isn’t HLG using floating point(s)?

Also DCI-P3 should fit in here somewhere, as it seems to be the most standardized color space for HDR. I would share more insight, if I had it. I thought I understood color profiles well, but I have encountered some challenges when trying to display in one, edit in another, and print “correctly”. And every device seems to treat color profiles a little bit differently.

kllrnohj · 1h ago
> Isn’t HLG using floating point(s)?

All transfer functions can generally work on either integer range or floating point. They basically just describe a curve shape, and you can have that curve be over the range of 0.0-1.0 just as easily as you can over 0-255 or 0-1023.

Extended sRGB is about the only thing that basically requires floating point, as it specifically describes 0.0-1.0 as being equivalent to sRGB and then has a valid range larger than that (you end up with something like -.8 to 2.4 or greater). And representing that in integer domain is conceptually possible but practically not really.

> Also DCI-P3 should fit in here somewhere, as it seems to be the most standardized color space for HDR.

BT2020 is the most standardized color space for HDR. DCI-P3 is the most common color gamut of HDR displays that you can actually afford, however, but that's a smaller gamut than what most HDR profiles expect (HDR10, HDR10+, and "professional" DolbyVision are all BT2020, a wider gamut than P3). Which also means most HDR content specifies a color gamut it doesn't actually benefit from, since all that HDR content is still authored to use only somewhere between the sRGB and DCI-P3 gamuts, because that's all anyone who views it will actually have.

cornstalks · 45m ago
You can read the actual HLG spec here: https://www.arib.or.jp/english/html/overview/doc/2-STD-B67v2...

The math uses real numbers but table 2-4 ("Digital representation") discusses how the signal is quantized to/from analog and digital. The signal is quantized to integers.

This same quantization process is done for sRGB, BT.709, BT.2020, etc. so it's not unique to HLG. It's just how digital images/video are stored.
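
As a concrete illustration of that quantization step, here is a sketch assuming the common "narrow range" scaling (the 219/16 convention used by BT.709/BT.2100-family video), rather than the exact tables in the spec:

    def quantize_narrow_range(signal: float, bit_depth: int = 10) -> int:
        """Map a non-linear signal in 0.0..1.0 to a 'narrow range' integer code
        (64..940 for 10-bit). Real specs also reserve a few codes at the
        extremes of the range for timing/signaling."""
        scale = 2 ** (bit_depth - 8)
        return round((219 * signal + 16) * scale)

    print(quantize_narrow_range(0.0))                # 64  -> black
    print(quantize_narrow_range(1.0))                # 940 -> nominal peak
    print(quantize_narrow_range(1.0, bit_depth=8))   # 235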