[HN Gopher] What is HDR, anyway?
       ___________________________________________________________________
        
       What is HDR, anyway?
        
       Author : _kush
       Score  : 749 points
       Date   : 2025-05-14 12:46 UTC (1 day ago)
        
 (HTM) web link (www.lux.camera)
 (TXT) w3m dump (www.lux.camera)
        
       | 4ad wrote:
       | HDR is just a scene-referred image using absolute luminance.
        
         | the__alchemist wrote:
         | Not in the more general sense! It can refer to what its acronym
         | spells out directly: a bigger range between the dimmest and
         | brightest capabilities of a display, imaging technique, etc.
        
           | 4ad wrote:
           | No. HDR can encode high dynamic range because (typically) it
           | uses floating point encoding.
           | 
           | From a technical point of view, HDR is just a set of
           | standards and formats for encoding absolute-luminance scene-
           | referred images and video, along with a set of standards for
           | reproduction.
        
             | cornstalks wrote:
             | No. HDR video (and images) don't use floating point
             | encoding. They generally use a higher bit depth (10 bits or
             | more vs 8 bits) to reduce banding and different transfer
             | characteristics (i.e. PQ or HLG vs sRGB or BT.709), in
             | addition to different YCbCr matrices and mastering
             | metadata.
             | 
             | And no, it's not necessarily absolute luminance. PQ is
             | absolute, HLG is not.
        
               | skhameneh wrote:
               | Isn't HLG using floating point(s)?
               | 
               | Also DCI-P3 should fit in here somewhere, as it seems to
               | be the most standardized color space for HDR. I would
               | share more insight, if I had it. I thought I understood
               | color profiles well, but I have encountered some
               | challenges when trying to display in one, edit in
               | another, and print "correctly". And every device seems to
               | treat color profiles a little bit differently.
        
               | kllrnohj wrote:
               | > Isn't HLG using floating point(s)?
               | 
               | All transfer functions can generally work on either
               | integer range or floating point. They basically just
               | describe a curve shape, and you can have that curve be
               | over the range of 0.0-1.0 just as easily as you can over
               | 0-255 or 0-1023.
               | 
               | Extended sRGB is about the only thing that basically
               | _requires_ floating point, as it specifically describes
               | 0.0-1.0 as being equivalent to sRGB and then has a valid
               | range larger than that (you end up with something like
               | -.8 to 2.4 or greater). And representing that in integer
               | domain is conceptually possible but practically not
               | really.
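               | 
               | A rough sketch in Python of that point (the constants are
               | the standard HLG OETF ones from ARIB STD-B67 / BT.2100;
               | the 10-bit wrapper is just an illustration):
               | 
               |     import math
               | 
               |     # HLG OETF constants (ARIB STD-B67 / BT.2100)
               |     A, B, C = 0.17883277, 0.28466892, 0.55991073
               | 
               |     def hlg_oetf(e):
               |         # scene light e in [0, 1] -> signal in [0, 1]
               |         if e <= 1.0 / 12.0:
               |             return math.sqrt(3.0 * e)
               |         return A * math.log(12.0 * e - B) + C
               | 
               |     # the same curve over a 10-bit integer code range:
               |     def hlg_oetf_10bit(e):
               |         return round(hlg_oetf(e) * 1023)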
               | 
               | > Also DCI-P3 should fit in here somewhere, as it seems
               | to be the most standardized color space for HDR.
               | 
               | BT2020 is the most standardized color space for HDR.
               | DCI-P3 is the most common color gamut of HDR displays
               | that you can actually afford, but that's a
               | smaller gamut than what most HDR profiles expect (HDR10,
               | HDR10+, and "professional" DolbyVision are all BT2020 - a
               | wider gamut than P3). Which also means most HDR content
               | specifies a color gamut it doesn't actually benefit from
               | having as all that HDR content is still authored to only
               | use somewhere between the sRGB and DCI-P3 gamut since
               | that's all anyone who views it will actually have.
        
               | cornstalks wrote:
               | You can read the actual HLG spec here:
               | https://www.arib.or.jp/english/html/overview/doc/2-STD-B67v2...
               | 
               | The math uses real numbers but table 2-4 ("Digital
               | representation") discusses how the signal is quantized
               | to/from analog and digital. The signal is quantized to
               | integers.
               | 
               | This same quantization process is done for sRGB, BT.709,
               | BT.2020, etc. so it's not unique to HLG. It's just how
               | digital images/video are stored.
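               | 
               | For illustration, the usual BT.2100 narrow-range ("video
               | level") quantization looks something like this in Python
               | (a sketch, not a verbatim transcription of table 2-4):
               | 
               |     def quantize_narrow(e_prime, bits=10):
               |         # non-linear signal in [0, 1] -> integer code
               |         scale = 2 ** (bits - 8)
               |         return round((219 * e_prime + 16) * scale)
               | 
               |     quantize_narrow(0.0)  # 64  (video black)
               |     quantize_narrow(1.0)  # 940 (video white)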
        
             | dahart wrote:
             | I think most HDR formats do not typically use 32 bit
             | floating point. The first HDR file format I can remember is
             | Greg Ward's RGBE format, which is also now more commonly
             | known as .HDR and I think is pretty widely used.
             | 
             | https://www.graphics.cornell.edu/~bjw/rgbe.html
             | 
             | It uses a type of floating point, in a way, but it's a
             | shared 8 bit exponent across all 3 channels, and the
             | channels are still 8 bits each, so the whole thing fits in
             | 32 bits. Even the .txt file description says it's not
             | "floating point" per-se since that implies IEEE single
             | precision floats.
             | 
             | Cameras and displays don't typically use floats, and even
             | CG people working in HDR and using, e.g., OpenEXR, might
             | use half floats more often than floats.
             | 
             | Some standards do exist, and it's improving over time, but
             | the ideas and execution of HDR in various ways preceded any
             | standards, so I think it's not helpful to define HDR as a
             | set of standards. From my perspective working in CG, HDR
             | began as a way to break away from 8 bits per channel RGB,
             | and it included improving both color range and color
             | resolution, and started the discussion of using physical
             | metrics as opposed to relative [0..1] ranges.
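             | 
             | The shared-exponent trick is simple enough to sketch in a
             | few lines of Python (this follows the logic of Ward's
             | reference rgbe.c, give or take):
             | 
             |     import math
             | 
             |     def float2rgbe(r, g, b):
             |         # three floats -> 8-bit mantissas + shared exponent
             |         v = max(r, g, b)
             |         if v < 1e-32:
             |             return (0, 0, 0, 0)
             |         m, e = math.frexp(v)  # v == m * 2**e, m in [0.5, 1)
             |         scale = m * 256.0 / v
             |         return (int(r * scale), int(g * scale),
             |                 int(b * scale), e + 128)
             | 
             |     def rgbe2float(r, g, b, e):
             |         if e == 0:
             |             return (0.0, 0.0, 0.0)
             |         f = math.ldexp(1.0, e - (128 + 8))
             |         return (r * f, g * f, b * f)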
        
         | kllrnohj wrote:
         | No, it isn't. Absolute luminance is a "feature" of PQ
         | specifically used by HDR10(+) and _most_ DolbyVision content
         | (notably the DolbyVision as produced by an iPhone is _not_ PQ,
         | it's not "real" DolbyVision). But this is not the only form of
         | HDR and it's not even the common form for phone cameras. HLG is
         | a lot more popular for cameras and it is not in absolute
         | luminance. The gainmap-based approach that Google, Apple, and
         | Adobe are all using is also very much not absolute luminance,
         | either. In fact that flips it entirely and it's _SDR relative_
         | instead, which is a _much_ better approach to HDR than what
         | video initially went with.
        
         | pavlov wrote:
         | Ideally in the abstract it could be just that, but in practice
         | it's an umbrella name for many different techniques that
         | provide some aspect of that goal.
        
       | CarVac wrote:
       | HDR on displays is actually largely uncomfortable for me. They
       | should reserve the brightest HDR whites for things like the sun
       | itself and caustics, not white walls in indoor photos.
       | 
       | As for tone mapping, I think the examples they show tend way too
       | much towards flat low-local-contrast for my tastes.
        
         | NBJack wrote:
         | HDR is really hard to get right apparently. It seems to get
         | worse in video games too.
         | 
         | I'm a huge fan of Helldivers 2, but playing the game in HDR
         | gives me a headache: the muzzle flash of weapons at high RPMs
         | on a screen that goes to 240Hz is basically a continuous
         | flashbang for my eyes.
         | 
         | For a while, No Man's Sky in HDR mode was basically the color
         | saturation of every planet dialed up to 11.
         | 
         | The only game I've enjoyed at HDR was a port from a console,
         | Returnal. The use of HDR brights was minimalistic and tasteful,
         | often reserved for certain particle effects.
        
           | simoncion wrote:
           | For a year or two after it launched, The Division 2 was a
           | really, really good example of HDR done right. The game had
           | (has?) a day/night cycle, and it had really good control of
           | the brightness throughout the day. More importantly, it made
           | very good use of the wide color gamut available to it.
           | 
           | I stopped playing that game for several years, and when I
           | went back to it, the color and brightness had been wrecked to
           | all hell. I have heard that it's received wisdom that gamers
           | complain that HDR modes are "too dark", so perhaps that's
           | part of why they ruined their game's renderer.
           | 
           | Some games that I think _currently_ have good HDR:
           | 
           | * Lies of P
           | 
           | * Hunt: Showdown 1896
           | 
           | * Monster Hunter: World (if you increase the game's color
           | saturation a bit from its default settings)
           | 
           | Some games that had decent-to-good HDR the last time I played
           | them, a few years ago:
           | 
           | * Battlefield 1
           | 
           | * Battlefield V
           | 
           | * Battlefield 2042 (If you're looking for a fun game, I do
           | NOT recommend this one. Also, the previous two are probably
           | chock-full of cheaters these days.)
           | 
           | I found Helldivers 2's HDR mode to have blacks that were WAY
           | too bright. In SDR mode, nighttime in forest areas was
           | _dark_. In HDR mode? It was as if you were standing in the
           | middle of a field during a full moon.
        
             | xienze wrote:
             | > I have heard that it's received wisdom that gamers
             | complain that HDR modes are "too dark", so perhaps that's
             | part of why they ruined their game's renderer.
             | 
             | A lot of people have cheap panels that claim HDR support
             | (read: can display an HDR signal) but have garbage color
             | space coverage, no local dimming, etc. and to them, HDR
             | ends up looking muted.
        
           | qingcharles wrote:
           | A lot of this is poor QA. When you start to do clever things
           | like HDR you have to test on a bunch of properly calibrated
           | devices of different vendors etc. And if you're targeting
           | Windows you have to accept that HDR is a mess for consumers
           | and even if their display supports it and their GPU supports
           | it, they might still have the drivers and color profiles
           | misconfigured. (And many apps are doing it wrong or weird,
           | even when they say they support it.)
           | 
           | Also (mostly) on Windows, or on videos for your TV: a lot of
           | cheap displays that say they are HDR are, to varying degrees,
           | hot garbage.
        
         | the_af wrote:
         | There's a pretty good video on YouTube (more than one,
         | actually) that explains how careless use of HDR in modern
         | cinema is destroying the look and feel of cinema we used to
         | like.
         | 
         | Everything is flattened, contrast is eliminated, lights that
         | should be "burned white" for a cinematic feel are brought back
         | to "reasonable" brightness with HDR, really deep blacks are
         | turned into flat greys, etc. The end result is the flat and
         | washed out look of movies like _Wicked_. It's often correlated
         | with CGI-heavy movies, but in reality it's starting to affect
         | every movie.
        
           | jiggawatts wrote:
           | The washed out grey thing was an _error_ that became a style!
           | 
           | Because HDR wasn't natively supported on most displays and
           | software, for a long time it was just "hacked in there" by
           | squashing the larger dynamic range into a smaller one using a
           | mathematical transform, usually a log function. When viewed
           | without the inverse transform this looks horribly grey and
           | unsaturated.
           | 
           | Directors and editors would see this aesthetic day in, day
           | out, with the final color grade applied only after a long
           | review process.
           | 
           | Some of them got used to it and _even came to like it_, and now
           | here we are: horribly washed out movies made to look like
           | that on purpose.
        
             | XorNot wrote:
             | The transition from Avengers to the later movies is very
             | noticeable, and one of the worst offenders, since the
             | source material really speaks against the choice.
        
             | qingcharles wrote:
             | What you said, it's definitely become a style. But, also, a
             | lot of these movies that look like ass on Joe Public's
             | OOGLAMG $130 85" Black Friday TV in his brightly-lit living
             | room actually look awesome if your entire setup is proper:
             | real HDR devices and software, a screen with proper OLED or
             | local dimming, calibrated to within an inch of its life,
             | etc., and you view them in a dark home theater.
        
               | the_af wrote:
               | True! But I think that movies adapted for TV must be made
               | for the average "good quality" screen, not the state of
               | the art that almost nobody owns. At least they should
               | look decent enough in a good quality (but not top-notch)
               | setup.
               | 
               | Also, the YouTube video I'm thinking of singles out
               | Wicked _as seen in movie theaters_. The image "as
               | intended" looks washed out and without contrast.
        
         | pornel wrote:
         | Most "HDR" monitors are junk that can't display HDR. The HDR
         | formats/signals are designed for brightness levels and viewing
         | conditions that nobody uses.
         | 
         | The end result is complete chaos: every piece of the pipeline
         | is doing something wrong, and then the software tries to
         | compensate for it by emitting doubly wrong data, without even
         | having reliable information about what it needs to compensate
         | for.
         | 
         | https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
        
           | esperent wrote:
           | What we really need is some standards that _everybody_
           | follows. The reason normal displays work so well is that
           | everyone settled on sRGB, and as long as a display gets close
           | to that, say 95% sRGB, everyone except maybe a few graphics
           | designers will have an equivalent experience.
           | 
           | But HDR is a minefield of different display qualities, color
           | spaces, and standards. It's no wonder that nobody gets it
           | right and everyone feels confused.
           | 
           | HDR on a display that has peak brightness of 2000 nits will
           | look completely different than a display with 800 nits, and
           | they both get to claim they are HDR.
           | 
           | We should have a standard equivalent to color spaces. Set,
           | say, 2000 nits as 100% of HDR. Then a 2000 nit display gets
           | to claim it's 100% HDR. A 800 nit display gets to claim 40%
           | HDR, etc. A 2500 nit display could even use 125% HDR in its
           | marketing.
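           | 
           | The labeling rule itself would be trivial (a sketch; the
           | 2000 nit reference is just my arbitrary pick from above):
           | 
           |     def hdr_percent(peak_nits, reference_nits=2000):
           |         return 100 * peak_nits / reference_nits
           | 
           |     hdr_percent(2000)  # 100.0
           |     hdr_percent(800)   # 40.0
           |     hdr_percent(2500)  # 125.0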
           | 
           | It's still not perfect - some displays (OLED) can only show
           | peak brightness over a portion of the screen. But it would be
           | an improvement.
        
             | pornel wrote:
             | The DisplayHDR standard is supposed to be that, but they've
             | ruined its reputation by allowing HDR400 to exist when
             | HDR1000 should have been the minimum.
             | 
             | Besides, HDR quality is more complex than just max nits,
             | because it depends on viewing conditions and black levels
             | (and everyone cheats with their contrast metrics).
             | 
             | OLEDs can peak at 600 nits and look awesome -- in a pitch
             | black room. LCD monitors could boost to 2000 nits and
             | display white on grey.
             | 
             | We have sRGB kinda working for color primaries and gamma,
             | but it's not _the_ real sRGB at 80 nits. It ended up being
             | relative instead of absolute.
             | 
             | A lot of the mess is caused by the need to adapt content
             | mastered for pitch black cinema at 2000 nits to 800-1000
             | nits in daylight, which needs very careful processing to
             | preserve highlights and saturation, but software can't rely
             | on the display doing it properly, and doing it in software
             | sends a false signal and risks the display correcting it twice.
        
       | the__alchemist wrote:
       | So, HN, are HDR monitors worth it? I remember ~10 years ago
       | delaying my monitor purchase for the HDR one that was right
       | around the corner, but never (in my purchasing scope) became
       | available. Time for another look?
       | 
       | The utility of HDR (as described in the article) is without
       | question. It's amazing looking at an outdoors (or indoors with
       | windows) scene with your Mk-1 eyeballs, then taking a photo and
       | looking at it on a phone or PC screen. The pic fails to capture
       | the lighting range your eyes see.
        
         | aethrum wrote:
         | For gaming, definitely. An HDR Oled monitor is so immersive.
        
         | esperent wrote:
         | I think it depends on the screen and also what you use it for.
         | My OLED is unusable for normal work in HDR because it's
         | designed around only a small portion of the screen being at max
         | brightness - reasonable for a game or movie, but the result is
         | that a small window with a white background will look really
         | bright, yet if I maximize it, it'll look washed out: grey, not
         | white.
         | 
         | Also the maximum brightness isn't even that bright at 800 nits,
         | so no HDR content really looks _that_ different. I think newer
         | OLEDs are brighter though. I'm still happy with the screen in
         | general, even in SDR the OLED really shines. But it made me
         | aware not all HDR screens are equal.
         | 
         | Also, in my very short experiment using HDR for daily work I
         | ran into several problems, the most serious of which was the
         | discovery that you can no longer just screenshot something and
         | expect it to look the same on someone else's computer.
        
           | simoncion wrote:
           | > ...the most serious of which was the discovery that you can
           | no longer just screenshot something and expect it to look the
           | same on someone else's computer.
           | 
           | To be pedantic, this has always been the case... Who the hell
           | knows what bonkers "color enhancement" your recipient has
           | going on on their end?
           | 
           | But (more seriously) it's very, very stupid that most systems
           | out there will ignore color profile data embedded in pictures
           | (and many video players ignore the same in videos [0]). It's
           | quite possible to tone-map HDR stuff so it looks reasonable
           | on SDR displays, but color management is like accessibility
           | in that nearly no one who's in charge of paying for software
           | development appears to give any shits about it.
           | 
           | [0] A notable exception to this is MPV. I can't recommend
           | this video player highly enough.
        
         | kllrnohj wrote:
         | HDR gaming: Yes.
         | 
         | HDR _full screen_ content: Yes.
         | 
         | HDR general desktop usage: No. In fact you'll probably actively
         | dislike it to the point of just turning it off entirely. The
         | ecosystem just isn't ready for this yet, although with things
         | like the "constrained-high" concepts (
         | https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-lim...
         | ) this might, and hopefully does, change & improve to a more
         | pleasing result.
         | 
         | Also this is assuming an HDR monitor that's also a good match
         | for your ambient environment. The big thing nobody really talks
         | about with HDR is that it's really dominated by how dark
         | you're able to get your surrounding environment such that you
         | can push your display "brightness" (read: SDR whitepoint) lower
         | and lower. OLED HDR monitors, for example, look fantastic in
         | SDR and fantastic in HDR _in a dark room_, but if you have
         | typical office lighting and so you want an SDR whitepoint of
         | around 200-300 nits? Yeah, they basically don't do HDR at all
         | anymore at that point.
        
           | wirybeige wrote:
           | I use HDR for general usage. Windows ruins non-HDR content
           | when HDR is enabled due to their choice of the sRGB transfer
           | function; luckily every Linux DE has chosen the gamma 2.2
           | transfer function instead, which looks fine for general usage.
           | 
           | I use a mini-LED monitor, and it's quite decent (except for
           | starfields). That makes it very usable even in bright
           | conditions, and HDR video still looks better in bright
           | conditions than the equivalent SDR video.
           | 
           | https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm
        
             | hbn wrote:
             | Windows HDR implementation is janky as hell. For months
             | after I got my monitor I couldn't take screenshots because
             | they'd all appear completely blown out, like you cranked
             | the brightness to 300%.
             | 
             | Eventually I did some digging and found there's a setting
             | in Snipping Tool that just... makes screenshots work on HDR
             | displays.
             | 
             | It also seems to add another layer of Your Desktop Trying
             | To Sort Its Shit Out when launching a game that's full
             | screen. Sometimes it's fine, but some games like Balatro
             | will appear fine at first, but then when you quit back to
             | the desktop everything is washed out. Sleeping my PC and
             | waking it back up seems to resolve this.
             | 
             | I recently played through Armored Core VI, and it supports
             | HDR, but whenever I adjust my volume the screen becomes
             | washed out to display the volume slider. Screenshots and
             | recordings also appear washed out in the resulting file.
        
               | wirybeige wrote:
               | I always thought the "Your Desktop Trying To Sort Its
               | Shit Out" part was a necessary evil, but other platforms
               | don't suffer from this (at least from what I can tell);
               | the state of HDR on Windows is very disappointing, even
               | just adjusting the TF to gamma 2.2 would make it
               | substantially better. Watching all your non-HDR content's
               | blacks become gray is terrible. I assume the washed out
               | appearance comes from it giving up on doing SDR->HDR for
               | the desktop.
               | 
               | My brother got an OLED monitor & was telling me how bad
               | his experience was on Windows, & he recently switched to
               | Linux & does not have the issues he was complaining about
               | before. Ofc, there are downsides to HDR on Linux (no HDR
               | on Chromium, HDR on Firefox is unfinished) atm, but the
               | foundation seems better set for it.
        
               | qingcharles wrote:
               | Agree. Wide gamut and HDR is janky as hell on Windows. I
               | have multi-mon with one SDR and one HDR and that plays
               | havoc with things. But even Microsoft apps aren't
               | updated. I'm pretty certain even Explorer still doesn't
               | support HDR or wide gamut for thumbnails so everything
               | looks either under or oversaturated in the previews. If
               | you open stuff in the default Photos app there is a
               | "flash of content" where it displays it in the wrong
               | profile before it maps it to the correct one, too.
        
           | arduinomancer wrote:
           | HDR on desktop in windows looks straight up broken on some
           | HDR monitors I've tried
           | 
           | Like totally washed out
        
           | 98codes wrote:
           | Every few years I turn on that HDR toggle for my desktop PC,
           | and it never lasts longer than a day or two.
           | 
           | Top tip: If you have HDR turned on for your display in
           | Windows (at least, MacOS not tested) and then share your
           | screen in Teams, your display will look weirdly dimmed for
           | everyone not using HDR on their display--which is everyone.
        
           | Sohcahtoa82 wrote:
           | > HDR gaming: Yes.
           | 
           | The difference is absolutely _stunning_ in some games.
           | 
           | In MS Flight Simulator 2024, going from SDR to HDR goes from
           | looking like the computer game it is to looking life-like.
           | Deeper shadows with brighter highlights makes the scene pop
           | in ways that SDR just can't do.
           | 
           | EDIT: You'll almost certainly need an OLED monitor to really
           | appreciate it, though. Local dimming isn't good enough.
        
         | SomeoneOnTheWeb wrote:
         | _IF_ you have a display that can hit roughly 1000 nits, then
         | for movies and games, yes, definitely: the difference with SDR
         | is pretty huge.
         | 
         | If you have say a 400 nits display the HDR may actually look
         | _worse_ than SDR. So it really depends on your screen.
        
           | simoncion wrote:
           | Honestly, I find the extended brightness FAR less important
           | than the extended color gamut. I have a ~300 nit VA monitor
           | that I'm generally quite happy with and that looks fantastic
           | with well-built HDR renderers.
           | 
           | Given that monitors report information about their HDR
           | minimum and maximum panel brightness capabilities to the
           | machine they are connected to, any competently-built HDR
           | renderer (whether that be for games or movies or whatever)
           | will be able to take that information and adjust the picture
           | appropriately.
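           | 
           | As a toy example of what "adjust the picture appropriately"
           | can mean, here's an extended-Reinhard curve in Python aimed
           | at the display's reported peak (the 4000 nit mastering peak
           | is a made-up assumption):
           | 
           |     def tonemap(l_scene, l_display_peak, l_max=4000.0):
           |         # map scene nits so l_max lands on the display peak
           |         x = l_scene / l_display_peak  # 1.0 == display peak
           |         w = l_max / l_display_peak    # must map to 1.0
           |         y = x * (1.0 + x / (w * w)) / (1.0 + x)
           |         return y * l_display_peak     # back to nits
           | 
           |     tonemap(4000.0, 300.0)  # 300.0: highlight just saturates
           |     tonemap(100.0, 300.0)   # ~75: mid-tones compressed less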
        
         | eschatology wrote:
         | Yes, but with asterisks. The best way I can describe it:
         | 
         | You know the 0-10 brightness slider you have to pick at the
         | start of a game? Imagine setting it to 0 and still being able
         | to spot the faint dark spot. The dynamic range of things you
         | can see is so much expanded.
         | 
         | Early HDR screens were very limited (limited dimming zones,
         | buggy implementation) but if you get one post-2024 (esp. the
         | OLED ones) they are quite decent. However, it needs to be
         | supported at many layers: not just the monitor, but also the
         | operating system, and the content. There are not many games
         | with a proper HDR implementation; and even when there is one,
         | it may be bad and look worse -- the OS can hijack the rendering
         | pipeline and provide an HDR map for you (Nvidia RTX HDR), which
         | is a gamble: it may look bleh, but sometimes also better than
         | the native HDR implementation the game has.
         | 
         | But when everything works properly, wow it looks amazing.
        
           | kllrnohj wrote:
           | > You know the 0-10 brightness slider you have to pick at the
           | start of a game? Imagine setting it to 0 and still being able
           | to spot the faint dark spot. The dynamic range of things you
           | can see is so much expanded.
           | 
           | Note that HDR only actually changes how _bright_ things can
           | get. There's zero difference in the dark regions. This is
           | made confusing because HDR video marketing often claims it
           | does, but it doesn't actually. HDR monitors do not, in
           | general, have any advantage over SDR monitors in terms of the
           | darks. Local dimming zones improve dark contrast. OLED
           | improves dark contrast. Dynamic contrast improves dark
           | contrast. But HDR doesn't.
        
             | eschatology wrote:
             | My understanding is that on the darker scenes (say, 0 to 5
             | in the brightness slider example), there is a difference in
             | luminance value with HDR but not SDR, so there is increased
             | contrast and detail.
             | 
             | This matches my experience; 0 to 5 look identically black
             | if I turn off HDR.
        
               | kllrnohj wrote:
               | You _may_ have a monitor that only enables local dimming
               | zones when fed an HDR signal but not when fed an SDR one,
               | but that would be unusual and certainly not required. And
               | likely something you could change in your monitor's
               | controls. On things like an OLED, though, there's no
               | difference in the darks. You'd see a difference between
               | 8bit and 10bit potentially depending on what "0 to 5"
               | means, but 10-bit SDR is absolutely a thing (it predates
               | HDR even)
               | 
               | But like if you can't see a difference between 0 to 5 in
               | a test pattern like this
               | https://images.app.goo.gl/WY3FhCB1okaRANc28 in SDR but
               | you can in HDR then that just means your SDR factory
               | calibration is bad, or you've fiddled with settings that
               | broke it.
        
         | Jhsto wrote:
         | I've been thinking of moving out of the Apple ecosystem but
         | after seeing Severance on my iPhone Pro screen I feel like I
         | want to keep the option to have the same HDR experience for
         | movies specifically. With HDR support landing in Linux just a
         | month ago I'm inclined to spend on a good monitor. However, I
         | have an IPS HDR 600 monitor but I never felt that the screen was
         | as glorious as the iPhone screen.
         | 
         | I'd also be interested in hearing whether it makes sense to
         | look into OLED HDR 400 screens (Samsung, LG) or if it's really
         | necessary to get an Asus ProArt which can push the same 1000
         | nits average as the Apple XDR display (which, mind you, is
         | IPS).
        
         | whywhywhywhy wrote:
         | Dunno if it's just my screen or setup, but on Windows I have a
         | Dell U4025QW and HDR on the desktop just looks strange, overly
         | dull. Looks good in games but I have to manually turn it on and
         | off each time on the screen.
         | 
         | On my MacBook Pro, HDR only activates when it needs to, but
         | honestly I've only seen one video [1] that impressed me with
         | it; the rest was completely meh. Not sure if it's because it's
         | mostly iPhone photography you see in HDR, which is overall
         | pretty meh looking anyway.
         | 
         | [1] https://www.youtube.com/watch?v=UwCFY6pmaYY I understand
         | this isn't a true HDR process but someone messing with it in
         | post, but it's the only video I've seen that noticeably shows
         | you colors you can't see on a screen otherwise.
        
         | nfriedly wrote:
         | A lot of monitors that advertise HDR support really shouldn't.
         | Many of them can decode the signal but don't have the hardware
         | to accurately reproduce it, so you just end up with a washed
         | out muddy looking mess where you're better off disabling HDR
         | entirely.
         | 
         | As others here have said, OLED monitors are generally excellent
         | at reproducing a HDR signal, especially in a darker space. But
         | they're terrible for productivity work because images that
         | don't change a lot will burn in. They're fantastic
         | for movies and gaming, though.
         | 
         | There are a few good non-OLED HDR monitors, but not many. I
         | have an AOC Q27G3XMN; it's a 27" 1440p 180Hz monitor that is
         | good for entry-level HDR, especially in brighter rooms. It has
         | over 1000 nits of brightness, and no major flaws. It only has
         | 336 backlight zones, though, so you might notice some blooming
         | around subtitles or other fine details where there's dark and
         | light content close together. (VA panels are better than IPS at
         | suppressing that, though.) It's also around half the price of a
         | comparable OLED.
         | 
         | Most of the other non-OLED monitors with good HDR support have
         | some other deal-breaking flaws or at least major annoyances,
         | like latency, screwing up SDR content, buggy controls, etc. The
         | Monitors Unboxed channel on YouTube and rtings.com are both good
         | places to check.
        
           | Sohcahtoa82 wrote:
           | I think an OLED is basically an absolute necessity for HDR
           | content.
           | 
           | My current monitor is an OLED and HDR in games looks
           | absolutely amazing. My previous was an IPS that supported
           | HDR, but turning it on caused the backlight to crank to the
           | max, destroying black levels and basically defeating the
           | entire purpose of HDR. Local dimming only goes so far.
        
             | nfriedly wrote:
             | > _My previous was an IPS that supported HDR, but turning
             | it on caused the backlight to crank to the max, destroying
             | black levels and basically defeating the entire purpose of
             | HDR_
             | 
             | Yeah, that's kind of what I meant when I said that most
             | monitors that advertise HDR shouldn't.
             | 
             | The AOC monitor is the third or fourth one I've owned that
             | advertised HDR, but the first one that doesn't look like
             | garbage when it's enabled.
             | 
             | I haven't gone oled yet because of both the cost and the
             | risk of burn-in in my use case (lots of coding and other
             | productivity work, occasional gaming).
        
             | vlmutolo wrote:
             | Modern mini-LED monitors are very good. The "local" dimming
             | is so local that there isn't much light bleed even in the
             | worst-case situations (a cursor over a black background is
             | where it would be most apparent).
             | 
             | The advantage of LEDs is they're brighter. For example,
             | compare two modern Asus ProArt displays: their mini-LED
             | (PA32UCXR) at 1600 nits and their OLED (PA32DC) at 300ish
             | nits. The OLED is 20% more expensive. These two monitors
             | have otherwise comparable specs. Brightness matters a lot
             | for HDR because if you're in a bright room, the monitor's
             | peak brightness needs to overpower the room.
             | 
             | Plus for color managed work, I think LED monitors are
             | supposed to retain their calibration well. OLEDs have to be
             | frequently recalibrated.
             | 
             | And so-called micro-LEDs are coming soon, which promise to
             | make "local" so small that it's imperceptible. I think the
             | near-term future of displays is really good LEDs.
        
           | mxfh wrote:
           | Avoid cranking up the OLED brightness over 70% for static
           | content and absolutely never drive SDR-Reds into the HDR
           | range by using fake HDR modes while brightness is high.
           | 
           | I have a 2018 LG OLED that has some burnt-in Minecraft
           | hearts because of that: not from Minecraft itself, but from
           | just a few hours of Minecraft YouTube video in those settings
           | via the built-in YouTube client, and virtually no other
           | detectable issues after years of excessive use with static
           | content.
           | 
           | You only see them with fairly uniform colors as a background
           | where color banding would usually be my bigger complaint.
           | 
           | So burn-ins definitely happen, but they are far from being a
           | deal breaker given the obvious benefits you get vs other types
           | of displays.
           | 
           | And driving everything possible in dark mode (white text on
           | dark bg) on those displays is even the logical thing to do.
           | Then you don't need much max brightness anyway and even save
           | some energy.
        
         | SebastianKra wrote:
         | Apple's displays, yes. But I got a Philips 4K OLED recently,
         | and I'm already regretting that decision. I need to turn it off
         | every 4 hours to refresh the pixels. Sometimes an entire line
         | of pixels is brighter than the rest. I wiped it with a cloth
         | while pixel refresh was running, and then saw burned in streaks
         | in the direction of the wipe.
         | 
         | And that's now, while all the LEDs are still fresh. I can't
         | imagine how bad it will be in a few years.
         | 
         | Also, a lot of software doesn't expect the subpixel
         | arrangement, so text will often look terrible.
        
         | baq wrote:
         | I've had an OLED TV since 2017 and the answer is a resounding
         | yes... if you get an OLED and use it for movies or full screen
         | gaming. Anything else is basically pointless.
         | 
         | For desktop work, don't bother unless your work involves HDR
         | content.
        
         | EasyMark wrote:
         | for movies yes. for vim/vscode nope.
        
         | qingcharles wrote:
         | For the love of gods, read the reviews though. Most HDR
         | displays will make the picture _worse_ not better because they
         | have only implemented enough to put a sticker on the box.
         | 
         | You have to spend really good money to get a display which does
         | HDR properly.
        
         | pornel wrote:
         | HDR _when it works properly_ is nice, but nearly all HDR LCD
         | monitors are so bad, they're basically a scam.
         | 
         | The high-end LCD monitors (with full-array local dimming)
         | barely make any difference, while you'll get a lot of downsides
         | from bad HDR software implementations that struggle to get the
         | correct brightness/gamma and saturation.
         | 
         | IMHO HDR is only worth viewing on OLED screens, and requires a
         | dimly lit environment. Otherwise either the hardware is not
         | capable enough, or the content is mastered for wrong brightness
         | levels, and the software trying to fix that makes it look even
         | worse.
        
       | gyomu wrote:
       | As a photographer, I get the appeal of (this new incarnation of)
       | HDR content, but the practical reality is that the photos
       | posted in my feeds go from looking normal on my display to
       | searing my retinas, while other content that was uniform white
       | a second prior now looks dull gray.
       | 
       | It's late night here so I was reading this article in dark mode,
       | at a low display brightness - and when I got to the HDR photos I
       | had to turn down my display even more to not strain my eyes, then
       | back up again when I scrolled to the text.
       | 
       | For fullscreen content (games, movies) HDR is alright, but for
       | everyday computing it's a pretty jarring experience as a user.
        
         | beachwood23 wrote:
         | Completely agree. To me, HDR feels like the system is ignoring
         | my screen brightness settings.
         | 
         | I set my screen brightness to a certain level for a _reason_.
         | Please don't just arbitrarily turn up the brightness!
         | 
         | There is no good way to disable HDR on photos for iPhone,
         | either. Sure, you can turn off HDR for photos viewed on your
         | iPhone. But then, when you cast to a different display, the TV
         | tries to display the photos in HDR, and it won't look half as
         | good.
        
           | repelsteeltje wrote:
           | > To me, HDR feels like the system is ignoring my screen
           | brightness settings.
           | 
           | You might be on to something there. Technically, HDR is
           | mostly about profile signaling and therefore about interop.
           | To support it in MPEG-DASH or HLS media you need to make sure
           | certain codec attributes are mentioned in the XML or m3u8, but
           | the actual media payload stays the same.
           | 
           | Any bit or bob being misconfigured or misinterpreted in the
           | streaming pipeline will result in problems ranging from a
           | slightly suboptimal experience to nothing working at all.
           | 
           | Besides HDR, "spatial audio" formats like Dolby Atmos are
           | notorious for interop issues.
        
           | kllrnohj wrote:
           | > To me, HDR feels like the system is ignoring my screen
           | brightness settings.
           | 
           | On both Android & iOS/MacOS it's not that HDR is ignoring
           | your screen brightness, but rather the brightness slider is
           | controlling the SDR range, and then yes, HDR can exceed that;
           | that's the singular purpose of HDR, to be honest. All the
           | other purported benefits of HDR are at best just about HDR
           | _video_ profiles and at worst just nonsense bullshit. The
           | only thing HDR actually does is allow for brighter colors vs.
           | SDR. When used selectively this really enhances a scene. But
           | restraint is hard, and most forms of HDR content production
           | are shit. The HDR images that newer iPhones and Pixel phones
           | are capturing are generally quite good because they are
           | actually restrained, but then ironically both of them have
           | horrible HDR _video_ that's just obnoxiously bright.
        
             | agos wrote:
             | You are right, but at least in my experience it's very easy
             | for a modern iPhone to capture a bad HDR photo, usually
             | because there is some small strong highlight (often a form
             | of specular reflection from a metallic object) that causes
             | everything to be HDR while the photo content wouldn't need
             | it
        
               | altairprime wrote:
               | In beta testing this morning, the Halide "HDR slider"
               | works as intended to solve that. Some of my photos have
               | only needed +0.3 while a couple taken in near-total no-
               | light-pollution darkness have it cranked all the way to
               | max and that still isn't enough.
        
             | frollogaston wrote:
             | "On both Android & iOS/MacOS it's not that HDR is ignoring
             | your screen brightness, but rather the brightness slider is
             | controlling the SDR range and then yes HDR can exceed that"
             | 
             | Doesn't this mean HDR is ignoring my brightness setting?
             | Looking at the Mac color profiles, the default HDR has some
             | fixed max brightness regardless of the brightness slider.
             | And it's very bright, 1600 nits vs the SDR max of 600 nits.
             | At least I was able to pick another option capping HDR to
             | 600, but that still allows HDR video to force my screen to
             | its normal full brightness even if I dimmed it.
        
               | kllrnohj wrote:
               | > Doesn't this mean HDR is ignoring my brightness
               | setting?
               | 
               | Not exactly because it is still being scaled by your
               | brightness setting. As in, start playing an HDR video and
               | then mess with the brightness slider. You will still see
               | the HDR content getting dimmer/brighter.
               | 
               | It's easier to think about in Apple's EDR terms. 0.0-1.0
               | is the SDR range, and the brightness slider is changing
               | what the nit value is of "1.0" - is it 100 nits? 300
               | nits? 50 nits? etc... HDR content (in theory) still has
               | that same 0.0-1.0 portion of the range, and it's still
               | being scaled. However it can exceed that 1.0. It's still
               | being scaled, it's still "respecting" that slider. Just
               | the slider wasn't a brightness limit as you're wanting it
               | to be, but a 1.0 alignment point.
               | 
               | The problem comes when HDR content is disrespectful to
               | that. When it just absolutely _slams_ the brightness,
               | pushing all of its content way past that 1.0 value. This
               | is bad content, and unfortunately it's incredibly common
               | in HDR media due in part to the fact that the original
               | HDR specs are very incomplete and in part because it's a
               | new loudness war.
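               | 
               | A toy model of that EDR behavior in Python (numbers
               | illustrative, not Apple's actual pipeline):
               | 
               |     def edr_to_nits(edr_value, sdr_white_nits):
               |         # the slider picks what 1.0 means in nits;
               |         # SDR and HDR alike scale from that anchor
               |         return edr_value * sdr_white_nits
               | 
               |     edr_to_nits(1.0, 100)  # SDR white, dim slider: 100
               |     edr_to_nits(2.5, 100)  # HDR highlight: 250 nits
               |     edr_to_nits(1.0, 300)  # slider up: same content, 300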
        
               | frollogaston wrote:
               | Ah, just tested with https://github.com/dtinth/superwhite
               | and you're correct. I remember it not even scaling with
               | the brightness, but guess I was wrong.
        
             | LinAGKar wrote:
             | >HDR can exceed that
             | 
             | It's not just the HDR content that gets brighter, but SDR
             | content too. When I test it in Chrome on Android, if an HDR
             | image shows up on screen the phone starts overriding the
             | brightness slider completely and making everything
             | brighter, including the phone's system UI.
             | 
             | >The only thing HDR actually does is allow for brighter
             | colors vs. SDR.
             | 
             | Not just brighter, but also darker, so it can preserve
             | detail in dark areas rather than crushing them.
        
               | kllrnohj wrote:
               | > It's not just the HDR content that gets brighter, but
               | SDR content too. When I test it in Chrome on Android, if
               | an HDR image shows up on screen the phone start
               | overriding the brightness slider completely and making
               | everything brighter, including the phone's system UI.
               | 
               | You have an "old" style handling of HDR on Android.
               | Newer/better devices don't do that (specifically those
               | that support
               | https://source.android.com/docs/core/display/mixed-sdr-hdr)
               | 
               | Similarly MacOS/iOS doesn't do that.
               | 
               | > Not just brighter, but also darker, so it can preserve
               | detail in dark areas rather than crushing them.
               | 
               | It does not get darker, and while PQ allocates more bits
               | to the dark region, HLG does _not_. And, more importantly,
               | neither do the actual display panels, which are still
               | typically gamma 2.2-2.4 regardless. So PQ's extra
               | precision in the dark areas is ~never utilized other than
               | as tonemapping input, but the resulting output does not
               | have any increased precision in the darks over SDR.
               | 
               | In fact it actually has _less_ precision in the dark
               | areas as the increased display luminance range means the
               | panels native bit depth need to cover more range.
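               | 
               | To make that concrete, a rough comparison in Python of
               | the luminance step at the first 10-bit code above black
               | (BT.2100 PQ EOTF vs a plain gamma 2.2 panel; the 1000
               | nit panel peak is my assumption):
               | 
               |     M1, M2 = 2610 / 16384, 2523 / 4096 * 128
               |     C1 = 3424 / 4096
               |     C2, C3 = 2413 / 4096 * 32, 2392 / 4096 * 32
               | 
               |     def pq_eotf(e):
               |         # PQ signal in [0, 1] -> luminance in nits
               |         p = e ** (1 / M2)
               |         return 1e4 * (max(p - C1, 0)
               |                       / (C2 - C3 * p)) ** (1 / M1)
               | 
               |     def gamma22(e, peak=1000):
               |         return peak * e ** 2.2
               | 
               |     step = 1 / 1023   # first code above black
               |     pq_eotf(step)     # ~4e-5 nits: very fine step
               |     gamma22(step)     # ~2.4e-4 nits: what the panel
               |                       # itself actually resolves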
        
             | BlueTemplar wrote:
             | It isn't just about the brightness (range).
             | 
             | In practice the 'HDR' standards are also about wider color
             | gamuts (than sRGB), and (as mentioned in parallel) packed
             | into more bits, in a different way, so as to minimise
             | banding while keeping file sizes in check.
        
         | sandofsky wrote:
         | While it isn't touched on in the post, I think the issue with
         | feeds is that platforms like Instagram have no interest in
         | moderating HDR.
         | 
         | For context: YouTube automatically edits the volume of videos
         | that have an average loudness beyond a certain threshold. I
         | think the solution for HDR is similar penalization based on log
         | luminance or some other reasonable metric.
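         | 
         | A sketch of one such metric (203 nits is the BT.2408
         | reference white; treat the threshold and the whole rule as
         | placeholders, not tested values):
         | 
         |     import math
         | 
         |     def mean_log_luminance(pixels_nits):
         |         # geometric mean luminance: an "HDR loudness" proxy
         |         eps = 1e-4  # avoid log(0) on pure black pixels
         |         logs = sum(math.log(l + eps) for l in pixels_nits)
         |         return math.exp(logs / len(pixels_nits))
         | 
         |     def hdr_gain(pixels_nits, threshold_nits=203):
         |         # dim the whole image if it is "too loud" on average
         |         avg = mean_log_luminance(pixels_nits)
         |         return min(1.0, threshold_nits / avg)  # gain <= 1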
         | 
         | I don't see this happening on Instagram any time soon, because
         | bad HDR likely makes view counts go up.
         | 
         | As for the HDR photos in the post, well, those are a bit strong
         | to show what HDR can do. That's why the Mark III beta includes
         | a much tamer HDR grade.
        
           | corndoge wrote:
           | The effect of HDR increasing views is explicitly mentioned in
           | the article
        
             | nightpool wrote:
             | You are replying to the article's author.
        
           | dheera wrote:
           | > because bad HDR likely makes view counts go up
           | 
           | Another related parallel trend recently is that bad AI images
           | get very high view and like counts, so much so that I've lost
           | a lot of motivation for doing real photography because the
           | platforms cease to show them to anyone, even my own
           | followers.
        
           | SquareWheel wrote:
           | FYI: You wrote Chrome 14 in the post, but I believe you meant
           | Android 14.
        
             | sandofsky wrote:
             | Thanks. Updated.
        
           | tart-lemonade wrote:
           | > YouTube automatically edits the volume of videos that have
           | an average loudness beyond a certain threshold.
           | 
           | For anyone else who was confused by this, it seems to be a
           | client-side audio compressor feature (not a server-side
           | adjustment) labeled as "Stable Volume". On the web, it's
           | toggleable via the player settings menu.
           | 
           | https://support.google.com/youtube/answer/14106294
           | 
           | I can't find exactly when it appeared but the earliest
           | capture of the help article was from May 2024, so it is a
           | relatively recent feature:
           | https://web.archive.org/web/20240523021242/https://support.g...
           | 
           | I didn't realize this was a thing until just now, but I'm
           | glad they added it because (now that I think about it) it's
           | been a while since I felt the need to adjust my system volume
           | when a video was too quiet even at 100% player volume. It's a
           | nice little enhancement.
        
             | scraptor wrote:
             | The client side toggle might be new since 2024 but the
             | volume normalisation has been a thing for a long time.
        
               | ddingus wrote:
               | Yes, and I love it! Finally, control of the volume knob
               | was pulled from producers who were all about gimmicks to
               | push their productions.
               | 
               | There are still gimmicks, but at least they do not
               | include music so badly clipped as to be unlistenable...
               | hint: go get the DVD or Blu-Ray release of whatever it is
               | and you are likely to enjoy a not clipped album.
               | 
               | It is all about maximizing the overall sonic impact the
               | music is capable of. Now, when levels are sane and song
               | elements are well differentiated and equalized such that
               | no or only a minor range of frequencies is crushed by
               | many sounds all competing for them, it will sound full,
               | great, and not tiring!
               | 
               | Thanks audio industry. Many ears appreciate what was
               | done.
        
               | ddingus wrote:
               | I expected a moderate amount of heat directed at my
               | comment.
               | 
               | No worries. I've friends in various industries doing
               | production who hate the change.
               | 
               | I like it, of course. Losing the volume knob is a direct
               | result of the many abuses.
        
               | hbn wrote:
               | I know they've automatically boosted brightness in dark
               | scenes for a long time too. It's not rare for people to
               | upload a clip from a video game with a very dark scene
               | and have it end up way brighter after upload than it was
               | when they played, or than how it looks in the file they
               | uploaded.
        
             | ignaloidas wrote:
             | YouTube has long been normalizing the volume of videos in
             | the standard feed, switching to a -14 LUFS target in 2019.
             | But LUFS is a global target, meant to allow higher peaks
             | and troughs over the whole video, and the normalization
             | happens on a global level: if you exceed the target by 3dB,
             | the whole video gets its volume lowered by 3dB, no matter
             | whether a given part is quiet or not.
             | 
             | The stable volume thing is meant to essentially level out
             | all of the peaks and troughs, and IIRC it's actually
             | computed server-side; I think yt-dlp can download stable
             | volume streams if asked to.
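             | 
             | The global version is a one-liner in Python (assuming a
             | measured LUFS value is already available; the measurement
             | itself is the involved part):
             | 
             |     def normalization_gain_db(measured_lufs, target=-14.0):
             |         # one gain for the whole video; attenuate only
             |         return min(0.0, target - measured_lufs)
             | 
             |     normalization_gain_db(-11.0)  # -3.0 dB: turned down
             |     normalization_gain_db(-20.0)  #  0.0 dB: left alone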
        
           | spoonsort wrote:
           | Why is nobody talking about standards development? They
           | (OSes, image formats) could just say everything by default
           | assumes SDR, and if a media file explicitly calls for HDR,
           | even then it cannot have sharp transitions except in special
           | cases, with the software simply blocking or truncating any
           | non-conforming images. The OS should have had something like
           | this for sound about 25-30 years ago. For example, a
           | brightness-aware OS/monitor combo could just outright
           | disallow anything above x nits, and disallow certain
           | contrast levels in the majority of content.
        
           | altairprime wrote:
           | Instagram has to allow HDR for the same reason that Firefox
           | spent the past twenty years displaying web colors like HN
            | orange at maximum display gamut rather than sRGB-
            | calibrated: because a brighter red than anyone else's draws
           | people in, and makes the competition seem lifeless by
           | comparison, especially in a mixed-profiles environment.
           | _Eventually_ that is regarded as 'garishly bright', so to
           | speak, and people push back against it. I assume Firefox is
           | already fixing this to support the latest CSS color spec
           | (which defines #rrggbb as sRGB and requires it to be
           | presented as such unless stated otherwise in CSS), but I
           | doubt Instagram is willing to literally dim their feed;
           | instead, I would expect them to begin AI-HDR'ing SDR uploads
            | in order that all videos are captivatingly, garishly bright.
        
           | mrandish wrote:
           | > I think the solution for HDR is similar penalization based
           | on log luminance or some other reasonable metric.
           | 
           | I completely understand the desire to address the issue of
           | content authors misusing or intentionally abusing HDR with
           | some kind of auto-limiting algorithm similar to the way the
           | radio 'loudness wars' were addressed. Unfortunately, I
           | suspect it will be difficult, if not impossible, to achieve
           | without also negatively impacting some content applying HDR
           | correctly for artistically expressive purposes. Static photos
           | may be solvable without excessive false positive over-
           | correction but cinematic video is much more challenging due
           | to the dynamic nature of the content.
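            | 
            | A sketch of what such a penalty might look like (purely
            | illustrative -- the metric and the thresholds here are made
            | up, which is exactly the hard part):
            | 
            |     import numpy as np
            | 
            |     def brightness_gain(nits, budget_nits=300.0):
            |         # geometric mean luminance of a frame, loosely
            |         # analogous to LUFS-style integrated loudness
            |         avg = np.exp(np.mean(np.log(nits + 1e-6)))
            |         # global scale-down if the frame is over budget,
            |         # like lowering a whole video's volume
            |         return min(1.0, budget_nits / avg)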
           | 
            | As a cinephile, I'm starting to wonder if maybe HDR on
            | mobile devices simply isn't a solvable problem in practice.
           | While I think it's solvable technically and certainly
           | addressable from a standards perspective, the reality of
           | having so many stakeholders in the mobile ecosystem
           | (hardware, OS, app, content distributors, original creators)
           | with diverging priorities makes whatever we do from a base
           | technology and standards perspective unlikely to work in
            | practice for most users. Maybe I'm too pessimistic, but as a
            | high-end home theater enthusiast I'm continually dismayed by
            | how hard it is to correctly display diverse HDR content from
           | different distribution sources in a less complex ecosystem
           | where the stakeholders are more aligned and the leading
           | standards bodies have been around for many decades (SMPTE et
           | al).
        
             | amlib wrote:
             | I believe everything could be solved the same way we solved
             | high dynamic range in audio, with a volume control.
             | 
             | I find it pretty weird that all tvs and most monitors hide
             | the brightness adjustment under piles and piles of menus
             | when it could be right there in the remote alongside the
              | sound volume buttons. Maybe phones could have hardware
              | brightness buttons too - at least something as easy as
              | adjusting brightness on notebooks that have dedicated
              | brightness fn keys.
             | 
              | Such a brightness slider could also control the amount of
              | tonemapping applied to HDR content. High brightness would
              | mean no to low tonemapping, and low brightness would use a
              | very aggressive tonemapper, producing an image similar to
              | the SDR content alongside it.
              | 
              | Also note that good audio volume attenuation requires
              | proper loudness contour compensation (as you lower the
              | volume you also increase the bass and treble) for things to
              | sound reasonably good and the "tone" to stay well balanced.
              | So, adjusting the tonemapping based on brightness isn't
              | that far off what we do with audio.
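              | 
              | A minimal sketch of that coupling, with a Reinhard-style
              | curve standing in for the "very aggressive tonemapper"
              | (the names and numbers are illustrative):
              | 
              |     import numpy as np
              | 
              |     def tonemap(nits, slider, peak=1000.0):
              |         # slider in [0, 1]: 1.0 = bright setting, HDR
              |         # passes through; 0.0 = dim setting, roll off
              |         # toward an SDR-like ~100-nit paper-white look
              |         sdr_like = nits / (1.0 + nits / 100.0)
              |         passthrough = np.minimum(nits, peak)
              |         return slider * passthrough + (1 - slider) * sdr_like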
        
           | frollogaston wrote:
           | Btw, YouTube doesn't moderate HDR either. I saw one video of
           | a child's violin recital that was insanely bright, and
           | probably just by accident of using a bad HDR recorder.
        
         | hypeatei wrote:
         | This happens on Snapchat too with HDR videos. Brightness
         | increases while everything else dims... including the buttons.
        
         | skhameneh wrote:
         | I'm under the impression this is caused by the use of "HDR
         | mode"(s) and poor adaptive brightness implementations on
         | devices. Displays such as the iPad Pro w/ OLED are phenomenal
         | and don't seem to implement an overactive adaptive brightness.
         | HDR content has more depth without causing brightness
         | distortion.
         | 
         | In contrast, my TV will change brightness modes to display HDR
         | content and disables some of the brightness adjustments when
         | displaying HDR content. It can be very uncomfortably bright in
         | a dark room while being excessively dim in a bright room. It
         | requires adjusting settings to a middle ground resulting in a
          | mixed/mediocre experience overall. My wife's laptop is the
          | worst of all our devices: while reviews seem to praise the
          | display, it has an overreactive adaptive brightness that
          | cannot be disabled (along with decent G2G response but awful
          | B2W/W2B response that causes ghosting).
        
           | altairprime wrote:
           | Apple's method involves a good deal of what they call "EDR",
           | wherein the display gamma is ramped down in concert with
           | ramping the brightness up, so that the brighter areas get
           | brighter while the non-bright areas remain dark due to gamma
           | math; that term is helpful for searching their WWDC developer
           | videos for more details.
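            | 
            | A rough linear model of the invariant (the real pipeline
            | works through gamma curves, as described, but the net
            | effect is the same): the panel peak goes up while SDR
            | pixels are attenuated by the same factor, so SDR light
            | output is unchanged and values above 1.0 gain headroom.
            | 
            |     def edr_output_nits(signal, headroom, sdr_peak=500.0):
            |         # signal: 0.0-1.0 is SDR, above 1.0 is HDR highlight
            |         # headroom: how far the panel peak was ramped up
            |         panel_peak = sdr_peak * headroom
            |         return min(signal / headroom, 1.0) * panel_peak
            | 
            |     # SDR white holds steady while HDR gains range:
            |     # edr_output_nits(1.0, 1.0) == edr_output_nits(1.0, 2.0)
            |     # edr_output_nits(2.0, 2.0) == 1000.0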
        
             | kjkjadksj wrote:
              | It still looks like they say: it's way too bright, and it
              | makes formerly "bright" whites appear neutral grey around
              | the HDR content. Personally I find it extremely jarring
              | when it is just a frame that is HDR within the overall
              | screen. It is much nicer when the HDR content is full
              | screen. I wish I could just disable partial-screen HDR
              | and keep the full-screen implementation, because it is
              | that distracting.
        
               | altairprime wrote:
                | Out of curiosity -- I don't have Instagram to test and this
               | is a perceptual question anyways -- if you enable iOS
               | Settings > Accessibility > Display > Reduce White Point
               | (25%) and increase the screen brightness slightly to
               | compensate for the slight dimness; does that reduce or
               | eliminate the 'jarring'ness?
        
               | kjkjadksj wrote:
                | I'm talking about my MacBook, and that would be untenable
                | because it needs max brightness on non-HDR content in a
                | daylight-lit room.
        
         | dmos62 wrote:
          | That's not inherent to HDR though. BFV (unless I'm confusing
          | it with something else) has an HDR adjustment routine where
          | you push a slider until the HDR white and the SDR white are
          | identical. The same could be done for desktop environments.
          | In my experience, HDR support is very lacking on PCs atm. You
          | can't even play Dolby Vision on Windows, which is the only
          | widely-used HDR format with dynamic metadata.
        
           | Suppafly wrote:
           | >HDR support is very lacking in PCs atm.
           | 
           | I think it's because no one wants it.
        
             | altairprime wrote:
             | No; Windows 10 barely supports it, and of my hundreds of
             | Steam games, exactly none have any code making use of it;
             | seemingly only AAA mega-budget games have the pipeline
             | bandwidth, but e.g. Dune Imperium and Spaceways have no
             | reason to use it and so don't bother. Windows 11 focused on
             | improving wide color support which is much more critical,
             | as any game shipping for Mac/Win already has dealt with
             | that aspect of the pipeline and has drabified their game
             | for the pre-Win11 ICC nightmares. Even games like Elite
             | Dangerous, which would be a top candidate for both HDR
             | _and_ wide color, don't handle this yet; their pipeline is
             | years old and I don't envy them the work of trying to
             | update it, given how they're overlaying the in-game UI in
             | the midst of their pipeline.
             | 
             | (It doesn't help that Windows only allows HDR to be defined
             | in EDID and monitor INF files, and that PC monitors start
             | shutting off calibration features when HDR is enabled
             | because their chipsets can't keep up -- just as most modern
             | Sony televisions can't do both Dolby Vision _and_ VRR
             | because that requires too much processing power for their
             | design budget.)
        
               | Suppafly wrote:
                | Nothing you've written disproves that no one wants it; if
               | anything the fact that nothing really supports it implies
               | that it's not a valuable feature that people are
               | clamoring for.
        
               | altairprime wrote:
               | Given how popular it is in modern console AAA gaming,
               | which many people _do_ play on PCs, one can reasonably
               | expect that any studio desiring to target high contrast
               | gaming -- a valuable niche for stealth games, where a
               | bright torch should absolutely stand out brightly against
               | the darkest shadows, as well as star exploration and
               | anything with landscapes or specular highlights -- would
               | _like_ to be targeting HDR on PC, if only to simplify
               | their cross-platform build and test pipelines, but cannot
               | due to platform barriers that don't exist for consoles.
        
               | Suppafly wrote:
               | >Given how popular it is in modern console AAA gaming
               | 
               | What are some of the games where it's necessary?
        
               | altairprime wrote:
               | You'll have to do that research yourself, apologies; I
               | haven't looked for that data and so I don't have it
               | available for you.
        
               | qingcharles wrote:
               | It's an absolute mess on Windows. It's one area where I
               | totally bow to Apple's vertical integration which makes
               | this stuff flawless.
        
             | dmos62 wrote:
              | I want it. And I'm hard-pressed to imagine a multimedia
              | consumer that doesn't care about color and brightness
              | ranges, unless they don't understand what that is. Every
              | movie is recorded, and many phones now capture, in a range
              | that can't be encoded in 8 bits.
        
           | zamadatix wrote:
           | If you mean https://i.imgur.com/0LtYuDZ.jpeg that is probably
           | the slider GP wants but it's not about matching HDR white to
           | SDR white, it's just about clamping the peak HDR brightness
           | in its own consideration. The white on the left is the HDR
           | brightness according to a certain value in nits set via the
           | bottom slider. The white on the right is the maximally bright
           | HDR signal. The goal of adjusting the slider is to find how
           | bright of an HDR white your display can actually produce,
           | which is the lowest slider value at which the two whites
           | appear identical to a viewer.
           | 
           | Some games also have a separate slider
           | https://i.imgur.com/wenBfZY.png for adjusting "paper white",
           | which is the HDR white one might normally associate with
           | matching to SDR reference white (100 nits when in a dark room
           | according to the SDR TV color standards, higher in other
            | situations or standards). (Extra note: the peak brightness
            | slider in this game, Red Dead Redemption 2, is the same knob
            | as the brightness slider in the Battlefield V screenshot
            | above.)
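            | 
            | For a sense of how a game might consume the two calibrated
            | values once set (illustrative only -- not how either game
            | actually implements it):
            | 
            |     import numpy as np
            | 
            |     def to_display(scene, paper_white=200.0, peak=800.0):
            |         # scene: linear luminance array, 1.0 = SDR white
            |         # paper_white: nits for SDR white (second slider)
            |         # peak: nits the display was found to reach (first)
            |         nits = scene * paper_white
            |         knee = 0.75 * peak  # compress the top quarter
            |         over = nits > knee
            |         nits[over] = knee + (peak - knee) * np.tanh(
            |             (nits[over] - knee) / (peak - knee))
            |         return nits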
        
             | dmos62 wrote:
             | Thanks for clarifying this!
        
         | zamadatix wrote:
          | On the browser spec side this is just starting to get
          | implemented as a CSS property
          | (https://caniuse.com/mdn-css_properties_dynamic-range-limit),
          | so I expect it might become a more common thing in web-tech-
          | based feeds given time.
        
         | casey2 wrote:
          | This seems more like a "your feeds" problem than an HDR
          | problem, much in the same way people screencap and convert
          | images willy-nilly. I suggest blocking non-HDR content.
        
         | JKCalhoun wrote:
          | I experience the same thing you do -- but my take on it is
          | different. Being hit with HDR images (and videos on YouTube),
          | while unsettling, makes me realize just how damned dull the
          | SDR world I had been forced to succumb to has been.
         | 
         | Let the whole experience be HDR and perhaps it won't be
         | jarring.
        
       | echo_time wrote:
       | Note for Firefox users - view the page in Chrome to see more of
       | what they are talking about. I was very confused by some of the
       | images, and it was a world of difference when I tried again in
       | Chrome. Things began to make a lot more sense - is there a flag I
       | am missing in Firefox on the Mac?
        
         | viraptor wrote:
         | https://bugzilla.mozilla.org/show_bug.cgi?id=hdr there's the
         | tracking issue for HDR support.
        
           | frollogaston wrote:
           | Tbh, I'm glad this isn't supported in Firefox as of right now
        
         | mikepurvis wrote:
         | Can confirm on Windows 11 with HDR enabled on my display-- I
         | see the photos in the article correctly on Chrome and they're a
         | grey mess on Firefox.
        
           | lloeki wrote:
           | On macOS even without HDR enabled on my display there's a
           | striking difference due to better tone mapping between Safari
           | and Firefox.
           | 
           | If I enable HDR the Firefox ones become a gray mess vs the
           | lights feeling like actual lights in Safari.
        
         | throwaway314155 wrote:
         | For what it's worth, your comment has me convinced I just
         | "can't see" HDR properly because I have the same page side-by-
         | side on Firefox and Chrome on my M4 MBP and honestly? Can't see
         | the difference.
         | 
         | edit: Ah, nevermind. It seems Firefox is doing some sort of
         | post-processing (maybe bad tonemapping?) on-the-fly as the
         | pictures start out similar but degrade to washed out after some
         | time. In particular, the "OVERTHROW BOXING CLUB" photo makes
         | this quite apparent.
         | 
         | That's a damn shame Firefox. C'mon, HDR support feels like
         | table stakes at this point.
         | 
         | edit2: Apparently it's not table stakes.
         | 
         | > Browser support is halfway there. Google beat Apple to the
         | punch with their own version of Adaptive HDR they call Ultra
         | HDR, which Chrome 14 now supports. Safari has added HDR support
         | into its developer preview, then it disabled it, due to bugs
         | within iOS.
         | 
         | at which point I would just say to `lux.camera` authors - why
         | not put a big fat warning at the top for users with a Firefox
         | or Safari (stable) browser? With all the emphasis on supposedly
         | simplifying a difficult standard, the article has fallen for
         | one of its most famous pitfalls.
         | 
         | "It's not you. HDR confuses tons of people."
         | 
         | Yep, and you've made it even worse for a huge chunk of people.
         | :shrug: Great article n' all just saying.
        
         | cubefox wrote:
          | HDR support in Chrome (Android) still looks broken for me. For
         | one, some of the images on the blog have a posterization
         | effect, which is clearly wrong.
         | 
         | Second, the HDR effect seems to be implemented in a very crude
         | way, which causes the whole Android UI (including the Android
         | status bar at the top) to become brighter when HDR content is
         | on screen. That's clearly not right. Though, of course, this
         | might also be some issue of Android rather than Chrome, or
         | perhaps of the Qualcomm graphics driver for my Adreno GPU, etc.
        
           | davidmurdoch wrote:
            | Yeah, the HDR videos on my Asus Zenfone 9 (on Android 14)
            | look really terrible.
        
           | dpacmittal wrote:
           | Which Android phone are you using?
        
             | cubefox wrote:
             | This one:
             | https://gsmarena.com/motorola_edge_30_ultra-11206.php
        
         | gwbas1c wrote:
         | I wasn't using Firefox, but I had the page open on an old
         | monitor. I dragged the page to an HDR display and the images
         | pop.
        
       | sebstefan wrote:
        | For Halide's updated Image Lab demo about 2/3rds of the way
        | down the page
        | (https://www.lux.camera/content/media/2025/05/skyline-edit-
        | tr...): you made the demo so tall that desktop users can't see
        | both the sky & the controls at the same time.
        | 
        | A lot of these design flaws are fixed by Firefox's picture-in-
        | picture option, but for some reason, with the way you coded
        | it, the prompt to pop it out as PiP doesn't show up.
        
       | esperent wrote:
       | This page crashed Brave on Android three times before I gave up.
        
         | mcjiggerlog wrote:
         | For me this crashed in android webview, android chrome, and
         | android firefox. Impressive.
        
       | mxfh wrote:
        | Does anyone else find the hubris of the first paragraph's
        | writing as off-putting as I do?
       | 
       | "we finally explain what HDR actually means"
       | 
        | It then spends 2/3rds of the article on a _tone mapping_
        | expedition, only to not address the elephant in the room: the
        | almost complete absence of predictable color management in
        | consumer-grade digital environments.
       | 
        | UIs are hardly ever tested in HDR: I don't want my subtitles
        | to burn out my eyes on an actual HDR display.
       | 
        | It is here where you, the consumer, in a properly dark
        | environment for movie watching, are as vulnerable to light as
        | when raising the window curtains on a bright summer morning.
        | (That brightness abuse by content is actually discussed here.)
       | 
        | Dolby Vision and Apple have the lead here as closed platforms;
        | on the web it's simply not predictably possible yet.
        | 
        | My impression is that the best hope is the efforts of the
        | Color on the Web Community Group.
       | 
       | https://github.com/w3c/ColorWeb-CG
        
         | lordleft wrote:
         | Isn't that the point of the article? That the colloquial
         | meaning of HDR is quite overloaded, and when people complain
         | about HDR, they mean bad tone-mapping? I say this as someone as
         | close to totally ignorant about photography as you can get; I
         | personally thought the article was pretty spectacular.
        
           | mort96 wrote:
           | When I complain about HDR it's because I've intentionally set
           | the brightness of pure white to a comfortable level, and then
           | suddenly parts of my screen are brighter than that. You
           | fundamentally can't solve that problem with just better tone
           | mapping, can you?
        
             | Retr0id wrote:
             | You can for some definition of "solve", by tone-mapping all
             | HDR content back down into an SDR range for display.
        
               | mort96 wrote:
               | Well yeah. I considered adding that caveat but elected
               | not to because it's obvious and doesn't add anything to
               | the conversation, since that's obviously not what's meant
               | when the industry talks about "HDR". Should've remembered
               | this is HN.
        
           | PaulHoule wrote:
            | The bit about "confused" turns me off right away. The kind
            | of high-pressure stereo salesman who hopes I am the kind of
            | 'audiophile' (wants mercury-filled cables for a more
            | 'fluid' sound) that keeps me from calling myself an
            | 'audiophile' always presupposes the reader/listener is
            | "confused".
        
           | puzzlingcaptcha wrote:
            | But it's not the colloquial meaning; HDR is fairly well
            | defined by e.g. ITU-R BT.2100, which addresses colorimetry,
            | luminance and the corresponding transfer functions.
        
             | sandofsky wrote:
             | I don't think that's the colloquial meaning. If you asked
             | 100 people on the street to describe HDR, I doubt a single
             | person would bring up ITU-R BT.2100.
        
               | babypuncher wrote:
               | HDR has a number of different common meanings, which adds
               | to the confusion.
               | 
               | For example, in video games, "HDR" has been around since
               | the mid '00s, and refers to games that render a wider
               | dynamic range than displays were capable of, and use
                | post-process effects to simulate artifacts like bloom and
               | pupil dilation.
               | 
               | In photography, HDR has almost the opposite meaning of
               | what it does everywhere else. Long and multiple exposures
               | are combined to create an image that has very little
               | contrast, bringing out detail in a shot that would
               | normally be lost in shadows or to overexposure.
        
               | altairprime wrote:
               | Photography's meaning is also about a hundred years older
               | than video games; high(er) dynamic range was a concern in
               | film processing as far back as Ansel, if not prior. That
               | technology adopted it as a sales keyword is interesting
               | and it's worth keeping in mind when writing for an
               | audience -- but this post is about a photography app, not
               | television content or video games, so one can reasonably
               | expect photography's definition to be used, even if the
               | audience isn't necessarily familiar.
        
             | roywiggins wrote:
             | I think you may be operating with an idiosyncratic
             | definition of "colloquial"
        
             | redczar wrote:
             | Colloquial meaning and the well defined meaning are two
             | different things in most cases, right?
        
         | Diti wrote:
         | They also make no mention of transfer functions, which is the
         | main mechanism which explains why the images "burn your eyes" -
         | content creators should use HLG (which has relative luminance)
         | and not PQ (which has absolute luminance) when they create HDR
         | content for the web.
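          | 
          | The distinction is visible in the BT.2100 math: the PQ EOTF
          | maps a code value straight to absolute nits, while HLG's
          | OETF only relates code values to relative scene light,
          | which the display then scales to its own peak. A sketch:
          | 
          |     import math
          | 
          |     def pq_eotf(code):  # code in [0,1] -> absolute nits
          |         m1, m2 = 2610 / 16384, 2523 / 4096 * 128
          |         c1 = 3424 / 4096
          |         c2, c3 = 2413 / 4096 * 32, 2392 / 4096 * 32
          |         p = code ** (1 / m2)
          |         num = max(p - c1, 0)
          |         return 10000 * (num / (c2 - c3 * p)) ** (1 / m1)
          | 
          |     def hlg_oetf(light):  # scene light [0,1] -> code [0,1]
          |         a, b, c = 0.17883277, 0.28466892, 0.55991073
          |         if light <= 1 / 12:
          |             return math.sqrt(3 * light)
          |         return a * math.log(12 * light - b) + c
          | 
          |     # pq_eotf(1.0) == 10000 nits by definition; a code of
          |     # ~0.58 is ~200 nits. hlg_oetf has no nits in it at all.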
        
           | sandofsky wrote:
           | In theory PQ specifies absolute values, but in practice it's
           | treated as relative. Go load some PQ encoded content on an
           | iPhone, adjust your screen brightness, and watch the HDR
           | brightness also change. Beyond the iPhone, it would be
           | ridiculous to render absolute values as-is, given SDR white
            | is supposedly 100 nits; that would be unwatchable in most
           | living rooms.
           | 
           | Bad HDR boils down to poor taste and the failure of platforms
           | to rein it in. You can't fix bad HDR by switching encodings
           | any more than you can fix global warming by switching from
           | Fahrenheit to Celsius.
        
         | klausa wrote:
         | It's a blog for a (fancy) iPhone camera app.
         | 
         | Color management and handling HDR in UIs is probably a bit out
         | of scope.
        
         | sandofsky wrote:
          | > Does anyone else find the hubris of the first paragraph's
          | writing as off-putting as I do?
          | 
          | > "we finally explain what HDR actually means"
         | 
         | No. Because it's written for the many casual photographers
         | we've spoken with who are confused and asked for an explainer.
         | 
         | > Then spends 2/3rds of the article on a tone mapping
         | expedition, only to not address the elephant in the room, that
         | is the almost complete absence of predictable color management
         | in consumer-grade digital environments.
         | 
          | That's because this post is about HDR and not color
          | management, which is a different topic.
        
           | klausa wrote:
           | >No. Because it's written for the many casual photographers
           | we've spoken with who are confused and asked for an
           | explainer.
           | 
           | To be fair, it would be pretty weird if you found your own
           | post off-putting :P
        
             | mullingitover wrote:
             | Me, routinely, reading things I wrote a while ago: what is
             | this dreck
        
           | mxfh wrote:
           | Maybe my response was part of the broader HDR symptom--that
           | the acronym is overloaded with different meanings depending
           | on where you're coming from.
           | 
           | On the HN frontpage, people are likely thinking of one of at
           | least three things:
           | 
           | HDR as display tech (hardware)
           | 
           | HDR as wide gamut data format (content)
           | 
           | HDR as tone mapping (processing)
           | 
           | ...
           | 
            | So when the first paragraph says _we finally explain what
            | HDR actually means_, it set me off on the wrong foot--it
            | comes across pretty strongly for a term that's notoriously
            | context-dependent, especially in a blog post that reads
            | like a general explainer rather than a direct Q&A response
            | when not coming through your app's channels.
           | 
            | The follow-up, _The first HDR is the "HDR mode" introduced
            | to the iPhone camera in 2010_, is what caused me to write
            | the comment.
           | 
            | For people over 35 with even the faintest interest in
            | photography, the first exposure to the _HDR_ acronym
            | probably didn't arrive with the iPhone in 2010 -- _HDR_ IS
            | equivalent to _Photomatix_-style tone mapping starting in
            | 2005, as even mentioned later. The ambiguity of the term is
            | a given now. I think it's futile to insist on or police one
            | meaning over the other in non-scientific informal
            | communication; just use more specific terminology.
           | 
            | So the correlation of what _HDR_ means, or what sentiment
            | it evokes, with age group and self-assessed photography
            | skill might be something worthwhile to explore.
           | 
            | The post gets a lot better after that. That said, I really
            | did enjoy the depth--the dive into the classic dodge and
            | burn and the linked YouTube piece. One explainer at a time
            | makes sense--and tone mapping is a good place to start.
            | Even tone mapping is fine in moderation :)
        
             | ddingus wrote:
              | I took the post about the same way. Thought it excellent
              | because of the depth.
              | 
              | Often we don't get that, and given this topic, plus my
              | relative ignorance of it, I welcomed the post as written.
        
               | mxfh wrote:
                | Just out of curiosity, since your profile suggests
                | you're from an older cohort: do you actively remember
                | the Photomatix tone mapping era, or were you already old
                | enough to see it as a passing fad, or was it a more
                | niche thing than I remember?
               | 
               | Now I even remember the 2005 HDR HL2 Lost Coast Demo was
               | a thing 20 years ago: https://bit-
               | tech.net/previews/gaming/pc/hl2_hdr_overview/1/
        
               | ddingus wrote:
                | I was old enough to see it as the passing fad it was. A
                | niche, style-points-first kind of thing for sure.
                | 
                | Meta: old enough that getting either a new unintended
                | color, or an additional one visible on screen, while
                | having the machine remain able to perform, was a big
                | deal.
        
               | mxfh wrote:
                | I missed the MDA/EGA/CGA/Hercules era and jumped right
                | into glorious VGA. Only the startup options of some DOS
                | games informed you of that drama in the mid-'90s; I had
                | no idea what it meant otherwise.
        
             | mrandish wrote:
             | > "The first HDR is the "HDR mode" introduced to the iPhone
             | camera in 2010."
             | 
             | Yeah, I had a full halt and process exception on that line
             | too. I guess all the research, technical papers and
             | standards development work done by SMPTE, Kodak, et al in
             | the 1990s and early 2000s just didn't happen? Turns out
             | Apple invented it all in 2010 (pack up those Oscars and
              | Emmys awarded for technical achievement and send 'em back
             | boys!)
        
           | mrandish wrote:
           | > That's because this post is about HDR
           | 
           | It's about HDR from the perspective of still photography, in
           | your app, on iOS, in the context of hand-held mobile devices.
           | The post's title ("What Is HDR, Anyway?"), content level and
           | focus would be appropriate in the context of your company's
           | social media feeds for users of your app - which is probably
           | the audience and context it was written for. However in the
           | much broader context of HN, a highly technical community
           | whose interests in imaging are diverse, the article's content
           | level and narrow focus aren't consistent with the headline
           | title. It seems written at a level appropriate for novice
           | users.
           | 
           | If this post was titled "How does Halide handle HDR, anyway?"
           | or even "How should iOS photo apps handle HDR, anyway?" I'd
           | have no objection about the title's promise not matching the
            | content _for the HN audience_. When I saw the post's
            | headline I thought "_Cool!_ We really need a good technical
            | deep dive into the mess that is HDR - including tech, specs,
            | standards, formats, content acquisition, distribution and
            | display across content types including stills, video clips
            | and cinematic story-telling and diverse viewing contexts
            | from phones to TVs to cinemas to VR." When I started reading and
           | the article only used photos to illustrate concepts best
           | conveyed with color gradient graphs PLUS photos, I started to
           | feel duped by the title.
           | 
           | (Note: I don't use iOS or your app but the photo comparison
           | of the elderly man near the end of the article confused me.
           | From my perspective (video, cinematography and color
           | grading), the "before" photo looks like a raw capture with
           | flat LUT (or no LUT) applied. Yet the text seemed to imply
           | Halide's feature was 'fixing' some problem with the image.
           | Perhaps I'm misunderstanding since I don't know the tool(s)
           | or workflow but I don't see anything wrong with the original
           | image. It's what you want in a flat capture for later
           | grading.)
        
             | haswell wrote:
              | > _However in the much broader context of HN, a highly
              | technical community whose interests in imaging are diverse,
              | the article's content level and narrow focus aren't
              | consistent with the headline title. It seems written at a
              | level appropriate for novice users._
             | 
             | That is hardly the fault of the authors though. The article
             | seems entirely appropriate for its intended audience, and
             | they can't control who posts it on a site like HN.
        
         | willquack wrote:
         | > That brightness abuse by content
         | 
         | I predict HDR content on the web will eventually be disabled or
         | mitigated on popular browsers similarly to how auto-playing
         | audio content is no longer allowed [1]
         | 
          | Spammers and advertisers haven't yet caught on to how
          | abusively attention-grabbing eye-searingly bright HDR content
          | can be, but any day now they will, and it'll be everywhere.
         | 
         | 1. https://hacks.mozilla.org/2019/02/firefox-66-to-block-
         | automa...
        
           | babypuncher wrote:
           | This seems like a fairly easy problem to solve from a UX
           | standpoint, even moreso than auto-playing audio/video.
           | Present all pages in SDR by default, let the user click a
           | button on the toolbar or url bar to toggle HDR rendering when
           | HDR content is detected.
        
           | FireBeyond wrote:
            | They haven't, but influencers certainly have; I regularly
            | get still images which are rendered as a video to get the
            | HDR brightness boost in Instagram, etc.
        
         | srameshc wrote:
          | I, on the other hand, never thought or cared much about HDR
          | before, though I remember seeing it everywhere. But I feel
          | the article explains things well and clearly with examples,
          | for someone like me who isn't very camera-literate.
        
         | altairprime wrote:
         | It seems fine to me. Made sense on the first read and matches
         | my experiences with OpenEXR and ProPhoto RGB and pre-Apple
         | monitors.
         | 
          | High dynamic range has always been about tone mapping.
         | Post-sRGB color profile support is called "Wide color" these
         | days, has been available for twenty years or more on all DSLR
         | cameras (such as Nikon ProPhoto RGB supported in-camera on my
         | old D70), and has nothing to do with the dynamic range and tone
         | mapping of the photo. It's convenient that we don't have to use
         | EXR files anymore, though!
         | 
          | An HDR photo in sRGB will have the same defects beyond peak
          | saturation at any given hue point as an SDR photo in sRGB
         | would, relative to either in DCI-P3 or ProPhoto. Even a two-bit
         | black-or-white "what's color? on or off pixels only" HyperCard
         | dithered image file can still be HDR or SDR. In OKLCH, the
         | selected luminosity will _also_ impact the available chroma
         | range; at some point you start spending your new post-sRGB peak
         | chroma on luminosity instead; but the exact characteristic of
         | that tradeoff at any given hue point is defined by the color
         | profile algorithm, not by whether the photo is SDR or HDR, and
         | the highest peak saturation possible for each hue is fixed,
         | whatever luminosity it happens to be at.
        
         | thfuran wrote:
         | It may not be the best explanation, but I think any explanation
         | of HDR beyond a sentence or two of definition that doesn't
         | mention the mess that is tone mapping is entirely remiss.
        
         | frollogaston wrote:
         | It's nonsense that an image/video gets to override my screen
         | brightness, end of story. I want that removed, not even a
         | setting, just gone.
         | 
         | The photo capture HDR is good. That's a totally different thing
         | and shouldn't have had its name stolen.
        
       | sergioisidoro wrote:
       | Just one other thing. In Analog you also have compensating
       | developers, which will exhaust faster in darker areas (or lighter
       | if you think in negative), and allow for lighter areas more time
       | to develop and show, and hence some more control of the range.
       | Same but to less degree with stand development which uses very
       | low dilutions of the developer, and no agitation. So dodging and
       | burning is not the only way to achieve higher dynamic range in
       | analog photos.
       | 
       | About HDR on phones, I think they are the blight of photography.
       | No more shadows and highlights. I find they are good at capturing
       | family moments, but not as a creative tool.
        
         | CarVac wrote:
         | I wrote a raw processing app Filmulator that simulates stand
         | development/compensating developers to give such an effect to
         | digital photography.
         | 
         | I still use it myself but I need to redo the build system and
         | release it with an updated LibRaw... not looking forward to
         | that.
        
         | kjkjadksj wrote:
         | For analog photos the negative also has more dynamic range than
         | your screen or photopaper without any of that. Contrast is
         | applied in the darkroom by choice of photopaper and enlarger
         | timing and light levels, or after scanning with contrast
         | adjustment applied in post processing. It is really a storage
         | medium of information more than how the final image ought to
         | look.
         | 
          | Slide film has probably a third the dynamic range of negative
          | film and is meant as final output, fit for projection for
          | display.
        
         | CharlesW wrote:
         | > _About HDR on phones, I think they are the blight of
         | photography. No more shadows and highlights._
         | 
         | HDR is what enables you to capture both the darkest shadow
         | detail and the brightest highlight detail.
         | 
          | With SDR, one or both are often simply lost. It might come
         | down to preference -- if you're an "auto" shooter and like the
         | effect of the visual information at the edges of the available
         | dynamic range being truncated, SDR is for you.
         | 
         | Some people prefer to capture that detail and have the ability
         | to decide whether and how to diminish or remove it, with
         | commensurately more control over the artistic impact. For those
         | folks, HDR is highly desirable.
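          | 
          | In capture terms that usually means merging bracketed
          | exposures into one radiance estimate before any look is
          | applied. A bare-bones sketch that ignores alignment, noise
          | modeling, and camera response curves:
          | 
          |     import numpy as np
          | 
          |     def merge_exposures(images, times):
          |         # images: linear-light float arrays in [0,1] of the
          |         # same scene at different shutter times; returns a
          |         # relative radiance map with extended range
          |         num = np.zeros_like(images[0], dtype=np.float64)
          |         den = np.zeros_like(num)
          |         for img, t in zip(images, times):
          |             w = 1.0 - np.abs(2.0 * img - 1.0)  # mid-tones
          |             num += w * img / t
          |             den += w
          |         return num / np.maximum(den, 1e-6)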
        
       | dsego wrote:
       | Isn't the result of their tone mapping algo similar to adjusting
       | shadow and highlight sliders in other software?
        
         | sandofsky wrote:
         | No. When you simply adjust shadow and highlights, you lose
         | local contrast. In an early draft of the post, there was an
         | example, but it was cut for pacing.
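          | 
          | The difference can be sketched: a global move applies one
          | curve to every pixel, while a local operator compresses only
          | a blurred base layer and adds the fine detail back, so
          | neighboring pixels keep their ratios. Assuming scipy for the
          | blur:
          | 
          |     import numpy as np
          |     from scipy.ndimage import gaussian_filter
          | 
          |     def global_lift(lum, gamma=0.6):
          |         # one curve everywhere: flattens local texture too
          |         return lum ** gamma
          | 
          |     def local_tonemap(lum, gamma=0.6, sigma=25):
          |         log_l = np.log(lum + 1e-6)
          |         base = gaussian_filter(log_l, sigma)  # coarse layer
          |         detail = log_l - base  # local contrast, preserved
          |         return np.exp(gamma * base + detail)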
        
       | fogleman wrote:
       | So, if cameras have poor dynamic range, how are they getting away
       | with a single exposure? They didn't explain that at all...
        
         | sandofsky wrote:
         | Human vision has around 20 stops of static dynamic range.
         | Modern digital cameras can't match human vision-- a $90,000
         | Arri Alexa boasts 17 stops-- but they're way better than SDR
         | screens.
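          | 
          | Since each stop is a doubling, the ratios compound quickly
          | (the stop counts here are the commonly cited
          | approximations):
          | 
          |     def contrast_ratio(stops):
          |         return 2 ** stops
          | 
          |     contrast_ratio(20)  # static human vision: ~1,048,576:1
          |     contrast_ratio(17)  # Alexa-class camera:    131,072:1
          |     contrast_ratio(10)  # typical SDR screen:       1,024:1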
        
       | cainxinth wrote:
       | I chuckled at "The Ed Hardy t-shirt of photography" for the
       | early, overdone "HDR-mode" images.
        
       | perching_aix wrote:
       | I'm not entirely convinced that _greedy influencers_ are to blame
       | for people hating on overly bright content. Instead, I think
       | _something_ is different with how displays produce brightness
       | compared to just the nature outside. Light outside is supposed to
       | reach up to tens of thousands of nits, yet even 1000 nits is
        | searing on a display. Is it that displays output polarized
        | light? Is it the spectral distribution of especially the
        | better displays being three really tight peaks? I cannot tell
        | you, but I'm suspecting something isn't quite right.
       | 
       | All this aside, HDR and high brightness are different things -
       | HDR is just a representational thing. You can go full send on
       | your SDR monitor as well, you'll just see more banding. The
       | majority of the article is just content marketing about how they
       | perform automatic tonemapping anyways.
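        | 
        | The banding point is easy to quantify: spread the same 8-bit
        | code values over a wider luminance range and each step gets
        | bigger; 10 bits shrinks the steps again. A linear-ramp
        | simplification (real transfer functions spread the steps
        | non-uniformly):
        | 
        |     def step_nits(peak_nits, bits):
        |         # average jump between adjacent codes, linear ramp
        |         return peak_nits / (2 ** bits - 1)
        | 
        |     step_nits(100, 8)    # SDR at 8 bits:  ~0.39 nits/step
        |     step_nits(1000, 8)   # HDR at 8 bits:  ~3.92 nits/step
        |     step_nits(1000, 10)  # HDR at 10 bits: ~0.98 nits/step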
        
         | ziml77 wrote:
         | It's all down to the ambient light. That's why bias lighting is
         | now a thing. Try putting a light behind your screen to
         | massively brighten the wall behind it, the 1000 nit peaks will
         | be far less harsh. And if you bring the device out into
         | sunlight I suspect you will wish for everything about its
         | output to be quite a bit brighter.
        
         | layer8 wrote:
         | > Light outside is supposed to reach up to tens of thousands of
         | nits, yet even 1000 nits is searing on a display.
         | 
         | That's a consequence of
         | https://en.wikipedia.org/wiki/Adaptation_(eye). If you look at
         | 1000 nits on a display in bright sunlight, with your eyes
         | adapted to the bright surroundings, the display would look
         | rather dim.
        
       | therealmarv wrote:
        | ... something Linux desktops don't understand, and Macs only
        | do well with video on their own displays. Guess who the winner
        | is on the desktop: Windows oO
        
         | dismalaf wrote:
          | Linux does HDR. It's *technically* worked for years, but none
          | of the big DEs had it working. Now Ubuntu 25.04 and Fedora 42
          | (GNOME) have it working out of the box.
        
           | therealmarv wrote:
           | ha, interesting. thanks!
        
         | hombre_fatal wrote:
         | HDR on my MacBook means play the youtube video at full
         | brightness no matter how dim I've made the screen.
         | 
         | It's infuriating.
         | 
         | e.g. Open this in macOS Chrome:
         | https://www.youtube.com/watch?v=Gq7H6PI4JF8
        
       | bookofjoe wrote:
       | As a non-techie I represent the 99.9% of the population who
       | haven't a clue what tone mapping etc. is: NO WAY would we ever
       | touch the various settings possible as opposed to watching the
       | TV/computer screen/etc. as it came out of the box.
        
       | c-fe wrote:
        | I am still skeptical about HDR, as pretty much all HDR content
        | I see online is awful. But this post makes me believe that
        | Lux/Halide can pull off HDR in a way that I will like. I am
        | looking forward to Halide Mk3.
        
       | alistairSH wrote:
       | "The Ed Hardy T-Shirt of Photography"
       | 
       | Literal snort.
        
       | randall wrote:
       | This was super awesome. Thanks for this! Especially the HDR photo
       | reveal felt really awesome.
        
       | aidenn0 wrote:
       | > A big problem is that it costs the TV, Film, and Photography
       | industries billions of dollars (and a bajillion hours of work) to
       | upgrade their infrastructure. For context, it took well over a
       | decade for HDTV to reach critical mass.
       | 
       | This is also true for consumers. I don't own a single 4k or HDR
       | display. I probably won't own an HDR display until my TV dies,
       | and I probably won't own a 4k display until I replace my work
       | screen, at which point I'll also replace one of my home screens
       | so I can remote into it without scaling.
        
         | dmitshur wrote:
         | To demonstrate some contrast (heh) with another data point from
         | someone closer to the other extreme, I've owned a very HDR-
         | capable monitor (the Apple Pro Display XDR) since 2020, so
         | that's 5 years now. Content that takes full advantage of it is
         | still rare, but it's getting better slowly over time.
        
           | colechristensen wrote:
           | I have a screen which is "HDR" but what that means is when
           | you turn the feature on it just makes everything more muted,
           | it doesn't actually have any more dynamic range. When you
           | turn HDR on for a game it basically just makes most things
           | more muddy grey.
           | 
           | I also have a screen which has a huge gamut and blows out
           | colors in a really nice way (a bit like the aftereffects of
           | hallucinogens, it has colors other screens just don't) and
           | you don't have to touch any settings.
           | 
           | My OLED TV has HDR and it actually seems like HDR content
           | makes a difference while regular content is still "correct".
        
             | zamadatix wrote:
              | The cheap displays adding broken HDR400 support destroyed
              | so much public opinion on HDR. Not actually providing a
              | wider range but still accepting the HDR signal would at
              | least have been a minor improvement, if the tone mapping
              | weren't so completely broken that most people just
              | associate HDR with a washed-out picture.
        
               | colechristensen wrote:
               | >The cheap displays adding broken HDR400 support
               | destroyed so much public opinion on HDR.
               | 
                | It's funny because the display I have that does this was
                | a relatively premium Odyssey G7, which at $700 isn't at
                | all a cheap monitor. (I like it, it's just not at all
                | HDR, or at least not perceptibly so compared to Apple
                | devices and an OLED TV.)
        
               | zamadatix wrote:
                | The only HDR monitors I have actually enjoyed (which,
                | admittedly, is a slightly higher bar than "most at this
                | price aren't completely miscalibrated") have been
                | $1500+, which is a bit ridiculous considering far
                | cheaper TVs do far better with HDR (though they tend to
                | suck as monitors). In the Odyssey line this would be the
                | Neo G9 options, but I've never gone home with one
                | because they lack flat variants of the ones with good
                | HDR.
        
         | reaperducer wrote:
         | _This is also true for consumers. I don 't own a single 4k or
         | HDR display. I probably won't own an HDR display until my TV
         | dies, and I probably won't own a 4k display until I replace my
         | work screen, at which point I'll also replace one of my home
         | screens so I can remote into it without scaling._
         | 
         | People in the HN echo chamber over-estimate hardware adoption
         | rates. For example, there are millions of people who went
         | straight from CDs to streaming, without hitting the iPod era.
         | 
         | A few years ago on HN, there was someone who couldn't wrap
         | their brain around the notion that even though VCRs were
         | invented in the early 1960's that in 1980, not everyone owned
         | one, or if they did, they only had one for the whole family.
         | 
         | Normal people aren't magpies who trash their kit every time
         | something shiny comes along.
        
           | colechristensen wrote:
           | >there are millions of people who went straight from CDs to
           | streaming, without hitting the iPod era
           | 
           | Who?
           | 
           | There was about a decade there where everyone who had the
           | slightest interest in music had an mp3 player of some kind,
           | at least in the 15-30 age bracket.
        
             | aidenn0 wrote:
             | I don't know if I count, but I never owned a dedicated MP3
             | player[1], I listened to MP3s on my computer, but used CDs
             | and cassettes while on the move, until I got an android
             | phone that had enough storage to put my music collection
             | on.
             | 
             | 1: Well my car would play MP3s burned to CDs in its CD
             | player; not sure if that counts.
        
             | 98codes wrote:
             | My father, for one. He was entirely happy with radio in the
             | car and CDs at home.
        
             | Sohcahtoa82 wrote:
             | I imagine this depends a LOT on your specific age and what
             | you were doing in the 00's when MP3 player usage peaked.
             | 
             | I finished high school in 2001 and didn't immediately go to
             | college, so I just didn't have a need for a personal music
             | player anymore. I was nearly always at home or at work, and
             | I drove a car that actually had an MP3 CD player. I felt no
             | need to get an iPod.
             | 
             | In 2009, I started going to college, but then also got my
             | first smartphone, the Motorola Droid, which acted as my
             | portable MP3 player for when I was studying in the library
             | or taking mass transit.
             | 
             | If you were going to school or taking mass transit in the
             | middle of the '00s, then you were probably more likely to
             | have a dedicated MP3 player.
        
             | BlueTemplar wrote:
             | I skipped 2 generations for portable music : went straight
             | from cassette to smartphone with MP3 (and radio).
        
           | babypuncher wrote:
           | > A few years ago on HN, there was someone who couldn't wrap
           | their brain around the notion that even though VCRs were
           | invented in the early 1960's that in 1980, not everyone owned
           | one, or if they did, they only had one for the whole family.
           | 
           | Point of clarification: While the technology behind the VCR
           | was invented in the '50s and matured in the '60s, consumer-
           | grade video tape systems weren't really a thing until Betamax
           | and VHS arrived in 1975 and 1976 respectively.
           | 
           | Early VCRs were also incredibly expensive, with prices
           | ranging from $3,500 to almost $10,000 after adjusting for
           | inflation. Just buying into the VHS ecosystem at the entry
           | level was a similar investment to buying an Apple Vision Pro
           | today.
        
             | reaperducer wrote:
             | Exactly my point. But people on HN, especially the person I
             | referenced, don't understand that we didn't just throw
             | stuff away and go into debt to buy the latest gadgets
             | because we were told to.
        
         | gwbas1c wrote:
         | > I don't own a single 4k or HDR display
         | 
         | Don't feel like you have to. I bought a giant fancy TV with it,
         | and even though it's impressive, it's kinda like ultra-hifi-
         | audio. I don't miss it when I watch the same show on one of my
         | older TVs.
         | 
          | If you ever do get it, I suggest going for a TV that you
          | watch with your full attention, and watching TV / movies in
          | the dark. It's not very useful on a TV that you might turn on
          | while doing housework, but very useful when you are actively
          | watching TV with your full attention.
        
           | alerighi wrote:
            | I don't see the point of having a 4K TV vs a 1080p TV
            | either. To me it's just marketing; I have both a 4K and a
            | 1080p at my house, and from a normal viewing distance (3-4
            | meters) you don't see differences.
            | 
            | Also, in my country (Italy) TV transmissions are 1080i at
            | best, and a lot are still 576i (PAL resolution). Streaming
            | media can be 4K (if you have enough bandwidth to stream at
            | that resolution, which I don't have at my house). Sure, if
            | you download pirated movies you find them in 4K, and if
            | you have the bandwidth to afford it... sure.
            | 
            | But even there, sometimes a well-done 1080p movie is better
            | than a hyper-compressed 4K one, since you see compression
            | artifacts.
            | 
            | To me 1080p, and maybe even 720p, is enough for TV viewing.
            | Well, sometimes I miss CRT TVs: they were low resolution
            | but had, for example, much better picture quality than most
            | modern 4K LCD TVs, where black scenes are grey (I know
            | there is OLED, but it's too expensive and has other
            | issues).
        
             | zamadatix wrote:
             | For TVs under ~80" I feel like you'd have to be sitting
             | abnormally close to your TV for it to matter much. At the
             | same time I think the cost difference between producing
             | 1080p and 4k panels is so low it probably doesn't matter.
             | Like you say, things like the backlight technology (or lack
             | thereof) make a much bigger difference in perceived quality
             | but that's also where the actual cost comes in.
        
             | kjkjadksj wrote:
              | I feel the same way. To be honest, even the laptop retina
              | screen is excess. I sometimes go back to a 2012 non-retina
              | MacBook Pro, and at normal laptop viewing angles you
              | can't really discern pixels. The biggest difference is
              | display scaling, but I have my retina display scaled to
              | what the old display would be anyhow, because otherwise
              | it's too small.
              | 
              | Kind of crazy that no one thought of this aspect and we
              | just march on to higher resolutions and the hardware
              | required for that.
        
             | some-guy wrote:
             | I agree about 4k vs non-4k. I will say going OLED was a
             | huge upgrade, even for SDR content. HDR content is hit-or-
             | miss...I find some of it is tastefully done but in many
             | cases is overdone.
             | 
             | My own movie collection is mostly 2-4GB SDR 1080p files and
             | looks wonderful.
        
             | gwbas1c wrote:
             | You still watch broadcast TV?
             | 
             | Jokes aside, when a 4k TV has a good upscaler, it's hard to
             | tell the difference between 1080 and 4k. Not impossible; I
             | certainly can, but 1080 isn't _distracting_.
        
           | anon7000 wrote:
           | I totally love HDR on my OLED TV, and definitely miss it on
           | others.
           | 
           | Like a lot of things, it's weird how some people are more
           | sensitive to visual changes. For example:
           | 
           | - At this point, I need 120hz displays. I can easily notice
           | when my wife's phone is in power saver mode at 60hz.
           | 
           | - 4k vs 1080p. This is certainly more subtle, but I
           | definitely miss detail in lower res content.
           | 
           | - High bitrate. This is way more important than 4k vs 1080p
           | or even HDR. But it's so easy to tell when YouTube lowers the
           | quality setting on me, or when a TV show is streaming at a
           | crappy bitrate.
           | 
           | - HDR is tricky, because it relies completely on the content
           | creator to do a good job producing HDR video. When done well,
           | the image basically sparkles, water looks actually wet, parts
           | of the image basically glow... it looks so good.
           | 
           | I 100% miss this HDR when watching equivalent content on
           | other displays. The problem is that a lot of content isn't
           | produced
           | to take advantage of this very well. The HDR 4k Blu-ray of
           | several Harry Potter movies, for example, has extremely muted
           | colors and dark scenes... so how is the image going to pop?
           | I'm glad we're seeing more movies rely on bright colors and
           | rich, contrasty color grading. There are so many old film
           | restorations that look excellent in HDR because the original
           | color grade had rich, detailed, contrasty colors.
           | 
           | On top of that, budget HDR implementations, ESPECIALLY in PC
           | monitors, just don't get very bright. Which means their HDR
           | is basically useless. It's impossible to replicate the
           | "shiny, wet look" of really good HDR water if the screen
           | can't get bright enough to make it look shiny. Plus, it needs
           | to be selective about what gets bright, and cheap TVs don't
           | have a lot of backlighting zones to make that happen very
           | well.
           | 
           | So whereas I can plug in a 4k 120hz monitor and immediately
           | see the benefit in everything I do for normal PC stuff, you
           | can't get that with HDR unless you have good source material
           | and a decent display.
        
             | gwbas1c wrote:
             | > At this point, I need 120hz displays. I can easily notice
             | when my wife's phone is in power saver mode at 60hz.
             | 
             | Yeah, the judder is a lot more noticeable on older TVs now
             | that I have a 120hz TV. IMO, CRTs handled this the best,
             | but I'm not going back.
        
         | babypuncher wrote:
         | Pretty much any display you can buy today will be HDR capable,
         | though that doesn't mean much.
         | 
         | I think the industry is strangling itself putting "DisplayHDR
         | 400" certification on edgelit/backlit LCD displays. In order
         | for HDR to look "good" you either need high resolution full
         | array local dimming backlighting (which still isn't perfect),
         | or a panel type that doesn't use any kind of backlighting like
         | OLED.
         | 
         | Viewing HDR content on these cheap LCDs often looks worse than
         | SDR content. You still get the wider color gamut, but the
         | contrast just isn't there. Local dimming often loses all detail
         | in shadows whenever there is something bright on the screen.
        
           | SchemaLoad wrote:
           | HDR marketing on monitors almost seems like a scam.
           | Monitors will claim HDR compatibility when what they
           | actually mean is that they will take the HDR data stream
           | and display it exactly the same as SDR content, because
           | they don't actually have the contrast and brightness
           | ability of a proper HDR monitor.
        
         | miladyincontrol wrote:
         | Few things are absolutes. Yes, most consumers won't have HDR
         | or 4K on every screen, but most consumers use a modern
         | smartphone, and just about every modern smartphone from the
         | past half decade or more has HDR of some level.
         | 
         | I absolutely loathe consuming content on a mobile screen, but
         | the reality is that the vast majority are using phones and
         | tablets most of the time.
        
           | mxfh wrote:
           | Funny enough, HDR content works absolutely perfectly as
           | long as it stays on a device that has both HDR recording
           | and HDR display tech, aka smartphones.
           | 
           | The problem starts when sending HDR content to SDR-only
           | devices, or even just to other HDR standards. Not even
           | talking about printing here.
           | 
           | This step can inherently only be automated so much, because
           | what information to keep or emphasize is also a stylistic
           | decision. This is an editorial process, not something you
           | want to burden casual users with. What works for some
           | images can't work for others. Even with AI, the preference
           | would still need to be aligned.
        
           | aidenn0 wrote:
           | How would I know if my android phone has an HDR screen?
           | 
           | [edit]
           | 
           | Some googling suggested I check in the Netflix app; at least
           | Netflix thinks my phone does not support HDR. (Unihertz Jelly
           | Max)
        
         | EasyMark wrote:
         | I have to think you are the 1-3% outlier though. Everyone I
         | know has an HDR screen, even my friend who never buys
         | anything new - but he did run out and buy an HDR TV to
         | replace his old one, which he gave to his son.
        
           | edelhans wrote:
           | I honestly do not know if I have any screen that supports
           | HDR. At least I've never noticed any improved image quality
           | when viewing HDR video content and comparing the image on
           | my M3 MacBook Pro screen vs. an old external IPS monitor.
           | Maybe my eyes are just broken?
        
         | JKCalhoun wrote:
         | Only my laptop supports HDR. But that's one that I own anyway.
        
       | tomatotomato37 wrote:
       | If anyone was hoping for a more technical explanation, I find
       | these pages do a good job explaining the inner workings behind
       | the format
       | 
       | https://docs.krita.org/en/general_concepts/colors/bit_depth....
       | 
       | https://docs.krita.org/en/general_concepts/colors/color_spac...
       | 
       | https://docs.krita.org/en/general_concepts/colors/scene_line...
        
       | asafira wrote:
       | I did my PhD in Atomic, Molecular, and Optical (AMO) physics, and
       | despite "optical" being part of that I realized midway that I
       | didn't know enough about how regular cameras worked!
       | 
       | It didn't take very long to learn, and it turned out to be
       | extremely important in the work I did during the early days at
       | Waymo and later at Motional.
       | 
       | I wanted to pass along this fun video from several years ago that
       | discusses HDR: https://www.youtube.com/watch?v=bkQJdaGGVM8 . It's
       | short and fun, I recommend it to all HN readers.
       | 
       | Separately, if you want a more serious introduction to digital
       | photography, I recommend the lectures by Marc Levoy from his
       | Stanford course:
       | https://www.youtube.com/watch?v=y7HrM-fk_Rc&list=PL8ungNrvUY...
       | I believe he runs his own group at
       | Adobe now after leading a successful effort at Google making
       | their Pixel cameras the best in the industry for a couple of
       | years. (And then everyone more-or-less caught up, just like with
       | most tech improvements in the history of smartphones).
        
         | brcmthrowaway wrote:
         | Pixel camera hardware or software? Isn't there only one
         | vendor for sensors - Sony?
        
           | dawidpotocki wrote:
           | Samsung also makes sensors for phones. IIRC some Pixels use
           | their sensors.
        
             | xattt wrote:
             | Is Sony the ASML of the sensor world?
        
               | brcmthrowaway wrote:
               | Definitely
        
             | hypercube33 wrote:
             | I think Canon makes at least some of their sensors.
             | Nikon designs theirs and has them made by a third party
             | (I forget the name, but it isn't Sony or Samsung),
             | though they still use Sony stuff in a lot of their
             | cameras.
             | 
             | I don't know about Pentax, Panasonic, or OMD (formerly
             | Olympus).
        
               | lambdasquirrel wrote:
               | I think folks here have some idea how expensive chip fabs
               | are. That's why only Canon is able to make their own
               | sensors.
               | 
               | Sony makes sensors for pretty much everyone else. But
               | it's well known that other folks e.g. Nikon have been
               | able to get better signal-to-noise with Sony-made sensors
               | than Sony themselves. I think Panasonic used to make
               | their own sensors but with some recent re-org, that got
               | spun out.
               | 
               | It's been widely rumored that Leica uses Sony sensors,
               | but this gets repeatedly denied by people claiming inside
               | information. We know that Leica was getting 24MP CMOS
               | sensors from CMOSIS in the 2012 timeframe, but CMOSIS
               | has since been acquired by ams (now ams-OSRAM), and
               | there hasn't been any
               | verifiable information since then, whether confirming or
               | denying a continued business relationship.
        
           | asafira wrote:
           | He worked mostly on the software side, but of course had
           | important input into what sensors and processors were chosen
           | for the phones.
        
         | spookie wrote:
         | Try capturing fire with a non-Sony phone and a Sony phone.
         | At least Samsung doesn't color-correct blackbodies properly,
         | and the flame looks nothing like reality.
        
         | bigtones wrote:
         | That Kevin Chen video using Excel was amazing and hilarious.
         | Thanks.
        
       | hatsunearu wrote:
       | Is there any workflow that can output HDR photos (like the real
       | HDR kind, with metadata to tell the display to go into HDR mode)
       | for photos shot with a mirrorless and not an iPhone?
        
         | Uncorrelated wrote:
         | Yes. For example, Lightroom and Camera Raw support HDR editing
         | and export from RAW images, and Adobe published a good rundown
         | on the feature when they introduced it.
         | 
         | https://blog.adobe.com/en/publish/2023/10/10/hdr-explained
         | 
         | Greg Benz Photography maintains a list of software here:
         | 
         | https://gregbenzphotography.com/hdr-display-photo-software/
         | 
         | I'm not sure what FOSS options there are; it's difficult to
         | search for given that "HDR" can mean three or four different
         | things in common usage.
        
       | gwbas1c wrote:
       | > AI cannot read your mind, so it cannot honor your intent.
       | 
       | This. I can always tell when someone "gets" software development
       | when they either understand (or don't) that computers can't read
       | minds or infer intent like a person can.
        
       | NelsonMinar wrote:
       | Is there a consensus definition of what counts as "HDR" in a
       | display? What is the "standard dynamic range" of a typical TV or
       | computer monitor? Is it roughly the same for devices of the same
       | age?
       | 
       | My understanding is most SDR TVs and computer screens have
       | displays of about 200-300 nits (aka cd/m2). Is that the correct
       | measure of the range of the display? The brightest white is 300
       | nits brighter than the darkest black?
        
         | lloeki wrote:
         | Yes, and not just static but dynamic properties:
         | https://displayhdr.org/performance-criteria/
        
         | adgjlsfhk1 wrote:
         | This isn't a correct definition. Human perception of
         | brightness is roughly logarithmic, so it also matters how
         | deep the blacks get. For a good HDR experience, you need a
         | monitor that gets to 600 nits at a bare minimum, but also one
         | which can get very close to 0 nits (e.g. via OLED or, less
         | optimally, local dimming).
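         | 
         | Back-of-the-envelope, that range is just log2 of the
         | contrast ratio. A minimal sketch in Python (the nit figures
         | are illustrative assumptions, not measurements):
         | 
         |   import math
         | 
         |   def dynamic_range_stops(peak_nits, black_nits):
         |       # each stop is a doubling of luminance
         |       return math.log2(peak_nits / black_nits)
         | 
         |   dynamic_range_stops(300, 0.3)    # SDR LCD:  ~10 stops
         |   dynamic_range_stops(600, 0.005)  # OLED HDR: ~17 stops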
        
       | caseyy wrote:
       | HDR is when you're watching a dark film at night, looking at
       | the subtle nuances between shades of dark and black in the
       | shadows on the screen, making out the faint contours the film
       | director carefully curated, and the subtitles gently deposit
       | 40W of light into your optic nerves.
        
         | shrx wrote:
         | I actually kind of hate the fact that the subtitles on my (SDR)
         | OLED display never get as bright as some parts of the video
         | itself.
        
       | dahart wrote:
       | It seems like a mistake to lump HDR capture, HDR formats and HDR
       | display together, these are very different things. The claim that
       | Ansel Adams used HDR is super likely to cause confusion, and
       | isn't particularly accurate.
       | 
       | We've had HDR formats and HDR capture and edit workflows since
       | long before HDR displays. The big benefit of HDR capture &
       | formats is that your "negative" doesn't clip super bright colors
       | and doesn't lose color resolution in super dark colors. As a
       | photographer, with HDR you can re-expose the image when you
       | display/print it, where previously that wasn't possible.
       | Previously when you took a photo, if you over-exposed it or
       | under-exposed it, you were stuck with what you got. Capturing HDR
       | gives the photographer one degree of extra freedom, allowing them
       | to adjust exposure after the fact. Ansel Adams wasn't using HDR
       | in the same sense we're talking about, he was just really good at
       | capturing the right exposure for his medium without needing to
       | adjust it later. There is a very valid argument to be made for
       | doing the work up-front to capture what you're after, but
       | ignoring that for a moment, it is simply not possible to re-
       | expose Adams' negatives to reveal color detail he didn't capture.
       | That's why he's not using HDR, and why saying he is will only
       | further muddy the water.
        
         | QuantumGood wrote:
         | Adams adjusted heavily with dodging and burning, even working
         | to invent a new chemical process to provide more control when
         | developing. He was great at determining exposure for his
         | process as well. A key skill was having a vision for what the
         | image would be after adjusting. Adams talked a lot about this
         | as a top priority of his process.
        
           | Demiurge wrote:
           | > It's even more incredible that this was done on paper,
           | which has even less dynamic range than computer screens!
           | 
           | I came here to point this out. You have a pretty high dynamic
           | range in the captured medium, and then you can use the tools
           | you have to darken or lighten portions of the photograph when
           | transferring it to paper.
        
             | jrapdx3 wrote:
             | Indeed so. Printing on paper and other substrates is
             | inherently subtractive in nature which limits the gamut of
             | colors and values that can be reproduced. Digital methods
             | make the job of translating additive to subtractive media
             | easier vs. the analog techniques available to film
             | photographers. In any case, the image quality classic
             | photography was able to achieve is truly remarkable.
             | 
             | Notably, the dodging and burning used by photographers
             | aren't obsolete. There's a reason these tools are included
             | in virtually every image-editing program out there.
             | Manipulating dynamic range, particularly in printed images,
             | remains part of the craft of image-making.
        
           | staticautomatic wrote:
           | Contact printing on azo certainly helped!
        
         | albumen wrote:
         | But the article even shows Adams dodging/burning a print,
         | which is 'adjusting the exposure' of the film's high dynamic
         | range in a localised fashion, effectively revealing detail in
         | the LDR of the resulting print that otherwise wouldn't have
         | been visible.
        
         | arghwhat wrote:
         | Arguably, even considering HDR a distinct thing is itself
         | weird and inaccurate.
         | 
         | All mediums have a range, and they've never all matched.
         | Sometimes we've tried to calibrate things to match, but
         | anyone watching SDR content over the past many years probably
         | didn't do so on a color-calibrated and brightness-calibrated
         | screen - that wouldn't allow you to have a brightness slider.
         | 
         | HDR on monitors is about communicating content brightness and
         | monitor capabilities, but then you have the question of whether
         | to clip the highlights or just map the range when the content
         | is mastered for 4000 nits but your monitor manages 1000-1500
         | and only in a small window.
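         | 
         | A toy sketch of those two choices (not any particular
         | standard's tone mapper; the 1000-nit peak and 750-nit knee
         | are arbitrary assumptions):
         | 
         |   def clip(nits, display_peak=1000.0):
         |       # hard clip: all detail above the panel's peak is lost
         |       return min(nits, display_peak)
         | 
         |   def rolloff(nits, display_peak=1000.0, knee=750.0):
         |       # pass mid-tones through, compress highlights smoothly
         |       if nits <= knee:
         |           return nits
         |       t = (nits - knee) / (display_peak - knee)
         |       return knee + (display_peak - knee) * t / (1.0 + t)
         | 
         |   clip(4000.0)     # 1000.0 -- a 4000-nit highlight clips
         |   rolloff(4000.0)  # ~982.1 -- kept distinct, just dimmer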
        
           | dahart wrote:
           | This! Yes I think you're absolutely right. The term "HDR" is
           | in part kind of an artifact of how digital image formats
           | evolved, and it kind of only makes sense relative to a time
           | when the most popular image formats and most common displays
           | were not very sophisticated about colors.
           | 
           | That said, there is one important part that is often lost.
           | One of the ideas behind HDR, sometimes, is to capture
           | absolute values in physical units, rather than relative
           | brightness. This is the distinguishing factor that film and
           | paper and TVs don't have. Some new displays are getting
           | absolute brightness features, but historically most media
           | display relative color values.
        
             | arghwhat wrote:
             | Absolute is also a funny thing. From the perspective of
             | human visual perception, an absolute brightness only
             | matters if the entire viewing environment is also
             | controlled to the same absolute values. Visual perception
             | is highly contextual, and we are not only seeing the
             | screen.
             | 
             | It's not fun being unable to watch dark scenes during the
             | day or evening in a living room, nor is having your
             | retinas vaporized if the ambient environment went dark in
             | the meantime. People want a good viewing experience in
             | the available environment that is logically similar to
             | what the content intended, but that is not always the
             | same as reproducing the exact same photons the director's
             | mastering monitor sent towards their eyeballs at the time
             | of production.
        
               | dahart wrote:
               | Yep, absolutely! ;)
               | 
               | This brings up a bunch of good points, and it tracks with
               | what I was trying to say about conflating HDR processing
               | with HDR display. But do keep in mind that even when you
               | have absolute value images, that doesn't imply anything
               | about how you display them. You can experience large
               | benefits with an HDR workflow, even when your output or
               | display is low dynamic range. Assume that there will be
               | some tone mapping process happening and that the way you
               | map tones depends on the display medium and its
               | capabilities, and on the context and environment of the
               | display. Using the term "HDR" shouldn't imply any
               | mismatch or disconnect in the viewing environment. It
               | only did so in the article because it wasn't very careful
               | about its terms and definitions.
        
               | tshaddox wrote:
               | Indeed. For a movie scene depicting the sky including the
               | Sun, you probably wouldn't want your TV to achieve the
               | same brightness as the Sun. You _might_ want your TV to
               | become significantly brighter than the rest of the
               | scenes, to achieve an effect _something_ like the Sun
               | catching your eye.
               | 
               | Of course, the same thing goes for audio in movies. You
               | probably want a gunshot or explosion to sound loud and
               | even be slightly shocking, but you probably don't want it
               | to be as loud as a real gunshot or explosion would be
               | from the depicted distance.
               | 
               | The difference is that for 3+ decades the dynamic range
               | of ubiquitous audio formats (like 16 bit PCM in audio CDs
               | and DVDs) has provided far more dynamic range than is
               | comfortably usable in normal listening environments. So
               | we're very familiar with audio being mastered with a much
               | smaller dynamic range than the medium supports.
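               | 
               | The arithmetic behind that: linear PCM yields roughly
               | 6 dB of dynamic range per bit. A quick sketch:
               | 
               |   import math
               | 
               |   def pcm_dynamic_range_db(bits):
               |       # ~6.02 dB per bit for linear PCM
               |       return 20 * math.log10(2 ** bits)
               | 
               |   pcm_dynamic_range_db(16)  # ~96.3 dB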
        
           | tshaddox wrote:
           | The term "HDR" arguably makes more sense for the effect
           | achieved by tone mapping multiple exposures of the same
           | subject onto a "normal" (e.g. SRGB) display. In this case,
           | the "high" in "HDR" just means "from a source with higher
           | dynamic range than the display."
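           | 
           | A minimal sketch of that pipeline, assuming an idealized
           | linear sensor response (real cameras need response-curve
           | recovery, a la Debevec & Malik):
           | 
           |   import numpy as np
           | 
           |   def merge_exposures(images, times):
           |       # images: stack of [0, 1] linear exposures
           |       acc = np.zeros_like(images[0], dtype=np.float64)
           |       wsum = np.zeros_like(acc)
           |       for img, t in zip(images, times):
           |           w = 1.0 - np.abs(2.0 * img - 1.0)  # favor mid-tones
           |           acc += w * (img / t)
           |           wsum += w
           |       return acc / np.maximum(wsum, 1e-6)  # relative radiance
           | 
           |   def reinhard(lum):
           |       # global Reinhard operator: [0, inf) -> [0, 1)
           |       return lum / (1.0 + lum)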
        
             | BlueTemplar wrote:
             | Remember "wide gamut" screens ?
             | 
             | This is part of 'HDR' standards too...
             | 
             | And it's quite annoying that 'HDR' (and which specific one
             | ?) is treated as just being 'on' or 'off' even for power
             | users...
        
           | theshackleford wrote:
           | > but your monitor manages 1000-1500 and only in a small
           | window.
           | 
           | Owning a display that can do 1300+ nits sustained across a
           | 100% window has been the biggest display upgrade I think I
           | have ever had. It's given me a tolerance for LCD, a
           | technology I've hated since the death of CRTs, and turned
           | me away from OLED.
           | 
           | There was a time I would have said I'd never own a non-OLED
           | display again. But a _capable_ HDR display changed that
           | logic in a big way.
           | 
           | Too bad the motion resolution on it, especially compared to
           | OLED, is meh. Again, at one point, motion was the most
           | important aspect to me (it's why I still own CRTs) but this
           | level of HDR... transformative, for lack of a better word.
        
             | lotrjohn wrote:
             | Hello fellow CRT owner. What is your use case? Retro video
             | games? PC games? Movies?
        
               | theshackleford wrote:
               | Hello indeed!
               | 
               | > What is your use case? Retro video games? PC games?
               | Movies?
               | 
               | All of the above! The majority of my interest largely
               | stems from the fact that for whatever reason, I am
               | INCREDIBLY sensitive to sample and hold motion blur.
               | Whilst I tolerate it for modern gaming because I largely
               | have no choice, CRT's mean I do not for my retro gaming,
               | which I very much enjoy. (I was very poor growing up, so
               | most of it for me is not even nostalgia, most of these
               | games are new to me.)
               | 
               | Outside of that, we have a "retro" corner in our home
               | with a 32" trinitron. I collect laserdisc/VHS and we have
               | "retro video" nights where for whatever reason, we watch
               | the worst possible quality copies of movies we could get
               | in significantly higher definition. Much the same as
               | videogames, I was not exposed to a lot of media growing
               | up, my wife has also not seen many things because she was
               | in Russia back then, so there is a ton for us to catch up
               | on very slowly and it just makes for a fun little date
               | night every now and again.
               | 
               | Sadly though, as I get ready to take on a mortgage, it's
               | likely most of my CRT's will be sold, or at least the
               | broadcast monitors. I do not look forward to it haha.
        
               | lotrjohn wrote:
               | > Outside of that, we have a "retro" corner in our home
               | with a 32" trinitron.
               | 
               | A 32" Trinny. Nice. I have the 32" JVC D-series which I
               | consider my crown jewel. It's for retro gaming and I have
               | a laserdisc player but a very limited selection of
               | movies. Analog baby.
               | 
               | > Sadly though, as I get ready to take on a mortgage,
               | it's likely most of my CRT's will be sold
               | 
               | Mortgage = space. You won't believe the nooks and
               | crannies you can fit CRTs into. Attic. Shed. Crawl space.
               | Space under basement stairs. Heck, even the neighbors
               | house. I have no less than 14 CRTs ferreted away in the
               | house. Wife thinks I have only 5. Get creative. Don't
               | worry about the elements, these puppies were built to
               | survive nuclear blasts. Do I have a sickness? Probably.
               | But analog!!!
        
               | hypercube33 wrote:
               | Speaking of LaserDisc, it's wild how vivid the colors
               | are on that platform. My main example movie is Star
               | Trek: First Contact, and everything is very colorful.
               | The DVD is muddy. Even a Blu-ray copy kinda looks like
               | crap. As a total side note, the surround sound for
               | that movie is absolutely awesome, especially the cube
               | battle scene.
        
               | theshackleford wrote:
               | > I have the 32" JVC D-series which
               | 
               | I would _love_ one of these, however I have never seen
               | one in my country. Super jealous haha! The tubes they
               | use apparently were American-made, with most of the
               | JVCs that were released in my country using different
               | tubes than those released in the US market.
               | 
               | That being said, I do own two JVC "broadcast" monitors
               | that I love, a 17" and a 19". They're no D-series real
               | "TV", though.
        
             | arghwhat wrote:
             | Motion resolution? Do you mean the pixel response time?
             | 
             | CRTs technically have quite a few artifacts in this
             | area, but as content displayed on CRTs tends to be built
             | for CRTs, this is less of an issue, and in many cases
             | even required. The input is expecting specific
             | distortions and effects from scanlines and phosphor,
             | which a "perfect" display wouldn't exhibit...
             | 
             | The aggressive OLED ABL is simply a thermal issue. It can
             | be mitigated with thermal design in smaller devices, and
             | anything that increases efficiency (be it micro lens
             | arrays, stacked "tandem" panels, quantum dots, alternative
             | emitter technology) will lower the thermal load and
             | increase the max full panel brightness.
             | 
             | (LCD with zone dimming would also be able to pull this
             | trick to get even brighter zones, but because the base
             | brightness is high enough it doesn't bother.)
        
               | theshackleford wrote:
               | > Motion resolution? Do you mean the pixel response time?
               | 
               | I indeed meant motion resolution, which pixel response
               | time only partially affects. It's about how clearly a
               | display shows motion, unlike static resolution, which
               | realistically only reflects a still image. Even with
               | fast pixels, sample-and-hold displays blur motion
               | unless framerate and refresh rate are high, or
               | BFI/strobing is used. This blur immediately lowers
               | perceived resolution the moment anything moves on
               | screen.
               | 
               | > The input is expecting specific distortions and effects
               | from scanlines and phosphor, which a "perfect" display
               | wouldn't exhibit...
               | 
               | That's true for many CRT purists, but is not a huge deal
               | for me personally. My focus is motion performance. If
               | LCD/OLED matched CRT motion at the same refresh rate, I'd
               | drop CRT in a heartbeat, slap on a CRT shader, and call
               | it a day. Heresy to many CRT enthusiasts.
               | 
               | Ironically, this is an area in which I feel we are
               | getting CLOSE enough with the new higher refresh OLEDs
               | for non HDR retro content in combination with:
               | https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks...
               | (which hopefully will continue to be improved.)
               | 
               | > The aggressive OLED ABL is simply a thermal issue.
               | 
               | Theoretically, yes and there's been progress, but it's
               | still unsolved in practice. If someone shipped an OLED
               | twice as thick and full of fans and heatsinks, I'd buy it
               | tomorrow. But that's not what the market wants, so
               | obviously it's not what they make.
               | 
               | > It can be mitigated with thermal design in smaller
               | devices, and anything that increases efficiency (be it
               | micro lens arrays, stacked "tandem" panels, quantum dots,
               | alternative emitter technology) will lower the thermal
               | load and increase the max full panel brightness.
               | 
               | Sure, in theory. But so far the improvements (like QD-
               | OLED or MLA) haven't gone far enough. I already own
               | panels using these. Beyond that, much of the tech isn't
               | in the display types I care about, or isn't ready yet.
               | Which is a pity, because the tandem based displays I have
               | seen in usage are really decent.
               | 
               | That said, the latest G5 WOLEDs are the first I'd call
               | acceptable for HDR at high APL for the preferences I
               | hold, with very decent real-scene brightness, at least
               | in film. Sadly, I doubt we'll see comparable
               | performance in PC monitors until many years down the
               | track, and monitors are my preference.
        
         | munificent wrote:
         | _> The claim that Ansel Adams used HDR is super likely to cause
         | confusion_
         | 
         | That isn't what the article claims. It says:
         | 
         | "Ansel Adams, one of the most revered photographers of the 20th
         | century, was a master at capturing dramatic, high dynamic range
         | scenes."
         | 
         | "Use HDR" (your term) is vague to the point of not meaning much
         | of anything, but the article is clear that Adams was capturing
         | scenes that had a high dynamic range, which is objectively
         | true.
        
           | dahart wrote:
           | Literally the sentence preceding the one you quoted is "What
           | if I told you that analog photographers captured HDR as far
           | back as 1857?".
        
             | zymhan wrote:
             | And that quote specifically does _not_ "lump HDR capture,
             | HDR formats and HDR display together".
             | 
             | It is directly addressing _capture_.
        
               | dahart wrote:
               | Correct. I didn't say that sentence was the source of the
               | conflation, I said it was the source of the Ansel Adams
               | problem. There are other parts that mix together capture,
               | formats, and display.
               | 
               | Edit: and btw I am objecting to calling film capture
               | "HDR", I don't think that helps define HDR nor reflects
               | accurately on the history of the term.
        
               | pavlov wrote:
               | That's a strange claim because the first digital HDR
               | capture devices were film scanners (for example the
               | Cineon equipment used by the motion picture industry in
               | the 1990s).
               | 
               | Film provided a higher dynamic range than digital
               | sensors, and professionals wanted to capture that for
               | image editing.
               | 
               | Sure, it wasn't terribly deep HDR by today's standards.
               | Cineon used 10 bits per channel with the white point at
               | coding value 685 (and a log color space). That's still a
               | lot more range and superwhite latitude than you got with
               | standard 8-bpc YUV video.
        
               | dahart wrote:
               | They didn't call that "HDR" at the time, and it wasn't
               | based on the idea of recording radiance or other absolute
               | physical units.
               | 
               | I'm certain physicists had high range digital cameras
               | before Cineon, and they were working in absolute physical
               | metrics. That would be a stronger example.
               | 
               | You bring up an important point that is completely lost
               | in the HDR discussion: this is about color resolution at
               | least as much as it's about range, if not moreso. I can
               | use 10 bits for a [0..1] range just as easily as I can
               | use 4 bits to represent quantized values from 0 to 10^9.
               | Talking about the range of a scene captured is leaving
               | out most of the story, and all of the important parts.
               | We've had outdoor photography, high quality films, and
               | the ability to control exposure for a long time, and that
               | doesn't explain what "HDR" is.
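               | 
               | A toy illustration of bit depth and range being
               | independent (the numbers are arbitrary):
               | 
               |   def step(max_value, bits):
               |       # one code step in a uniform encoding
               |       return max_value / ((1 << bits) - 1)
               | 
               |   step(1.0, 10)  # ~0.001: tiny range, fine steps
               |   step(1e9, 4)   # ~6.7e7: huge range, coarse steps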
        
               | pavlov wrote:
               | It certainly was called HDR when those Cineon files were
               | processed in a linear light workflow. And film was the
               | only capture source available that could provide
               | sufficient dynamic range, so IMO that makes it "HDR".
               | 
               | But I agree that the term is such a wide umbrella that
               | almost anything qualifies. Fifteen years ago you could do
               | a bit of superwhite glows and tone mapping on 8-bpc and
               | people called that look HDR.
        
               | dahart wrote:
               | Do you have any links from 1990ish that show use of
               | "HDR"? I am interested in when "HDR" became a phrase
               | people used. I believe I remember hearing it first around
               | 1996 or 97, but it may have started earlier. It was
               | certainly common by 2001. I don't see that used as a term
               | nor an acronym in the Cineon docs from 1995, but it does
               | talk about log and linear spaces and limiting the dynamic
               | range when converting. The Cineon scanner predates sRGB,
               | and used gamma 1.7. https://dotcsw.com/doc/cineon1.pdf
               | 
               | This 10 bit scanner gave you headroom of like 30% above
               | white. So yeah it qualifies as a type of high dynamic
               | range when compared to 8 bit/channel RGB, but on the
               | other hand, a range of [0 .. 1.3] isn't exactly in the
               | spirit of what "HDR" stands for. The term implicitly
               | means a _lot_ more than 1.0, not just a little. And again
               | people developing HDR like Greg Ward and Paul Debevec
               | were arguing for absolute units such as luminance, which
               | the Cineon scanner does not do.
        
               | altairprime wrote:
               | It was called "extended dynamic range" by ILM when they
               | published the OpenEXR spec (2003):
               | 
               | > _OpenEXR (www.openexr.net), its previously proprietary
               | extended dynamic range image file format, to the open
               | source community_
               | 
               | https://web.archive.org/web/20170721234341/http://www.openex...
               | 
               | And "larger dynamic range" by Rea & Jeffrey (1990):
               | 
               | > _With g = 1 there is equal brightness resolution over
               | the entire unsaturated image at the expense of a larger
               | dynamic range within a given image. Finally, the
               | automatic gain control, AGC, was disabled so that the
               | input /output relation would be constant over the full
               | range of scene luminances._
               | 
               | https://doi.org/10.1080/00994480.1990.10747942
               | 
               | I'm not sure when everyone settled on "high" rather than
               | "large" or "extended", but certainly 'adjective dynamic
               | range' is near-universal.
        
               | dahart wrote:
               | As I remember it, Paul Debevec had borrowed Greg Ward's
               | RGBE file format at some point in the late 90s and
               | rebranded it ".hdr" for his image viewer tool (hdrView)
               | and code to convert a stack of LDR exposures into HDR. I
               | can see presentations online from Greg Ward in 2001 that
               | have slides with "HDR" and "HDRI" all over the place. So
               | yeah the term definitely must have started in the late
               | 90s if not earlier. I'm not sure it was there in the
               | early 90s though.
        
               | altairprime wrote:
               | Oo, interesting! That led me to this pair of sentences:
               | 
               | "Making global illumination user-friendly" (Ward, 1995) h
               | ttps://radsite.lbl.gov/radiance/papers/erw95.1/paper.html
               | 
               | > _Variability is a qualitative setting that indicates
               | how much light levels vary in the zone, i.e. the dynamic
               | range of light landing on surfaces._
               | 
               | > _By the nature of the situation being modeled, the user
               | knows whether to expect a high degree of variability in
               | the lighting or a low one._
               | 
               | Given those two phrases, 'a high or low degree of
               | variability in the lighting' translates as 'a high or low
               | degree of dynamic range' -- or would be likely to, given
               | human abbreviation tendencies, in successive works and
               | conversations.
        
             | munificent wrote:
             | Yes, Ansel Adams was using a camera to capture a scene that
             | had high dynamic range.
             | 
             | I don't see the confusion here.
        
               | dahart wrote:
               | HDR is not referring to the scene's range, and it doesn't
               | apply to film. It's referring superficially but
               | specifically to a digital process that improves on 8
               | bits/channel RGB images. And one of the original intents
               | behind HDR was to capture pixels in absolute physical
               | measurements like radiance, to enable a variety of post-
               | processing workflows that are not available to film.
        
               | altairprime wrote:
               | The digital process of tonemapping, aka. 'what Apple
               | calls Smart HDR processing of SDR photos to increase
               | perceptual dynamic range', can be applied to images of
               | any number of channels of any bit depth -- though, if you
               | want to tonemap a HyperCard dithered black-and-white
               | image, you'll probably have to decompile the dithering as
               | part of creating the gradient map. Neither RGB nor 8-bit
               | are necessary to make tonemapping a valuable step in
               | image processing.
        
               | dahart wrote:
               | That's true, and it's why tonemapping is distinct from
               | HDR. If you follow the link from @xeonmc's comment and
               | read the comments, the discussion centers on the
               | conflation of tonemapping and HDR.
               | 
               | https://news.ycombinator.com/item?id=43987923
               | 
               | That said, the entire reason that tonemapping is a thing,
               | and the primary focus of the tonemapping literature, is
               | to solve the problem of squeezing images with very wide
               | ranges into narrow display ranges like print and non-HDR
               | displays, and to achieve a natural look that mirrors
               | human perception of wide ranges. Tonemapping might be
               | technically independent of HDR, but they did co-evolve,
               | and that's part of the history.
        
               | munificent wrote:
               | "High dynamic range" is a phrase that is much older than
               | tone mapping. I see uses of "dynamic range" going back to
               | the 1920s and "high dynamic range" to the 1940s:
               | 
               | https://books.google.com/ngrams/graph?content=dynamic+range%...
               | 
               | You might argue that "HDR" the _abbreviation_ refers to
               | using tone mapping to approximate rendering high dynamic
               | range imagery on lower dynamic range displays. But even
               | then, the sentence in question doesn't use the
               | abbreviation. It is specifically talking about a dynamic
               | range that is high.
               | 
               | Dynamic range is a property of any signal or quantifiable
               | input, including, say sound pressure hitting our ears or
               | photons hitting an eyeball, film, or sensor.
        
               | dahart wrote:
               | > But even then, the sentence in question doesn't use the
               | abbreviation
               | 
               | Yes it does. Why are you still looking at a different
               | sentence than the one I quoted??
               | 
               | HDR in this context isn't referring to just any dynamic
               | range. If it was, then it would be so vague as to be
               | meaningless.
               | 
               | Tone mapping is closely related to HDR and very often
               | used, but is not necessary and does not define HDR. To me
               | it seems like your argument is a straw man. Photographers
               | have never broadly used the term "high dynamic range" as
               | a phrase, nor the acronym "HDR" before it showed up in
               | computer apps like hdrView, Photoshop, and iPhone camera.
        
               | munificent wrote:
               | Oh, sorry, you're right. Mentioning the abbreviation is a
               | red herring. The full quote is:
               | 
               | "But what if we don't need that tradeoff? What if I told
               | you that analog photographers captured HDR as far back as
               | 1857? Ansel Adams, one of the most revered photographers
               | of the 20th century, was a master at capturing dramatic,
               | high dynamic range scenes. It's even more incredible that
               | this was done on paper, which has even less dynamic range
               | than computer screens!"
               | 
               | It seems pretty clear to me that in this context the
               | author is referring to the high dynamic range of the
               | scenes that Adams pointed his camera at. That's why he
               | says "captured HDR" and "high dynamic range scenes".
        
               | dahart wrote:
               | > It seems pretty clear to me that in this context the
               | author is referring to the high dynamic range of the
               | scenes that Adams pointed his camera at.
               | 
               | Yes, this is the problem I have with the article. "HDR"
               | is not characterized solely by the range of the scene,
               | and never was. It's a term of art that refers to an
               | increased range (and resolution) on the capture and
               | storage side, and it's referring to a workflow that
               | involves/enables deferring exposure until display time.
               | The author's claim here is making the term "HDR" harder
               | to understand, not easier, and it's leaving out some of
               | the most important conceptual aspects. There are some
               | important parallels between film and digital HDR, and
               | there are some important differences. The differences are
               | what make claiming that nineteenth century photographers
               | were capturing HDR problematic and inaccurate.
        
               | samplatt wrote:
               | To further complicate the issue, "high dynamic range" is
               | a phrase that will come up across a few different
               | disciplines, not just related to the capture &
               | reproduction of visual data.
        
           | PaulHoule wrote:
           | I think about the Ansel Adams zone system
           | 
           | https://www.kimhildebrand.com/how-to-use-the-zone-system/
           | 
           | where my interpretation is colored by the experience of
           | making high quality prints and viewing them under different
           | conditions, particularly poor illumination quality but you
           | could also count "small handheld game console", "halftone
           | screened and printed on newsprint" as other degraded
           | conditions. In those cases you might imagine that the eye can
           | only differentiate between 11 tones so even if an image has
           | finer detail it ought to connect well with people if colors
           | were quantized. (I think about concept art from _Pokemon Sun
           | and Moon_ which looked great printed with a thermal printer
           | because it was designed to look great on a cheap screen.)
           | 
           | In my mind, the ideal image would look good quantized to 11
           | zones but also has interesting detail in texture in 9 of the
           | zones (extreme white and black don't show texture). That's a
           | bit of an oversimplification (maybe a shot outdoors in the
           | snow is going to trend really bright, maybe for artistic
           | reasons you want things to be really dark, ...) but Ansel
           | Adams manually "tone mapped" his images using dodging,
           | burning and similar techniques to make it so.
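           | 
           | As a toy sketch of that kind of quantization (11 zones per
           | the zone system; the function is just illustrative):
           | 
           |   import numpy as np
           | 
           |   def posterize_to_zones(gray, zones=11):
           |       # quantize a [0, 1] grayscale image to fixed tones
           |       levels = zones - 1
           |       return np.round(gray * levels) / levels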
        
         | xeonmc wrote:
         | > It seems like a mistake to lump HDR capture, HDR formats and
         | HDR display together
         | 
         | Reminded me of the classic "HDR in games vs HDR in photography"
         | comparison[0]
         | 
         | [0] https://www.realtimerendering.com/blog/thought-for-the-day/
        
         | levidos wrote:
         | Is there a difference in capturing in HDR vs RAW?
        
           | dahart wrote:
           | Good question. I think it depends. They are kind of different
           | concepts, but in practice they can overlap considerably. RAW
           | is about using the camera's full native color resolution, and
           | not having lossy compression. HDR is overloaded, as you can
           | see from the article & comments, but I think HDR capture is
           | conceptually about expressing brightness in physical units
           | like luminance or radiance, and delaying the 'exposure' until
           | display time. Both RAW and HDR typically mean using more than
           | 8 bits/channel and capturing high quality images that will
           | withstand more post-processing than 'exposed' LDR images can
           | handle.
        
         | sandofsky wrote:
         | > It seems like a mistake to lump HDR capture, HDR formats and
         | HDR display together, these are very different things.
         | 
         | These are all related things. When you talk about color, you
         | can be talking about color cameras, color image formats, and
         | color screens, but the concept of color transcends the
         | implementation.
         | 
         | > The claim that Ansel Adams used HDR is super likely to cause
         | confusion, and isn't particularly accurate.
         | 
         | The post never said Adams used HDR. I very carefully chose the
         | words, "capturing dramatic, high dynamic range scenes."
         | 
         | > Previously when you took a photo, if you over-exposed it or
         | under-exposed it, you were stuck with what you got. Capturing
         | HDR gives the photographer one degree of extra freedom,
         | allowing them to adjust exposure after the fact.
         | 
         | This is just factually wrong. Film negatives have 12 stops
         | useful dynamic range, while photo paper has 8 stops at best.
         | That gave photographers exposure latitude during the print
         | process.
         | 
         | > Ansel Adams wasn't using HDR in the same sense we're talking
         | about, he was just really good at capturing the right exposure
         | for his medium without needing to adjust it later.
         | 
         | There's a photo of Ansel Adams in the article, dodging and
         | burning a print. How would you describe that if not adjusting
         | the exposure?
        
           | dahart wrote:
           | I agree capture, format and display are closely related. But
           | HDR capture and processing specifically developed outside of
           | HDR display devices, and use of HDR displays changes how HDR
           | images are used compared to LDR displays.
           | 
           | > The post never said Adams used HDR. I very carefully chose
           | the words
           | 
           | Hey I'm sorry for criticizing, but I honestly feel like
           | you're being slightly misleading here. The sentence "What if
           | I told you that analog photographers captured HDR as far back
           | as 1857?" is explicitly claiming that analog photographers
           | use "HDR" capture, and the Ansel Adams sentence that follows
           | appears to be merely a specific example of your claim. The
           | result of the juxtaposition is that the article did in fact
           | claim Adams used HDR, even if you didn't quite intend to.
           | 
           | I think you're either misunderstanding me a little, or maybe
           | unaware of some of the context of HDR and its development as
           | a term of art in the computer graphics community. Film's 12
           | stops is not really "high" range by HDR standards, and a
           | little exposure latitude isn't where "HDR" came from. The
           | more important part of HDR was the intent to push toward
           | absolute physical units like luminance. That doesn't just
           | enable deferred exposure, it enables physical and perceptual
           | processing in ways that aren't possible with film. It enables
           | calibrated integration with CG simulation that isn't possible
           | with film. And it enables a much wider range of exposure
           | push/pull than you can do when going from 12 stops to 8. And
           | of course non-destructive digital deferred exposure at
           | display time is quite different from a print exposure.
           | 
           | Perhaps it's useful to reflect on the fact that HDR has a
           | counterpart called LDR that's referring to 8 bits/channel
           | RGB. With analog photography, there is no LDR, thus zero
           | reason to invent the notion of a 'higher' range. Higher than
           | what? High relative to what? Analog cameras have exposure
           | control and thus can capture any range you want. There is no
           | 'high' range in analog photos, there's just range. HDR was
           | invented to push against and evolve beyond the de-facto
           | digital practices of the 70s-90s, it is not a statement about
           | what range can be captured by a camera.
        
             | sandofsky wrote:
             | > The sentence "What if I told you that analog
             | photographers captured HDR as far back as 1857?" is
             | explicitly claiming that analog photographers use "HDR"
             | capture,
             | 
             | No, it isn't. It's saying they captured HDR scenes.
             | 
             | > The result of the juxtaposition is that the article did
             | in fact claim Adams used HDR
             | 
             | You can't "use" HDR. It's an adjective, not a noun.
             | 
             | > Film's 12 stops is not really "high" range by HDR
             | standards, and a little exposure latitude isn't where "HDR"
             | came from.
             | 
             | The Reinhard tone mapper, a benchmark that regularly
             | appears in research papers, specifically cites Ansel Adams
             | as inspiration.
             | 
             | "A classic photographic task is the mapping of the
             | potentially high dynamic range of real world luminances to
             | the low dynamic range of the photographic print."
             | 
             | https://www-old.cs.utah.edu/docs/techreports/2002/pdf/UUCS-0...
             | 
             | > Perhaps it's useful to reflect on the fact that HDR has a
             | counterpart called LDR that's referring to 8 bits/channel
             | RGB.
             | 
             | 8-bits per channel does not describe dynamic range. If I
             | attach an HLG transfer function on an 8-bit signal, I have
             | HDR. Furthermore, assuming you actually meant 8-bit sRGB,
             | nobody calls that "LDR." It's SDR.
             | 
             | > Analog cameras have exposure control and thus can capture
             | any range you want.
             | 
             | This sentence makes no sense.
        
               | dahart wrote:
               | Sorry man, you seem really defensive, I didn't mean to
               | put you on edge. Okay, if you are calling the scenes
               | "HDR" then I'm happy to rescind my critique about Ansel
               | Adams and switch instead to pointing out that "HDR"
               | doesn't refer to the range of the scene, it refers to the
               | range _capability_ of a digital capture process. I think
               | the point ultimately ends up being the same either way.
               | Hey where is HDR defined as an adjective? Last time I
               | checked, "range" could be a noun, I think... no? You must
               | be right, but FWIW, you used HDR as a noun in your 2nd to
               | last point... oh and in the title of your article too.
               | 
               | Hey it's great Reinhard was inspired by Adams. I have
               | been too, like a lot of photographers. And I've used the
               | Reinhard tone mapper in research papers, I'm quite
               | familiar with it and personally know all three authors of
               | that paper. I've even written a paper or maybe two on
               | color spaces with one of them. Anyway, the inspiration
               | doesn't change the fact that 12 stops isn't particularly
               | high dynamic range. It's _barely_ more than SDR. Even the
               | earliest HDR formats had like 20 or 30 stops, in part
               | because the point was to use physical luminance _instead_
               | of a relative [0..1] range.
               | 
               | 8 bit RGB does sort-of in practice describe a dynamic
               | range, as long as the 1 bit difference is approximately
               | the 'just noticeable difference' or JND as some
               | researchers call it. This happens to line up with 8 bits
               | being about 8 stops, which is what RGB images have been
               | doing for like 50 years, give or take. While it's
               | perfectly valid arithmetic to use 8-bit values to
               | represent an arbitrary amount like 200 stops or 0.003
               | stops, it'd be pretty weird.
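               | 
               | Back-of-envelope sketch of that lineup, assuming an
               | illustrative ~2% Weber-fraction JND per code step:
               | 
               |     import math
               |     # 255 steps, each ~2% brighter than the last:
               |     math.log2(1.02 ** 255)   # ~7.3, i.e. roughly 8 stops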
               | 
               | Plenty of people have called and continue to call 8 bit
               | images "LDR", here's just three of the thousands of uses
               | of "LDR" [1][2][3], and LDR predates usage of SDR by like
               | 15 years maybe? LDR predates sRGB too, I did not actually
               | mean 8 bit sRGB. LDR and SDR are close but not quite the
               | same thing, so feel free to read up on LDR. It's
               | disappointing you ducked the actual point I was making,
               | which is still there even if you replace LDR with SDR.
               | 
               | What is confusing about the sentence about analog cameras
               | and exposure control? I'm happy to explain it since you
               | didn't get it. I was referring to how the aperture can be
               | adjusted on an analog camera to make a scene with any
               | dynamic range fit into the ~12 stops of range the film
               | has, or the ~8 stops of range of paper or an old TV. I
               | was just trying to clarify _why_ HDR is an attribute of
               | digital images, and not of scenes.
               | 
               | [1] https://www.easypano.com/showkb_228.html#:~:text=The%
               | 20Dynam...
               | 
               | [2] https://www.researchgate.net/figure/shows-digital-
               | photograph...
               | 
               | [3] https://irisldr.github.io/
        
               | sandofsky wrote:
               | You opened this thread arguing that Ansel Adams didn't
               | "use HDR." I linked you to a seminal research paper which
               | argues that he tone mapped HDR content, and goes on to
               | implement a tone mapper based on his approach. This all
               | seems open and shut.
               | 
               | > I'm happy to rescind my critique about Ansel Adams
               | 
               | Great, I'm done.
               | 
               | > and switch instead to pointing out that "HDR" doesn't
               | refer to the range of the scene
               | 
               | Oh god. Here's the first research paper that popped into
               | my head: https://static.googleusercontent.com/media/
               | hdrplusdata.org/e...
               | 
               | "Surprisingly, daytime shots with high dynamic range may
               | also suffer from lack of light."
               | 
               | "In low light, or in very high dynamic range scenes"
               | 
               | "For high dynamic range scenes we use local tone mapping"
               | 
               | You keep trying to define "HDR" differently than current
               | literature. Not even current-- that paper was published
               | in 2016! Hey, maybe HDR meant something different in the
               | 1990s, or maybe it was just ok to use "HDR" as shorthand
               | for when things were less ambiguous. I honestly don't
               | care, and you're only serving to confuse people.
               | 
               | > the aperture can be adjusted on an analog camera to
               | make a scene with any dynamic range fit into the ~12
               | stops of range the film has, or the ~8 stops of range of
               | paper or an old TV.
               | 
               | You sound nonsensical because you keep using the wrong
               | terms. Going back to your first sentence that made no
               | sense:
               | 
               | > Analog cameras have exposure control and thus can
               | capture any range you want
               | 
               | You keep saying "range" when, from what I can tell, you
               | mean "luminance." Changing a camera's aperture scales the
               | luminance hitting your film or sensor. It does not alter
               | the dynamic range of the scene.
               | 
               | Analog cameras cannot capture any range. By adjusting
               | camera settings or attaching ND filters, you can change
               | the window of luminance values that will fit within the
               | dynamic range of your camera. To say a camera can
               | "capture any range" is like saying, "I can fit that couch
               | through the door, I just have to saw it in half."
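               | 
               | In code terms, a sketch of the distinction (numbers are
               | illustrative): exposure scales the luminance, but the
               | window's width in stops stays fixed.
               | 
               |     def film_response(nits, exposure, dr_stops=12):
               |         x = nits * exposure          # aperture/ND scaling
               |         lo, hi = 1.0, 2 ** dr_stops  # fixed-width window
               |         return min(max(x, lo), hi)   # clips, never widens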
               | 
               | > And I've used the Reinhard tone mapper in research
               | papers, I'm quite familiar with it and personally know
               | all three authors of that paper. I've even written a
               | paper or maybe two on color spaces with one of them.
               | 
               | I'm sorry if correcting you triggers insecurities, but if
               | you're going to make an appeal to authority, please link
               | to your papers instead of hand waving about the people
               | you know.
        
               | dahart wrote:
               | Hehe outside is "HDR content"? To me that still comes off
               | as confused about what HDR is. I know you aren't, but
               | that's what it sounds like. A sunny day has a high
               | dynamic range for sure, but the acronym HDR is a term of
               | art that implies more than that. Your article even
               | explains why.
               | 
               | Tone mapping doesn't imply HDR. Tone mapping is always
               | present, even in LDR and SDR workflows. The paper you
               | cited explicitly notes the idea is to "extend" Adams'
               | zone system to very high dynamic range digital images,
               | more than what Adams was working with, by implication.
               | 
               | So how is a "window of luminance values" different from a
               | dynamic range, exactly? Why did you make the incorrect
               | and obviously silly assumption that I was suggesting a
               | camera's aperture changes the outdoor scene's dynamic
               | range rather than what I actually said, that it changes
               | the exposure? Your description of what a camera does is
               | functionally identical. I'm kinda baffled as to why
               | you're arguing this part that we both understand, using
               | hyperbole.
               | 
               | I hope you have a better day tomorrow. Good luck with
               | your app. This convo aside, I am honestly rooting for
               | you.
        
           | smogcutter wrote:
           | > Film negatives have 12-stops of useful dynamic range
           | 
           | No, that's not inherently true. AA used 12 zones, but that
           | doesn't mean every negative stock has 12 stops of latitude.
           | Stocks are different, you need to look at the curves.
           | 
           | But yes most modern negatives are very forgiving. FP4 for
           | example has barely any shoulder at all iirc.
        
         | Sharlin wrote:
         | No, Adams, like everyone who develops their own film (or RAW
         | digital photos) definitely worked in HDR. Film has much more DR
         | than photographic paper, as noted by TFA author (and large
         | digital sensors more than either SDR or HDR displays)
         | _especially_ if you're such a master of exposure as Adams;
         | trying to preserve the tonalities when developing and printing
         | your photos is the real big issue.
        
         | pixelfarmer wrote:
         | If I look at one of the photography books in my shelf, they are
         | even talking about 18 stops and such for some film material,
         | and how this doesn't translate to paper and all the things that
         | can be done to render it visible in print and how things behave
         | at both extreme ends (towards black and white). Read: Tone-
         | mapping (i.e. trimming down a high DR image to a lower DR
         | output medium) is really old.
         | 
         | The good thing about digital is that it can deal with color at
         | decent tonal resolutions (if we assume 16 bits, not the limited
         | 14 bit or even less) and in environments where film has
         | technical limitations.
        
       | miladyincontrol wrote:
       | I do prefer the gain map approach myself. Do a sensible '32 bit'
       | HDR edit, further tweak an SDR version, and export a file that
       | has both.
       | 
       | Creative power is still in your hands versus some tone mapper's
       | guesses at your intent.
       | 
       | Can people go overboard? Sure, but that's something they will do
       | regardless of any HDR or lack thereof.
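       | 
       | For anyone curious, the decode side is roughly this (a
       | simplified sketch of the gain map idea; real files carry
       | min/max log2 gain and offsets in metadata):
       | 
       |     import numpy as np
       | 
       |     def apply_gain_map(sdr_linear, gain, headroom, gain_max=2.0):
       |         # gain: per-pixel gain map values in [0, 1]
       |         # headroom: display headroom above SDR white, in stops
       |         w = np.clip(headroom / gain_max, 0.0, 1.0)
       |         return sdr_linear * np.exp2(gain * gain_max * w)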
       | 
       | As an aside, it's still rough that just about every site that
       | touches gain-map HDR images (adaptive HDR, as this blog calls
       | them) will lose that metadata if it needs to scale, recompress,
       | or otherwise transform the images. It's led me to make my own
       | site, but also to handle what files a client gets a bit more
       | smartly. For instance, if a browser doesn't support .jxl or
       | .avif images, I'm sure it won't want an HDR JPEG either; that's
       | easy to handle on a webserver.
        
       | ddingus wrote:
       | Well crap, I had written a thank you, which I will gladly write
       | again:
       | 
       | I love when product announcements and ads in general are high
       | value works. This one was good education for me. Thank you for
       | it!
       | 
       | I had also written about my plasma and CRT displays and how
       | misunderstandings about HDR made things generally worse and how I
       | probably have not seen the best these 10 bit capable displays can
       | do.
       | 
       | And finally, I had written about 3D TV and how fast, at least
       | 60Hz per eye, 3D in my home made for amazing modeling and
       | assembly experiences! I was very sad to see that tech dead end.
       | 
       | 3D for technical content creation has a lot of legs... if only more
       | people could see it running great...
       | 
       | Thanks again. I appreciate the education.
        
       | Uncorrelated wrote:
       | I find the default HDR (as in gain map) presentation of iPhone
       | photos to look rather garish, rendering highlights too bright and
       | distracting from the content of the images. The solution I came
       | up with for my own camera app was to roll off and lower the
       | highlights in the gain map, which results in final images that I
       | find way more pleasing. This seems to be somewhat similar to what
       | Halide is introducing with their "Standard" option for HDR.
       | 
       | Hopefully HN allows me to share an App Store link... this app
       | works best on Pro iPhones, which support ProRAW, although I do
       | some clever stuff on non-Pro iPhones to get a more natural look.
       | 
       | https://apps.apple.com/us/app/unpro-camera/id6535677796
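       | 
       | The roll-off itself is nothing exotic; conceptually it's a soft
       | knee like this (illustrative, not my app's exact curve):
       | 
       |     def rolloff(x, knee=0.7):
       |         # linear below the knee, smoothly compressed above it
       |         if x <= knee:
       |             return x
       |         t = (x - knee) / (1.0 - knee)
       |         return knee + (1.0 - knee) * t / (1.0 + t)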
        
         | springhalo wrote:
         | I see that it says it stores images "in an HDR format by
         | default" but keeps referencing JPEG output. Are you using JPEG-
         | XT? There aren't a lot of "before and after" comparisons so
         | it's hard to know how much it's taking out. I figure those
         | would probably hurt the reputation of the app considering its
         | purpose is to un-pop photos, but I'm in the boat of not really
         | being sure whether I do actually like the pop or not. Is there
         | live-photo support, or is that something that you shouldn't
         | expect from an artist-focused product?
        
           | Uncorrelated wrote:
           | It's a JPEG + gain map format where the gain map is stored in
           | the metadata. Same thing, as far as I can tell, that Halide
           | is now using. It's what the industry is moving towards; it
           | means that images display well on both SDR and HDR displays.
           | I don't know what JPEG-XT is, aside from what I just skimmed
           | on the Wikipedia page.
           | 
           | Not having before-and-after comparisons is mostly down to my
           | being concerned about whether that would pass App Review; the
           | guidelines indicate that the App Store images are supposed to
           | be screenshots of the app, and I'm already pushing that rule
           | with the example images for filters. I'm not sure a hubristic
           | "here's how much better my photos are than Apple's" image
           | would go over well. Maybe in my next update? I should at
           | least have some comparisons on my website, but I've been bad
           | at keeping that updated.
           | 
           | There's no Live Photo support, though I've been thinking
           | about it. The reason is that my current iPhone 14 Pro Max
           | does not support Live Photos while shooting in 48-megapixel
           | mode; the capture process takes too long. I'd have to come up
           | with a compromise such as only having video up to the moment
           | of capture. That doesn't prevent me from implementing it for
           | other iPhones/cameras/resolutions, but I don't like having
           | features unevenly available.
        
       | npteljes wrote:
       | I have similar feeling with HDR to what I have with movie audio
       | mixing. The range is just too big. I think this distaste has also
       | amplified in me with age. I appreciate content that makes use of
       | the limited space it gets, and also algorithms that compress the
       | range for me a bit. KMPlayer for example has this on for volume
       | by default, and it makes home movie watching more comfortable, no
       | doubt sacrificing artistic vision in the process. I feel a bit
       | the same with the loudness war - and maybe a lot of other people
       | feel the same too, seeing how compressed audio got. At the very
       | least they don't mind much.
       | 
       | I really appreciate the article. I could feel that they also have
       | a product to present, because of the many references, but it was
       | also very informative besides that.
        
       | caseyy wrote:
       | The dad's photo at the end in SDR looks so much better on a
       | typical desktop IPS panel (Windows 11). The HDR photo looks like
       | the brightness is smushed in the most awful way. On an iPhone,
       | the HDR photo is excellent and the others look muted.
       | 
       | I wonder if there's an issue in Windows tonemapping or HDR->SDR
       | pipeline, because perceptually the HDR image is really off.
       | 
       | It's more off than if I took an SDR picture of my iPhone showing
       | the HDR image and showed that SDR picture on said Windows
       | machine with an IPS panel. Which tells me that the manual
       | HDR->SDR "pipeline" I just described is better.
       | 
       | I think Windows showing HDR content on a non-HDR display should
       | just pick an SDR-sized section of that long dynamic range and
       | show it normally. Without trying to remap the entire large range
       | to a smaller one. Or it should do some other perceptual
       | improvements.
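       | 
       | Roughly, the difference between the two strategies (illustrative
       | Python; 203 nits is the usual SDR reference white):
       | 
       |     def clip_window(nits, sdr_white=203.0):
       |         # show one SDR-sized slice "normally", clip the rest
       |         return min(nits, sdr_white) / sdr_white
       | 
       |     def compress_all(nits, peak=1000.0):
       |         # remap the whole range into [0, 1]; midtones dim
       |         return (nits / peak) ** 0.5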
       | 
       | Then again, I know professionally that Windows HDR is complicated
       | and hard to tame. So I'm not really sure why they remap as they
       | do; maybe it's the only way in some contingency/rare scenario.
        
       | fracus wrote:
       | The article starts by saying HDR can mean different things and
       | gives Apple's HDR vs new TV "HDR" advertising but doesn't explain
       | at all what the TV HDR means and how it is different, unless I
       | missed something. I always assumed they were the same thing.
        
       | flkiwi wrote:
       | Did I miss something or is the tone mapper not released yet? I
       | admit I'm multitasking here, but I just have an exposure slider
       | in the image lab (or whatever it's called).
       | 
       | Sidebar: I kinda miss when Halide's driving purpose was rapid
       | launch and simplicity. I would almost prefer a zoom function to
       | all of this HDR gymnastics (though, to be clear, Halide is my
       | most-used and most-liked camera app).
       | 
       | EDIT: Ah, I see, it's a Mark III feature. That is not REMOTELY
       | clear in the (very long) post.
        
       | neves wrote:
       | These HDR controls in their app Halide are interesting. Does
       | Android have similar apps?
        
       | mightysashiman wrote:
       | I have a question: how can I print HDR? Is there any HDR printer
       | + paper + display lighting setup?
       | 
       | My hypotheses are the following:
       | 
       | - Increase display lighting to increase peak white point + use a
       | black ink able to absorb more light (can Vantablack-style
       | pigments be made into ink?) => increase dynamic range of a
       | printable picture
       | 
       | - Alternatively, have the display lighting include visible light
       | + invisible UV light, and have the printed picture include an
       | invisible layer of UV ink that shines white : the pattern printed
       | in invisible UV-ink would be the "gain map" to increase the peak
       | brightness past incident visible light into HDR range.
       | 
       | What do you folks think?
        
         | redox99 wrote:
         | You would need a very strong light surrounded by a darkish
         | environment. I don't see a point in "HDR Paper" if you need to
         | place it under a special light.
        
           | mightysashiman wrote:
           | Use case: a photo exhibition that wants to show pictures in
           | HDR = an HDR print setup (i.e. paper + appropriately
           | controlled lighting).
           | 
           | Haven't you ever been to a photo exhibition?
        
         | grumbel wrote:
         | Use projection mapping: instead of lighting the photo
         | uniformly, use a projector to project a copy of the image onto
         | the photo; that gives you detailed control over how much light
         | each part of the image receives.
         | 
         | Alternatively, use transparent film and a bright backlight.
        
       | labadal wrote:
       | Back in university I implemented a shoddy HDR for my phone
       | camera.
       | 
       | The hardest part of it, by far, was taking hundreds upon hundreds
       | of pictures of a blank piece of paper in different lighting
       | conditions with different settings.
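       | 
       | (Those paper shots calibrate the response curve that linearizes
       | the frames; the merge step afterward is the easy part. A sketch
       | of a standard weighted merge of linearized brackets:)
       | 
       |     import numpy as np
       | 
       |     def merge_exposures(images, times):
       |         # images: linearized frames in [0, 1]; times: seconds
       |         acc = np.zeros_like(images[0])
       |         wsum = np.zeros_like(images[0])
       |         for img, t in zip(images, times):
       |             # weight midtones highest, ignore near-clipped pixels
       |             w = 1.0 - np.abs(img - 0.5) * 2.0
       |             acc += w * img / t
       |             wsum += w
       |         return acc / np.maximum(wsum, 1e-6)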
        
       | ByteAtATime wrote:
       | In my opinion, HDR is another marketing gimmick -- the average
       | layman has no idea what it means, but it sounds fancy and is more
       | expensive, so surely it's good.
        
       | kristianp wrote:
       | A related article on dpreview about Sigma's HDR map in JPEGs has
       | some tasteful HDR photos. You have to click/tap the photo to see
       | the effect.
       | 
       | https://www.dpreview.com/news/7452255382/sigma-brings-hdr-br...
        
       | chaboud wrote:
       | Having come from professional video/film tooling in the 90's to
       | today, it's interesting to see the evolution of what "HDR" means.
       | I used to be a purist in this space, where SDR meant ~8 stops
       | (powers of two) or less of dynamic range, and HDR meant 10+.
       | Color primaries and transfer function mapping were things I spoke
       | specifically about. At this point, though, folks use "HDR" to
       | refer to combinations of things.
       | 
       | Around this, a bunch of practical tooling surfaced (e.g., hybrid
       | log approaches to luminance mapping) to extend the thinking from
       | 8-bit gamma-mapped content presenting ~8 stops of dynamic range
       | to where we are now. If we get away from just trying to label
       | everyting "HDR", there are some useful things people should
       | familiarize with:
       | 
       | 1. Color primaries: examples - SDR: Rec. 601, Rec. 709, sRGB.
       | HDR: Rec. 2020, DCI-P3. The new color primaries expand the
       | chromatic representation capabilities. This is pretty easy to
       | wrap our heads around: https://en.wikipedia.org/wiki/Rec._2020
       | 
       | 2. Transfer functions: examples - SDR: sRGB, BT.1886. HDR: Rec.
       | 2100 Perceptual Quantizer (PQ), HLG. The big thing in this space
       | to care about is that SDR transfer functions had reference peak
       | luminance but were otherwise _relative_ to that peak luminance.
       | By contrast, Rec. 2100 PQ code points are absolute, in that each
       | code value has a _defined meaning_ in measurable luminance, per
       | the PQ EOTF transfer function. This is a _big_ departure from our
       | older SDR universe and from Hybrid Log Gamma approaches.
       | 
       | 3. Tone mapping: In SDR, we had the comfort of camera and display
       | technologies roughly lining up in the video space, so living in a
       | gamma/inverse-gamma universe was fine. We just controlled the
       | eccentricity of the curve. Now, with HDR, we have formats that
       | can carry tone-mapping information and transports (e.g., HDMI)
       | that can bidirectionally signal display target capabilities,
       | allowing things like source-based tone mapping. Go digging into
       | HDR10+, Dolby Vision, or HDMI SBTM for a deep rabbit hole.
       | https://en.wikipedia.org/wiki/Tone_mapping
       | 
       | So HDR is everything (and nothing), but it's definitely
       | important. If I had to emphasize one thing that is non-obvious to
       | most new entrants into the space, it's that there are elements of
       | description of color and luminance that are _absolute_ in their
       | meaning, rather than relative. That's a substantial shift. Extra
       | points for figuring out that practical adaptation to display
       | targets is built into formats and protocols.
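       | 
       | To make the "absolute" point concrete, here is the PQ EOTF
       | (SMPTE ST 2084) as a small Python sketch; every code value
       | decodes to a specific luminance, independent of the display:
       | 
       |     m1 = 2610 / 16384        # 0.1593017578125
       |     m2 = 2523 / 4096 * 128   # 78.84375
       |     c1 = 3424 / 4096         # 0.8359375
       |     c2 = 2413 / 4096 * 32    # 18.8515625
       |     c3 = 2392 / 4096 * 32    # 18.6875
       | 
       |     def pq_eotf(e):
       |         # PQ-encoded value in [0, 1] -> luminance in cd/m^2
       |         p = e ** (1 / m2)
       |         return 1e4 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)
       | 
       |     pq_eotf(0.5)   # ~92 nits, near SDR reference white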
        
       | squidsoup wrote:
       | As a photographer, one of the things that draws me to the work of
       | artists like Bill Brandt and Daido Moriyama is their use of low
       | dynamic range, and high contrast. I rarely see an HDR image that
       | is aesthetically interesting.
        
       | Terr_ wrote:
       | > Our eyes can see both just fine.
       | 
       | This gets to a gaming rant of mine: Our natural vision can handle
       | these things because our eyes scan sections of the scene with
       | constant adjustment (light-level, focus) while our brain is
       | compositing it together into what feels like a single moment.
       | 
       | However, certain effects in games (e.g. "HDR" and depth of field)
       | instead _reduce_ the fidelity of the experience. These features
       | limp along only while our gaze is aimed at the exact spot the
       | software expects. If you glance anywhere else around the scene,
       | you instead perceive an _unrealistically wrong_ coloration or
       | blur that frustratingly persists no matter how much you squint.
       | These problems will remain until gaze-tracking support becomes
       | standard.
       | 
       | So ultimately these features _reduce_ the realism of the
       | experience. They make it less like _being there_ and more like
       | you're watching a second-hand movie recorded on flawed video-
       | cameras. This distinction is even clearer if you consider cases
       | where "film grain" is added.
        
         | glenngillen wrote:
         | I had a similar complaint with the few 3D things I watched when
         | it was hyped in the past (e.g., when Avatar came out in
         | cinemas, and when 3D home TVs seemed to briefly become a thing
         | 15 years ago). It felt like Hollywood was giving me the freedom
         | to immerse myself, but then simultaneously trying to constrain
         | that freedom and force me to look at specific things in
         | specific ways. I don't know what the specific solution is, but
         | it struck me that we needed to be adopting lessons from live
         | stage productions more than cinema if you really want people to
         | think what they're seeing is real.
        
           | pfranz wrote:
           | Stereo film has its own limitations. Sadly, shooting for
           | stereo was expensive and often corners were cut just to get
           | it to show up in a theater where they could charge a premium
           | for a stereo screening. Home video was always a nightmare--
           | nobody wants to wear glasses (glassesless stereo TVs had a
           | _very_ narrow viewing angle).
           | 
           | It may not be obvious, but film has a visual language. If you
           | look at early film, it wasn't obvious if you cut to something
           | that the audience would understand what was going on. Panning
           | from one object to another implies a connection. It's built
           | on the visual language of still photography (things like rule
           | of thirds, using contrast or color to direct your eye, etc).
           | All directing your eye.
           | 
           | Stereo film has its own limitations that were still being
           | explored. In a regular film, you would do a rack focus to
           | connect something in the foreground to the background. In
           | stereo, when there's a rack focus people don't follow the
           | camera the same way. In regular film, you could show
           | someone's back in the foreground of a shot and cut them off
           | at the waist. In stereo, that looks weird.
           | 
           | When you're presenting something you're always directing
           | where someone is looking--whether it's a play, movie, or
           | stereo show. The tools are just adapted for the medium.
           | 
           | I do think it worked way better for movies like Avatar or How
           | to Train Your Dragon and was less impressive for things like
           | rom coms.
        
         | agrnet wrote:
         | This is why I always turn off these settings immediately when I
         | turn on any video game for the first time. I could never put my
         | finger on why I didn't like it, but the camera analogy is
         | perfect
        
         | brokenmachine wrote:
         | I'm with you on depth of field, but I don't understand why you
         | think HDR reduces the fidelity of a game.
         | 
         | If you have a good display (eg an OLED) then the brights are
         | brighter and simultaneously there is more detail in the blacks.
         | Why do you think that is worse than SDR?
        
           | pfranz wrote:
           | Check out this old post:
           | https://www.realtimerendering.com/blog/thought-for-the-day/
           | 
           | HDR in games would frequently mean clipping highlights and
           | adding bloom. Prior to the "HDR", exposure looked rather flat.
        
             | brokenmachine wrote:
             | OK, so it doesn't mean real HDR but simulated HDR.
             | 
             | Maybe when proper HDR support becomes mainstream in 3D
             | engines, that problem will go away.
        
               | orthoxerox wrote:
               | It's HDR at the world data level, but SDR at the
               | rendering level. It's simulating the way film cannot
               | handle real-life high dynamic range and clips it instead
               | of compressing it like "HDR" in photography.
        
               | nomel wrote:
               | > Instead of compressing it like "HDR" in photography
               | 
               | That's not HDR either, that's tone mapping to SDR. The
               | entire point of HDR is that you _don't need_ to compress
               | it because your display can actually make use of the
               | extra bits of information. Most modern phones take true
               | HDR pictures that look great on an HDR display.
        
             | majormajor wrote:
             | That's not what it means since 2016 or so when consumer TVs
             | got support for properly displaying brighter whites and
             | colors.
             | 
             | It definitely adds detail now, and for the last 8-9 years.
             | 
             | Though consumer TVs obviously still fall short of being as
             | bright at peak as the real world. (We'll probably never
             | want our TV to burn out our vision like the sun, but
             | hitting highs at least in the 1-2000 nit range, vs the
             | 500-700 that a lot peak at right now, would be nice for
             | most uses.)
        
           | Sharlin wrote:
           | The "HDR" here is in the sense of "tone mapping to SDR".
           | Should also be said that even "H"DR displays only have a
           | stop or two more range, still much less than a real-world
           | high-contrast scene.
        
             | brokenmachine wrote:
             | It's still better though.
             | 
             | HDR displays are >1000nits while SDR caps out at less than
             | 500nits even on the best displays.
             | 
             | Eg for the Samsung s90c, HDR is 1022 nits, SDR is 487 nits:
             | https://www.rtings.com/tv/reviews/samsung/s90c-oled#test_608
             | https://www.rtings.com/tv/reviews/samsung/s90c-oled#test_4
             | 
             | Double the range is undeniably still better.
             | 
             | And also 10bit instead of 8bit, so less posterization as
             | well.
             | 
             | Just because the implementations have been subpar until now
             | doesn't mean it's worthless tech to pursue.
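             | 
             | (In stops: log2(1022 / 487) ~ 1.07, so "double the range"
             | buys about one extra stop of peak brightness, consistent
             | with the "stop or two" figure upthread.)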
        
         | pfranz wrote:
         | https://www.realtimerendering.com/blog/thought-for-the-day/
         | 
         | It's crazy that post is 15 years old. Like the OP and this post
         | get at, HDR isn't really a good description of what's
         | happening. HDR often means one or more of at least 3 different
         | things (capture, storage, and presentation). It's just the
         | sticker slapped on advertising.
         | 
         | Things like lens flares, motion blur, film grain, and shallow
         | depth of field are mimicking cameras and not what being there
         | is like--but from a narrative perspective we experience a lot
         | of these things through TV and film. It's visual shorthand. Like
         | Star Wars or Battlestar Galactica copying WWII dogfight footage
         | even though it's less like what it would be like if you were
         | there. High FPS television can feel cheap while 24fps can feel
         | premium and "filmic."
         | 
         | Often those limitations are in place so the experience is
         | consistent for everyone. Games will have you set brightness and
         | contrast--I had friends that would crank everything up to avoid
         | jump scares and to clearly see objects intended to be hidden in
         | shadows. Another reason for consistent presentation is to
         | prevent unfair advantages in multiplayer.
        
           | wickedsight wrote:
           | > the poster found it via StumbleUpon.
           | 
           | Such a blast from the past, I used to spend so much time just
           | clicking that button!
        
           | arghwhat wrote:
           | > Things like lens flares, motion blur, film grain, and
           | shallow depth of field are mimicking cameras and not what
           | being there is like
           | 
           | Ignoring film grain, our vision has all these effects all the
           | same.
           | 
           | Look in front of you and only a single plane will be in focus
           | (and only your fovea produces any sort of legibility). Look
           | towards a bright light and you might get flaring from just
           | your eyes. Stare out the side of a car or train when driving
           | at speed and you'll see motion blur, interrupted only by
           | brief clarity if you intentionally try to follow the motion
           | with your eyes.
           | 
           | Without depth of field simulation, the whole scene is just a
           | flat plane with completely unrealistic clarity, and because
           | it's comparatively small, too much of it is smack center on
           | your fovea. The problem is that these are simulations that do
           | not track your eyes, and make the (mostly valid!) assumption
           | that you're looking nearby or in front of whatever you're
           | controlling.
           | 
           | Maybe motion blur becomes unnecessary given a high enough
           | resolution and refresh rate, but depth of field either
           | requires actual depth or foveal tracking (which only works
           | for one person). Tasteful application of current techniques
           | is probably better.
           | 
           | > High FPS television can feel cheap while 24fps can feel
           | premium and "filmic."
           | 
           | Ugh. I will never understand the obsession with this effect.
           | There is no such thing as a "soap opera effect", as people
           | like to call it, only a slideshow effect.
           | 
           | The history behind this is purely a series of cost-cutting
           | measures entirely unrelated to the user experience or
           | artistic qualities. 24 fps came to be because audio was
           | slapped onto the film, and was the slowest speed where the
           | audio track was acceptably intelligible, saving costly film
           | stock - the _sole_ priority at the time. Before that, we used
           | to record content at variable frame rates but play it back at
           | 30-40 fps.
           | 
           | We're clinging on to a cost-cutting measure that was a
           | significant compromise from the time of hand cranked film
           | recording.
           | 
           | </fist-shaking rant>
        
             | iamacyborg wrote:
             | > Look in front of you and only a single plane will be in
             | focus (and only your fovea produces any sort of
             | legibility). Look towards a bright light and you might get
             | flaring from just your eyes. Stare out the side of a car or
             | train when driving at speed and you'll see motion blur,
             | interrupted only by brief clarity if you intentionally try
             | to follow the motion with your eyes.
             | 
             | The problem is the mismatch between what you're looking at
             | on the screen and what the in-game camera is looking at. If
             | these were synchronised perfectly it wouldn't be a problem.
        
               | arghwhat wrote:
               | Indeed - I also mentioned that in the paragraph
               | immediately following.
        
               | iamacyborg wrote:
               | Derp
        
         | 7bit wrote:
         | Ok. I like depth of field and prefer it.
        
         | kookamamie wrote:
         | HDR, not "HDR", is the biggest leap in gaming visuals made in
         | the last 10 years, I think.
         | 
         | Sure, you need a good HDR-capable display and a native HDR-game
         | (or RTX HDR), but the results are pretty awesome.
        
         | baxuz wrote:
         | These effects are for the artistic intent of the game. Same
         | goes for movies, and has nothing to do with "second-hand movies
         | recorded on flawed cameras" or with "realism" in the sense of
         | how we perceive the world.
        
         | robertlagrant wrote:
         | The most egregious example is 3D. Only one thing is in focus,
         | even though the scene is stereoscopic. It makes no sense
         | visually.
        
           | bmurphy1976 wrote:
           | Hell yeah, this is one of many issues I had with the first
           | Avatar movie. The movie was so filled with cool things to
           | look at but none of it was in focus. 10 minutes in I had had
           | enough and was ready for a more traditional movie experience.
           | Impressive yes, for 10 minutes, then exhausting.
        
             | robertlagrant wrote:
             | Hah - Avatar was exactly what I was thinking of.
        
             | jedbrooke wrote:
             | this thread is helping me understand why I always thought
             | 3D movies looked _less_ 3D than 2D movies.
             | 
             | That and after seeing Avatar 1 in 3D, then seeing Avatar 2
             | in 3D over 10 years later and not really noticing any
             | improvement in the 3D made me declare 3D movies officially
             | dead (though I haven't done side by side comparisons)
        
       | Ericson2314 wrote:
       | Half-Life 2: Lost Coast was exciting.
       | 
       | Glad all this "Instagram influencers searing eyeballs with bright
       | whites" is news to me. All I know about is QR code mode doing
       | that.
        
       | sn0n wrote:
       | Tldr; lots of vivid colors.
        
       | hfgjbcgjbvg wrote:
       | Digital camera technology is so fascinating.
        
       | shrx wrote:
       | Question to the author: Why are the images in the "Solution 2:
       | Genuine HDR Displays" section actually videos? E.g.
       | <video src="https://www.lux.camera/content/media/2025/03/new-
       | york-skyline-hdr.mp4"
       | poster="https://img.spacergif.org/v1/4032x3024/0a/spacer.png"
       | width="4032" height="3024" loop="" autoplay="" muted=""
       | playsinline="" preload="metadata" style="background: transparent
       | url('https://www.lux.camera/content/media/2025/03/new-york-
       | skyline-hdr_thumb.jpg') 50% 50% / cover no-repeat;"></video>
        
         | Dylan16807 wrote:
         | Your question is answered in the article.
         | 
         | Look for the word video.
        
           | shrx wrote:
           | Yes but I'm interested in the technical reasons.
        
             | Dylan16807 wrote:
             | Safari doesn't support HDR images and there's no
             | particularly good reason for it.
        
       ___________________________________________________________________
       (page generated 2025-05-15 23:02 UTC)