[HN Gopher] Why does the chromaticity diagram look like that?
       ___________________________________________________________________
        
       Why does the chromaticity diagram look like that?
        
       Author : samwillis
       Score  : 98 points
       Date   : 2024-07-26 18:19 UTC (4 hours ago)
        
 (HTM) web link (jlongster.com)
 (TXT) w3m dump (jlongster.com)
        
       | SirMaster wrote:
       | It's probably good to start with XYZ, but we have much better
       | colorspaces now that do a better job at correlating with our
       | vision.
       | 
        | Mainly CIE 1976 L*u*v*, and even more recently ICtCp from Dolby
        | research.
        
         | refulgentis wrote:
         | CAM-16. When in doubt, ask the color scientists :)
        
           | anon2345252 wrote:
           | Oklab is even better.
           | 
           | https://bottosson.github.io/posts/oklab/
        
             | refulgentis wrote:
             | No, it's not, by definition. It's one matrix multiplication
             | to do an approximation of it. More here:
             | https://news.ycombinator.com/item?id=41081832
             | 
              | The only claim to superiority it makes is gradients, and
              | that's a category error: they blend polar-opposite hues in
              | Cartesian space (i.e. x/y/z), rather than polar space
              | (i.e. h/s/l). Opposite hues mean lerp'ing brings the result
              | through the center of the circle, at 0 saturation. Thus,
              | blue and yellow _do_ combine to an off-white. Needing to
              | engineer around that indicates something is fundamentally
              | off, not that it is _better_. I don't ascribe ill intent,
              | but I do worry very much about how widely this is
              | misunderstood.
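              | 
              | A minimal sketch of that blending point (Python, purely
              | illustrative; made-up chroma and hue values): lerp'ing two
              | opposite hues componentwise in an (a, b)-style Cartesian
              | plane collapses chroma at the midpoint, while interpolating
              | in polar hue/chroma keeps it constant.
              | 
              |   import math
              | 
              |   # Two hues 180 degrees apart at equal chroma C.
              |   C = 0.15
              |   h1, h2 = math.radians(110), math.radians(290)
              |   a1, b1 = C * math.cos(h1), C * math.sin(h1)
              |   a2, b2 = C * math.cos(h2), C * math.sin(h2)
              | 
              |   # Cartesian midpoint: chroma collapses toward 0 (gray).
              |   am, bm = (a1 + a2) / 2, (b1 + b2) / 2
              |   print(math.hypot(am, bm))   # ~0.0
              | 
              |   # Polar midpoint: chroma stays at C, only hue changes.
              |   hm = (h1 + h2) / 2
              |   print(math.hypot(C * math.cos(hm), C * math.sin(hm)))  # C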
        
         | contravariant wrote:
         | The xyY colour space is designed such that the colours of light
         | you get by blending two points all lie on the line between the
         | two corresponding points. This makes it extremely helpful when
         | you want to figure out which colours you can make with a
         | particular set of primaries. Similarly you can draw the colours
          | corresponding to pure wavelengths and figure out the entire
         | space of physically possible colours by taking its convex
         | closure.
         | 
         | These features are not really replicable in any other colour
         | space, at best you can use a linear transformation of it (which
         | XYZ already is, and it has almost all properties you could want
         | of a choice of basis).
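          | 
          | A small sketch of that linearity (Python; arbitrary XYZ values,
          | purely illustrative): the chromaticity of an additive mix of
          | two lights is a convex combination of their chromaticities,
          | weighted by X+Y+Z.
          | 
          |   import numpy as np
          | 
          |   def chromaticity(XYZ):
          |       X, Y, Z = XYZ
          |       s = X + Y + Z
          |       return np.array([X / s, Y / s])   # (x, y)
          | 
          |   A = np.array([20.0, 30.0, 10.0])   # light 1 (made up)
          |   B = np.array([50.0, 40.0, 90.0])   # light 2 (made up)
          |   mix = A + B                         # additive mixture
          | 
          |   xyA, xyB, xyM = chromaticity(A), chromaticity(B), chromaticity(mix)
          | 
          |   # The mixture lies on the segment xyA-xyB, at parameter
          |   # t = sum(B) / (sum(A) + sum(B)).
          |   t = B.sum() / (A.sum() + B.sum())
          |   print(np.allclose(xyM, (1 - t) * xyA + t * xyB))   # True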
        
           | JKCalhoun wrote:
           | That's true. And I will add that the viewable color gamut of
           | a display can be depicted with a simple triangle on the xyY
           | plot. All you need to know are the three chromaticity values
            | for the red, green, and blue phosphors -- they make up the
           | three corners of the triangle.
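            | 
            | For example, a rough point-in-triangle check (Python; using
            | the sRGB/Rec.709 primary chromaticities) for whether a given
            | chromaticity is displayable on such a device:
            | 
            |   # sRGB / Rec.709 primary chromaticities (x, y)
            |   R, G, B = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)
            | 
            |   def in_gamut(p, a=R, b=G, c=B):
            |       """True if chromaticity p lies inside triangle a-b-c."""
            |       def cross(o, u, v):   # z of (u - o) x (v - o)
            |           return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
            |       d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
            |       return ((d1 >= 0 and d2 >= 0 and d3 >= 0) or
            |               (d1 <= 0 and d2 <= 0 and d3 <= 0))
            | 
            |   print(in_gamut((0.3127, 0.3290)))   # D65 white point: True
            |   print(in_gamut((0.08, 0.85)))       # spectral green: False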
        
       | hoherd wrote:
       | Here's another really interesting exploration of color spaces
       | https://ericportis.com/posts/2024/okay-color-spaces/
        
         | kurthr wrote:
          | I prefer this. The failure to discuss Lab and OKLab in the main
         | link is quite odd.
         | 
         | Also, I'd mention to those who think that violet/magenta aren't
         | "real" colors that the red X decays more slowly than the blue Z
         | at short wavelengths so you can get saturated violet/magenta
         | single wavelength colors (not well represented on the standard
         | chroma charts) below 400nm at high power. Of course they aren't
         | efficient for monitors (even blue isn't) and they're dangerous
         | to look at for any length of time. But if you see a (single
         | wavelength) violet/magenta laser, it's time to look away or
         | shut your eyes.
        
           | jlongster wrote:
            | Yeah, that article is better. I'm the author, and I wrote
            | this only for myself as I studied the topic; it's not great
            | as a way to explain it to others.
            | 
            | I wanted to start from the very beginning, and as far as I
            | know Lab and OKLab didn't come until later. Studying the 1931
            | experiments and such was a start, and I wanted to later bring
            | up all the other things we've learned since then, but I
            | haven't had time to write more about it yet.
        
       | PaulHoule wrote:
       | That particular version of the chromaticity diagram makes it look
       | like the colors missing from your display are various shades of
       | laser pointer green as opposed to all the shades of red and blue
       | that are missing because really saturated red and blue primaries
       | are too dim (per unit of energy) to use.
       | 
       | See https://nanosys.com/blog-archive/2012/08/14/color-space-
       | conf...
       | 
       | I learned a lot more about color management than I wanted to know
        | in the process of making red-cyan stereograms because I found
       | when I asked for sRGB red I was getting something like
       | (180,16,16) on my high gamut monitor which resulted in serious
       | crosstalk between the channels.
       | 
       | Right now I am working with a seamstress friend on custom printed
       | fabrics and I have a flower print where yellow somehow turned to
       | orange in the midst of processing the image and I want to get it
       | debugged and thoroughly proofed before I send out the order... I
       | am still learning more than I want to know about color
       | management.
        
         | ThrowawayTestr wrote:
         | >Right now I am working with a seamstress friend on custom
         | printed fabrics and I have a flower print where yellow somehow
         | turned to orange in the midst of processing the image
         | 
         | That's why Pantone makes so much money.
        
         | derefr wrote:
         | > as opposed to all the shades of red and blue that are missing
         | because really saturated red and blue primaries are too dim
         | (per unit of energy) to use.
         | 
         | Would another way to put that be, that the chromaticity diagram
         | _could_ keep going southeastward (i.e. the XYZ color-space
         | _could_ have the X and Z activation functions extended leftward
         | and rightward), but due to the frequencies continuing on the
          | spectral line, that area of the diagram would necessarily be
          | made mostly of infrared and ultraviolet frequencies that we
          | can't see?
        
       | refulgentis wrote:
       | TL;DR: Imagine color space has 3 dimensions in polar coordinates.
       | 
       | - hue, the angle. your familiar red, orange, yellow, green,
       | blue...
       | 
       | - saturation/chroma, radial distance from center. intensity of
       | the pigment
       | 
       | - lightness, top to bottom, white to black
       | 
       | The XY diagram shows 3d color space, from the top, in XYZ.
       | 
        | XYZ is a particular color space, standardized by the CIE in
        | 1931.
       | 
       | There is no privileged "correct" color space, they're developed
       | based on A/B tests and intuition by color scientists.
       | 
       | However, there are _more_ correct color spaces, in that color
       | science matters and is a real field. Commonly agreed state of the
       | art is CAM16.
       | 
        | It's a significant mistake that Oklab is the first space with
        | significant mindshare since HSL; it was a quick hack by an
        | ex-game developer to make something akin to CAM16 with just one
        | matrix multiplication.
       | 
        | CAM16 conversions involve significantly more than one matrix
        | multiplication. But it's ~400 lines of code, and you can do
        | millions of conversions a second on modern hardware.
       | 
        | Oklab's lightness scale is _way_ off from scientific color spaces,
       | and thus it can't be used to create simple rules like "40 delta
       | L* ~= 3.0 contrast ratio, 50 delta L* ~= 4.5 contrast ratio".
       | Instead you're still manually plugging colors into a contrast
       | checker :(
       | 
        | Then again, it's still a step forward. It's even more maddening
        | that HSL was used for so long: it's absolutely absurd, e.g.
        | lightness = the average of the highest and lowest RGB components.
        | Great for demo hacks in 1976, not so great in 2016.
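        | 
        | To illustrate the HSL point (a toy Python sketch, treating RGB as
        | values in 0..1): HSL calls pure blue and pure yellow equally
        | "light", while a luminance-weighted measure does not.
        | 
        |   def hsl_lightness(r, g, b):
        |       # HSL: average of the largest and smallest components
        |       return (max(r, g, b) + min(r, g, b)) / 2
        | 
        |   def rel_luminance(r, g, b):
        |       # Rec. 709 luminance weights (applied to linear values)
        |       return 0.2126 * r + 0.7152 * g + 0.0722 * b
        | 
        |   print(hsl_lightness(0, 0, 1), hsl_lightness(1, 1, 0))  # 0.5, 0.5
        |   print(rel_luminance(0, 0, 1), rel_luminance(1, 1, 0))  # ~0.07, ~0.93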
        
       | klysm wrote:
        | I think a good explanation of color spaces might start at a
        | camera sensor with a Bayer array and how that's processed.
        
         | avidiax wrote:
         | I'm not so sure.
         | 
         | I think it's really important to understand spectral colors and
         | metamerism.
         | 
         | If you start at the Bayer array, you are going to have an odd
          | discussion about how the Bayer filters have spectral transfer
         | functions that aren't directly related to the cones in our
         | eyes, nor any color space's primaries, etc.
         | 
         | It's going in the deep end.
        
         | contravariant wrote:
          | Starting with the function of the human eye seems like a good
          | choice; I mean, how else would you explain why a Bayer matrix
          | only has 3 colours?
        
           | JKCalhoun wrote:
           | Or why there are two green pixels in the Bayer pattern for
           | every red and blue...
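            | 
            | E.g., a tiny sketch (Python) of tiling the standard RGGB
            | Bayer pattern - half of all photosites are green, roughly
            | matching where the eye is most sensitive to luminance detail:
            | 
            |   tile = [["R", "G"],
            |           ["G", "B"]]
            |   mosaic = [[tile[r % 2][c % 2] for c in range(8)]
            |             for r in range(4)]
            |   for row in mosaic:
            |       print(" ".join(row))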
        
       | radicality wrote:
       | Kinda related, but does someone maybe have a good set of links to
       | help understand what HDR actually is? Whenever I tried in the
       | past, I always got lost and none of it was intuitive.
       | 
        | There are so many concepts there, like: color spaces, transfer
       | functions, HDR vs Apple's XDR HDR, HLG vs Dolby Vision, mastering
       | displays, max brightness vs peak brightness, all the different
       | hdr monitor certification levels, 8 bit vs 10bit, "full" vs
       | "video" levels when recording video etc etc.
       | 
        | Example use case - I want to play iPhone-recorded videos using
        | mpv on my MacBook. There are hundreds of knobs to set, and while
        | I can muck around with them and get it looking close-ish to how
        | the file looks when played in QuickTime/Finder, I still have no
        | idea what any of these settings are doing.
        
         | wongarsu wrote:
         | HDR is whatever marketing wants it to be.
         | 
          | Originally it's just about being able to show both really dark
          | and really bright colors. That's easy if each pixel is an
          | individual LED, but very hard in LCD monitors, where there is
          | one big backlight and the pixels are just dimmable filters in
          | front of it. Or, on the sensor side, it's the ability to
          | capture really bright and really dark spots in the same shot,
          | something our sensors are much worse at than our eyes, though
          | you can pull some tricks.
         | 
         | Once you have that ability you notice that 8 bits of brightness
         | information isn't that much. So you go with 10 bit or 16 bits.
          | Your gamma settings also play a role (the transfer function
          | that turns linear light values into nonlinear code values).
         | 
         | And of course the people who care about HDR have a big overlap
         | with people who care about colors, so that's where your color
         | spaces, certifying and calibrating monitors to match those
         | color spaces etc comes in. It's really adjacent but often just
         | rolled in for convenience.
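          | 
          | A small sketch (Python; using the standard sRGB transfer
          | function as the example) of why the nonlinear encoding matters
          | for bit depth: near black, adjacent 8-bit codes correspond to
          | much finer steps in linear light when gamma-encoded than when
          | stored linearly.
          | 
          |   def srgb_decode(v):
          |       # nonlinear sRGB code value (0..1) -> linear light
          |       if v <= 0.04045:
          |           return v / 12.92
          |       return ((v + 0.055) / 1.055) ** 2.4
          | 
          |   # Step in linear light between 8-bit codes 1 and 2 near black
          |   linear_step  = 2 / 255 - 1 / 255
          |   encoded_step = srgb_decode(2 / 255) - srgb_decode(1 / 255)
          |   print(linear_step, encoded_step)   # gamma-coded step is ~13x finer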
        
           | radicality wrote:
           | More bits to store more color/brightness etc makes sense.
           | 
           | I think my main confusion has usually been that it all feels
           | like some kind of a... hack? Suppose I set my macbook screen
           | to max brightness, and then open up a normal "white" png.
           | Looks fine, and you would think "well, the display is at max
           | brightness, and the png is filled with white", so a fair
            | conclusion would be that's the whitest/brightest the screen
            | goes. But then you open another png, this one with a more
            | special "whiter white", and suddenly you see your screen can
            | actually go brighter! So you get thoughts like "why is this white
           | brighter", "how do I trigger it", "what are the actual limits
           | of my screen", "is this all some separate hacky code path",
           | "how come I only see it in images/videos, and not UI
           | elements", "is it possible to make a native Mac ui with that
           | brightness".
           | 
           | In any case, thanks for the answer. I might be overthinking
           | it and there's probably lots of historical/legacy reasons for
           | the way things are with hdr.
        
             | wongarsu wrote:
             | > there's probably lots of historical/legacy reasons for
             | the way things are with hdr
             | 
             | That's pretty much it. If you use a HDR TV it will usually
             | work like you describe. It would display the same white for
             | a normal white PNG and an "even whiter" "HDR" PNG.
             | 
             | Apple's decision makes sense if you imagine SDR (so not-
              | HDR) images as HDR images clipped to some SDR range in
             | the middle of the HDR range (leading to lots of over- and
             | underexposure in the SDR image). If you then show them
             | side-by-side of course the whitest white in the HDR range
             | is whiter than the whitest white in the SDR image. Of
             | course that's a crude simplification of how images work,
             | but it makes for a great demo: HDR images really pop and
             | look visually better. If you stretched everything to the
             | same brightness range the HDR images wouldn't be nearly as
             | impressive, just more detail and less color banding. The
             | marketing people wouldn't like that
        
             | crazygringo wrote:
             | It is all extremely hacky.
             | 
             | Because HDR allows us to encode brightnesses that virtually
             | no consumer displays can display.
             | 
             | And so deciding how to display those on any given display,
             | on a given OS, in a given app, is making whatever "hacky"
             | and totally non-standardized tradeoffs the display+OS+app
             | decide to make. And they're all different.
             | 
             | It's a complete mess. I'm strongly of the opinion that HDR
             | made a fundamental mistake in trying to design for "ideal"
             | hardware that nobody has, and then leaving "degraded"
             | operation to be implementation-specific.
             | 
             | It's a complete design failure that playing HDR content on
             | different apps/devices results in output that is often too
             | dark and often has a telltale green tint. It's ironic that
             | in practice, something meant to enable higher brightness
             | and greater color accuracy has resulted in darker images
             | and color that varies from slightly wrong to totally wrong.
        
             | duskwuff wrote:
             | > So you get thoughts like [...] "what are the actual
             | limits of my screen" [...]
             | 
             | Some of the limitations, at least in Apple's displays, are
             | thermal! The backlight cannot run at full brightness
             | continuously across the full display; it can only hit its
             | peak brightness (1600 nits) in a small area, or for a short
             | time.
        
       | meindnoch wrote:
       | Mostly correct, but I don't understand what the author is trying
       | to do in the last section, where they try to fill the locus by
       | generating spectra with two peaks and projecting it into the
       | chromaticity diagram. Why do it like that?
       | 
       | This is how you should do it:
       | 
       | - You pick a Y value. This is going to be the luminance of your
       | diagram.
       | 
       | - For each pixel inside the area bounded by the spectral locus
       | (and the line of purples - the line connecting the two endpoints
       | of the locus) you take its x, y coordinates.
       | 
       | - Together these 3 values specify your color in the CIE xyY color
       | space. Converting from xyY to XYZ is trivial: X = Y / y * x, Y =
       | Y, Z = Y / y * (1 - x - y)
       | 
       | - You map these XYZ values into your output image's color space
       | (e.g. sRGB). If a given XYZ value maps outside the [0,1] interval
       | in sRGB, then it's outside the sRGB gamut, and you may clip the
       | values to the closest valid value inside the gamut.
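        | 
        | A rough sketch of those steps in Python (the XYZ-to-linear-sRGB
        | matrix below uses the commonly cited approximate coefficients;
        | treat the whole thing as illustrative):
        | 
        |   import numpy as np
        | 
        |   # XYZ (D65) -> linear sRGB, approximate coefficients
        |   XYZ_TO_SRGB = np.array([
        |       [ 3.2406, -1.5372, -0.4986],
        |       [-0.9689,  1.8758,  0.0415],
        |       [ 0.0557, -0.2040,  1.0570],
        |   ])
        | 
        |   def xyY_to_sRGB(x, y, Y=0.5):
        |       X = Y / y * x                       # xyY -> XYZ
        |       Z = Y / y * (1 - x - y)
        |       rgb_lin = XYZ_TO_SRGB @ np.array([X, Y, Z])
        |       out_of_gamut = (rgb_lin < 0).any() or (rgb_lin > 1).any()
        |       rgb_lin = np.clip(rgb_lin, 0, 1)    # crude gamut clipping
        |       rgb = np.where(rgb_lin <= 0.0031308,          # gamma encode
        |                      12.92 * rgb_lin,
        |                      1.055 * rgb_lin ** (1 / 2.4) - 0.055)
        |       return rgb, out_of_gamut
        | 
        |   print(xyY_to_sRGB(0.3127, 0.3290))   # ~D65 white: channels equal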
        
       | _wire_ wrote:
       | This article illustrates the theory and math that lead to the
       | horseshoe diagram in a very approachable style that is as simple
       | as possible without being too simple.
       | 
       | A Beginner's Guide to (CIE) Colorimetry -- Chandler Abraham
       | 
       | https://medium.com/hipster-color-science/a-beginners-guide-t...
        
       | GrantMoyer wrote:
       | In my opinion, plotting chromaticity on a Cartesian grid -- by
       | far the most common way -- is pretty misleading, since
       | chromaticity diagrams use barycentric coordinates (and to be
       | clear, I blame the institution, not the author). The effect is
       | that the shape of the gamut looks skewed, but only because of how
       | it's plotted; the weird skewedness of a typical XYZ chromaticity
       | diagram doesn't represent anything real about the data.
       | 
       | Instead, a chromaticity diagram is better thought of as a 2D
       | planar slice of a 3D color space, specifically the slice through
       | all three standard unit vectors. From this conception, it's much
       | more natural to plot a chromaticity diagram in an equilateral
       | triangle, such as the diagram at [1]. A plot in a triangle makes
       | it clear, for instance, that the full color gamut in XYZ space
       | isn't some arbitrary, weird, squished shape, but instead was
       | intentionally chosen in a way that fills the positive octant
       | pretty well given the constraints of human vision.
       | 
       | [1]: https://physics.stackexchange.com/questions/777501/why-is-
       | th...
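        | 
        | A short sketch (Python, illustrative) of what plotting in that
        | triangle means: treat (x, y, z) with z = 1 - x - y as barycentric
        | weights over the vertices of an equilateral triangle, one vertex
        | per (imaginary) XYZ primary.
        | 
        |   import math
        | 
        |   # One vertex per XYZ basis vector: X, Y, Z respectively
        |   V = {"X": (0.0, 0.0), "Y": (1.0, 0.0),
        |        "Z": (0.5, math.sqrt(3) / 2)}
        | 
        |   def triangle_coords(x, y):
        |       z = 1 - x - y                    # third barycentric weight
        |       px = x * V["X"][0] + y * V["Y"][0] + z * V["Z"][0]
        |       py = x * V["X"][1] + y * V["Y"][1] + z * V["Z"][1]
        |       return px, py
        | 
        |   print(triangle_coords(1/3, 1/3))   # equal-energy point: centroid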
        
       | VanillaCafe wrote:
       | I thought this might be a useful article because I've often had a
       | similar question. But there's a diagram that has text:
       | 
       | > _More simply put: imagine that you have red, green, and blue
       | light sources. What is the intensity of each one so that the
       | resulting light matches a specific color on the spectrum?_
       | 
       | > _..._
       | 
       | > _The CIE 1931 color space defines these RGB color matching
       | functions. The red, green, and blue lines represent the intensity
       | of each RGB light source:_
       | 
       | This seems very oddly phrased to me. I would presume that what
       | that chart is actually showing is the response for each color of
       | cone in the human eye?
       | 
       | In which case it's not a question of "intensity of the light
       | source" but more like "the visual response across different
        | wavelengths of an otherwise uniform intensity light source"?
       | 
       | ... fwiw, I'm not trying to be pedantic, just trying to see if
       | I'm missing the point or not.
        
         | jlongster wrote:
         | I'm the author of the article and the intensity is referring to
         | the level of the light source used in the study to generate the
         | data. See the study explained here: https://medium.com/hipster-
         | color-science/a-beginners-guide-t...
         | 
          | but you're right, the intensity needed of each of the R, G,
          | and B light sources to produce the correct color is directly
          | related
         | to how our eyes perceive each of those sources, so yes you are
         | correct
        
         | GrantMoyer wrote:
         | The wording on the article is correct, despite being confusing.
         | The CIE 1931 RGB primaries each stimulate multiple types of
         | cone in human eyes, so the RGB Color Matching Functions (CMFs)
         | don't represent cone stimulations.
         | 
          | However, the CMFs for LMS space[1] _do_ directly represent cone
          | stimulations. Like the CIE RGB CMFs, the LMS CMFs can also be
          | thought of as the intensities of three primary colors required
          | to reproduce the color of a given spectrum. However, unlike the
          | CIE RGB primaries, lights that stimulate only one type of cone
          | do not physically exist.
         | 
         | Finally, CIE RGB and LMS space are linear transformations of
         | each other, so the CIE RGB CMFs are linear combinations of the
         | LMS CMFs. I often find it easiest to reason about these color
         | spaces in terms of LMS space, since it's the most physically
         | straightforward.
         | 
         | [1]: https://en.m.wikipedia.org/wiki/LMS_color_space
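          | 
          | On the "linear transformations of each other" point, a hedged
          | sketch (Python) using the often-quoted Hunt-Pointer-Estevez
          | XYZ-to-LMS matrix (one of several normalizations in use; values
          | approximate):
          | 
          |   import numpy as np
          | 
          |   # Hunt-Pointer-Estevez XYZ -> LMS (equal-energy normalization)
          |   XYZ_TO_LMS = np.array([
          |       [ 0.38971, 0.68898, -0.07868],
          |       [-0.22981, 1.18340,  0.04641],
          |       [ 0.00000, 0.00000,  1.00000],
          |   ])
          |   LMS_TO_XYZ = np.linalg.inv(XYZ_TO_LMS)
          | 
          |   XYZ = np.array([0.9505, 1.0000, 1.0890])   # roughly D65 white
          |   LMS = XYZ_TO_LMS @ XYZ
          |   print(LMS)                                  # cone coordinates
          |   print(np.allclose(LMS_TO_XYZ @ LMS, XYZ))   # round-trips: True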
        
       | carlosjobim wrote:
       | I think the explanation is simple: Color is light and it is
       | linear going from ultraviolet to blue to green to yellow to red
       | to infrared. It's just a line.
       | 
       | In physical reality, there exists no purple light. Our minds make
       | up all the shades of purple and magenta between blue and red when
       | our eyes receive both red and blue light.
       | 
       | So in order to include the magentas, you need to draw another
       | line between blue and red. Meaning you have to bend the real
       | color line. And that's what we see in the chromaticity diagram.
        
       ___________________________________________________________________
       (page generated 2024-07-26 23:01 UTC)