[HN Gopher] Dithering in Colour
___________________________________________________________________
Dithering in Colour
Author : surprisetalk
Score : 138 points
Date : 2025-03-09 23:28 UTC (3 days ago)
(HTM) web link (obrhubr.org)
(TXT) w3m dump (obrhubr.org)
| Clamchop wrote:
| They may not want to imply that didder's linearized rabbit is
| wrong, but I'm comfortable saying so. It's not just a little
| dark, it's way dark, to the point of hiding detail.
|
| The linearized RGB palette is similarly awful. It clobbers a
| whole swath of colors, rendering them as nearly black. Purples
| are particularly brutalized. Yellows disappeared and became
| white.
|
| On my phone, the middle palette doesn't appear too bright to my
| eyes, either.
|
| Even the linearized gradient looks worse.
|
| Maybe linear is not best for perceptual accuracy.
| obrhubr wrote:
| Thanks for your comment! I'm glad you're seeing the same thing
| :) I re-implemented the linearised dithering in python and got
| similar results. I checked and rechecked the colour profiles in
| GIMP, nothing... At this point I can only hope for an expert to
| appear and tell me what exactly I am doing wrong.
| mkesper wrote:
| Did you try any of the OKLab color space implementations for the
| calculations? https://bottosson.github.io/posts/oklab/
| obrhubr wrote:
| I'll try them as soon as I get the chance; I have
| perceptual luminance implemented already. I'll compare :)
| ack_complete wrote:
| It looks like the images on your blog might have gone through
| a non-gamma-corrected scaler. The linear images produced by the
| program look correct; they broadly match the original image in
| Krita when scaled in scRGB linear 32-bit float.
| bazzargh wrote:
| I got better results just dithering the rgb channels
| separately (so effectively an 8 colour palette, black, white,
| rgb, yellow, cyan, magenta). In p5js:
|
|     var img
|     var pixel
|     var threshold
|     var error = [0, 0, 0]
|     var a0
|
|     function preload() {
|       img = loadImage("https://upload.wikimedia.org/wikipedia/commons/thumb/4/44/Albrecht_D%C3%BCrer_-_Hare%2C_1502_-_Google_Art_Project.jpg/1920px-Albrecht_D%C3%BCrer_-_Hare%2C_1502_-_Google_Art_Project.jpg")
|     }
|
|     function setup() {
|       // I'm just using a low discrepancy sequence for a quasirandom
|       // dither and diffusing the error to the right, because it's
|       // trivial to implement
|       a0 = 1/sqrt(5)
|       pixelDensity(2)
|       createCanvas(400, 400);
|       image(img, 0, 0, 400, 400)
|       loadPixels()
|       pixel = 0
|       threshold = 0
|     }
|
|     function draw() {
|       if (pixel > 400*400*16) { return }
|       for (var i = 0; i < 2000; i++) {
|         threshold = (threshold + a0) % 1
|         for (var j = 0; j < 3; j++) {
|           var c = pixels[pixel + j]
|           pixels[pixel + j] = c + error[j] > threshold * 255 ? 255 : 0
|           error[j] += c - pixels[pixel + j]
|         }
|         pixel += 4
|       }
|       updatePixels()
|     }
|
| Of course this isn't trying to pick the closest colour in the
| palette as you're doing - it's just trying to end up with the
| same intensity of rgb as the original image. It does make me
| wonder if you should be using the manhattan distance instead
| of euclidean, to get the errors to add correctly.
| Sesse__ wrote:
| For perceptual color difference, there are much better metrics
| than "distance in linear RGB". CIE defines a metric called ΔE*
| (with several revisions), for instance.
|
| I don't know if they actually do well in dithering, though. My
| experience with dithering is that it actually works better in
| gamma space than trying to linearize anything, since the
| quantization is fundamentally after gamma.
| nextts wrote:
| > We have just committed a mortal sin of image processing. I
| didn't notice it, you might not have noticed either, but
| colour-space enthusiasts will be knocking on your door shortly.
| badmintonbaseba wrote:
| I agree. I think the problem is a banal missing color
| transformation somewhere in the pipeline, like converting the
| palette and image to linear colorspace, doing the dithering
| there and mistakenly writing the linear color values instead of
| sRGB color values into the image.
|
| Others suggest that the error is using the wrong metric for
| choosing the closest color, but I disagree. That wouldn't cause
| such drastic, systematic darkening, as the palette is probably
| still pretty dense in the RGB cube.
|
| Where linearisation really matters is the arithmetic for the
| error diffusion: you definitely want to diffuse the error in a
| linear colorspace. You are free to choose a good perceptual
| space for picking the closest color at each pixel, but calculate
| the error in a linear space.
|
| Visual perception is weird. But when you squint your eyes to
| blur the image, you are definitely mixing in a linear
| colorspace, as that's physical mixing of light intensities
| before the light even reaches your retina. So you have to match
| that when diffusing the error.
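|
| To make that concrete, here's a minimal Python sketch of the split
| I mean (not the article's code; the sRGB transfer function and
| Floyd-Steinberg weights are standard, everything else is
| illustrative):
|
|     import numpy as np
|
|     def srgb_to_linear(c):
|         c = np.asarray(c, dtype=np.float64)
|         return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
|
|     def dither(img_srgb, palette_srgb):
|         # img_srgb: HxWx3 floats in [0,1]; palette_srgb: Nx3 floats in [0,1]
|         img_lin = srgb_to_linear(img_srgb)
|         pal_lin = srgb_to_linear(palette_srgb)
|         h, w, _ = img_lin.shape
|         out = np.zeros((h, w), dtype=np.int64)
|         for y in range(h):
|             for x in range(w):
|                 target = img_lin[y, x]
|                 # Nearest palette entry. This comparison could just as well
|                 # happen in a perceptual space (OKLab, Lab, ...); linear is
|                 # used here only to keep the sketch short.
|                 idx = int(np.argmin(((pal_lin - target) ** 2).sum(axis=1)))
|                 out[y, x] = idx
|                 # The error is computed and diffused in linear light.
|                 err = target - pal_lin[idx]
|                 if x + 1 < w:
|                     img_lin[y, x + 1] += err * 7 / 16
|                 if y + 1 < h:
|                     if x > 0:
|                         img_lin[y + 1, x - 1] += err * 3 / 16
|                     img_lin[y + 1, x] += err * 5 / 16
|                     if x + 1 < w:
|                         img_lin[y + 1, x + 1] += err * 1 / 16
|         # Write the sRGB-encoded palette colours back out -- writing
|         # pal_lin here instead is exactly the "forgot to convert back"
|         # mistake described above.
|         return np.asarray(palette_srgb, dtype=np.float64)[out]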
|
| edit:
|
| It also doesn't help that most (all?) browsers do color mixing
| wrong when images are scaled, so if you don't view the dithered
| images at 100% zoom without DPI scaling, you might get
| significantly distorted colors from that too.
|
| edit2:
|
| For comparison this is what imageworsener does:
|
| https://imgur.com/a/XmJKQnz
|
| You really need to open the image in a viewer where each image
| pixel is exactly one device pixel large, otherwise the color
| arithmetic used for scaling by viewers is of variable quality
| (often very poor).
| contravariant wrote:
| The linearized gradient does look off, but not because it is
| linearized. It is simply wrong.
|
| The dithered gradient shouldn't be pure black halfway through.
| Const-me wrote:
| Yeah, every time I see articles about the importance of linear
| color space for gradients and look at the images in them, I
| observe the opposite of what the text claims: gradients in the
| sRGB color space look better.
|
| I have a suspicion that might be because I usually buy
| designer-targeted wide gamut IPS displays. I also set up low
| brightness on them, e.g. right now I'm looking at BenQ PD2700U
| display with brightness 10/100 and contrast 50/100. However,
| sRGB color space was developed decades ago for CRT displays.
| obrhubr wrote:
| Your monitor and your browser 100% affect the appearance.
| After calibrating your monitor, try opening the image in full
| resolution and take a few steps back.
|
| For me, viewing the images on my phone makes them look off.
| yapyap wrote:
| Dithering is so neat.
| oniony wrote:
| Print halftoning is interesting too
| https://photo.stackexchange.com/questions/5779/what-is-the-d...
| mattdesl wrote:
| It might be worth using a lightness estimate like OKLab,
| OKLrab[1], or CIE Lab instead of the RGB luminance weighting, as
| it should produce a more perceptually accurate result.
|
| The other issue with your code right now is that it uses
| euclidean distance in RGB space to choose the nearest color; it
| would probably be more accurate to use a perceptual color
| difference metric. A very simple choice is euclidean distance on
| OKLab colors.
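|
| For anyone wanting to try it, a rough Python sketch of that
| distance (the conversion constants are copied from the OKLab post
| linked below; treat the snippet as illustrative, not as the
| author's code):
|
|     def srgb_to_oklab(r, g, b):
|         # sRGB in [0,1] -> linear light
|         def lin(c):
|             return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
|         r, g, b = lin(r), lin(g), lin(b)
|         # linear sRGB -> LMS -> OKLab (matrices from the OKLab article)
|         l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
|         m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
|         s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
|         l, m, s = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
|         return (0.2104542553 * l + 0.7936177850 * m - 0.0040720468 * s,  # L
|                 1.9779984951 * l - 2.4285922050 * m + 0.4505937099 * s,  # a
|                 0.0259040371 * l + 0.7827717662 * m - 0.8086757660 * s)  # b
|
|     def nearest_in_palette(pixel, palette):
|         # palette: list of (r, g, b) tuples in [0,1]
|         target = srgb_to_oklab(*pixel)
|         return min(palette, key=lambda c: sum(
|             (a - b) ** 2 for a, b in zip(srgb_to_oklab(*c), target)))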
|
| I think dithering is a pretty interesting area of exploration,
| especially as a lot of the popular dithering algorithms are quite
| old and optimized for ancient compute requirements. It would be
| nice to see some dithering that isn't using 8 bits for errors, is
| based on perceptual accuracy, and perhaps uses something like a
| neural net to diffuse things in the best way possible.
|
| [1] https://bottosson.github.io/posts/colorpicker/
| funks_ wrote:
| If you are interested in color dithering with different color
| difference metrics [1], I've implemented just that [2]. You can
| find an example comparing metrics in my docs [3].
|
| [1]:
| https://juliagraphics.github.io/Colors.jl/stable/colordiffer...
|
| [2]: https://github.com/JuliaImages/DitherPunk.jl
|
| [3]: https://juliaimages.org/DitherPunk.jl/stable/#Dithering-
| with...
| rikroots wrote:
| I moved my canvas library's reduce-palette filter over to OKLAB
| calculations a while back. The calculations are more
| computationally intensive, but worth the effort.
|
| https://scrawl-v8.rikweb.org.uk/demo/filters-027.html
| mattdesl wrote:
| I quite like the look of the blue noise dithering on this.
| Are you using just a texture as a mask, or something else?
| rikroots wrote:
| It's an array of pre-calculated values that I extracted
| from an image donated to the Public Domain by Christoph
| Peters (the link is an interesting read about bluenoise -
| recommend!) - http://momentsingraphics.de/BlueNoise.html
|
| No textures or masks, just brute computing on the CPU.
| obrhubr wrote:
| I am weighting each of the channels according to the formula in
| my post.
|
| I'll try OKLab and compare, thanks for the comment :)
| DDoSQc wrote:
| If you want to support truly arbitrary palettes, you also need to
| project the unbounded Oklab space onto the convex hull of the
| palette points. This is a tricky thing to get right, but I've
| found that the Oklab author's published gamut clamping for sRGB
| also translates well to arbitrary convex hulls.
|
| If anyone's curious I've implemented this here:
| https://github.com/DDoS/Cadre/blob/main/encre/core/src/dithe...
| I use it to map images from their source colour space to the
| lower gamut palettes of E Ink colour displays.
| kaoD wrote:
| > Dithering a black-to-white gradient will be wrong without
| linearising first.
|
| TBH both look wrong to me. If I squint, neither dithering
| patterns match the original gradient... but the non-linearized
| one looks the most similar.
|
| What could be causing this?
| hagbard_c wrote:
| > What could be causing this?
|
| Hypercorrection, in this case over-linearisation.
| badmintonbaseba wrote:
| Apart from implementing it incorrectly, an uncalibrated display
| could also cause this. Check out http://www.lagom.nl/lcd-
| test/gamma_calibration.php with DPI scaling turned off, at 100%
| zoom level (the way browsers scale images is also horrible, so
| you want to avoid that).
|
| edit:
|
| Reading back, viewing the gradients at anything other than 100%
| zoom could itself cause the mismatch, because browsers just suck
| at image scaling.
| gus_massa wrote:
| Mac vs pc?
|
| They have a different default gamma and they may show a
| different gray level.
|
| (It bit me a long time ago. I made a GIF that had the same RGB
| background as a webpage. On my PC it was fine, but on a Mac the
| border was very visible and the result was horrible. My solution
| was to change the background of the webpage from an RGB number
| to a 1-pixel GIF, repeated or scaled to fill the page.)
| TinkersW wrote:
| I don't know where they got the idea that you don't dither in
| sRGB. The point of dithering is to map to the nearest bit
| pattern with a random adjustment so that it could go either way
| (aside from artistic choice). You should dither in sRGB if you
| are going to display it in sRGB, which is probably why the "not
| linearized" version looks more accurate.
|
| See: Dithering should happen in sRGB
| https://www.shadertoy.com/view/NssBRX
| robinsonb5 wrote:
| I'm far from convinced that shadertoy demonstration is
| correct: If you set the number of bits to 1, the dithered
| version is clearly far too light, which is exactly what
| happens if you dither in gamma-encoded space rather than
| linear space.
|
| It gets much worse if you uncomment the SHOW_CORRECT define
| since the data is then being transformed back to SRGB before
| being quantised, which quite heavily skews the probability of
| which code point will be selected in favour of the lighter
| colour.
|
| Increasing the number of bits hides the effect somewhat by
| making more code points available. But because they're
| distributed in gamma-encoded rather than linear-encoded
| space, it's still not correct to assume that a 50/50 pixel
| mix of two adjacent code points will appear the same as the
| colour numerically halfway between them, unless you're making
| that judgement in linear space.
|
| The mistake the shadertoy is making is transforming the data
| to sRGB before quantising. Both dithering and quantising
| should be done in linear space (which is non-trivial since in
| linear space the codepoints aren't linearly distributed any
| more) - otherwise the dither function's triangular
| distribution is skewed by the sRGB transform.
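|
| To put numbers on that (just arithmetic, using the standard sRGB
| transfer function):
|
|     def srgb_to_linear(c):
|         return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
|
|     def linear_to_srgb(c):
|         return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
|
|     # A 50/50 spatial mix of black and white averages their *linear*
|     # intensities, i.e. 0.5 in linear light...
|     mix_linear = (srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2
|     print(round(255 * linear_to_srgb(mix_linear)))   # ~188, not 128
|
| So the colour that actually looks like a 50/50 black/white mix is
| sRGB ~188; dithering sRGB 128 as one black and one white pixel (a
| pure gamma-space dither) comes out noticeably too bright.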
| bmandale wrote:
| The OP example is clearly wrong, but this doesn't sound right
| either. The point of dithering is, e.g. if you have a pixel
| value of 0.5, to recreate that brightness with black and white
| pixels. The naive approach would do that with one black and one
| white pixel. But depending on how the display actually renders
| 0.5, it might be better to replicate it with, say, 2 white
| pixels and 3 black pixels.
| shiandow wrote:
| They seem to be using some kind of error diffusion. And getting
| error diffusion to play nice with linear colour space is
| nontrivial.
|
| I remember I had quite a bit of discussion with madshi when
| MadVR tried implementing it. You _can_ do something that comes
| close by modifying the colour space into something that is
| gamma light in the integer part and linear light in the
| fractional part.
|
| If the value of a pixel is x you then get something like
| floor(x) + (ginv(x) - l) / (u - l), with l and u the two shades
| corresponding to floor(x) and ceil(x) in linear light.
|
| Technically error diffusion will still be incorrect, but it does
| handle constant shades correctly, and most alternatives are
| worse in some way.
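|
| A small Python sketch of that remapping (my reading of the
| formula; ginv here is the sRGB-to-linear transfer function and x
| is a value in [0, 255]):
|
|     import math
|
|     def srgb_to_linear(c):
|         # c in [0, 1]
|         return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
|
|     def remap_for_dither(x, levels=255):
|         # Integer part stays in gamma space; the fractional part becomes
|         # linear between the two neighbouring quantisation levels.
|         lo = math.floor(x)
|         hi = min(lo + 1, levels)
|         if lo == hi:
|             return float(lo)
|         l = srgb_to_linear(lo / levels)   # lower level in linear light
|         u = srgb_to_linear(hi / levels)   # upper level in linear light
|         g = srgb_to_linear(x / levels)    # input value in linear light
|         return lo + (g - l) / (u - l)
|
| Error diffusion (or ordered dithering) on the remapped values then
| rounds to integers as usual.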
| obrhubr wrote:
| Thank you for pointing that out. The Atkinson dithering I was
| using was indeed messing with the results. I'll be updating
| the post shortly :)
| Retr0id wrote:
| You're looking at a scaled version of the bitmap (potentially
| re-scaled multiple times) and some or all of those
| interpolations may not have been done in a linear colour space.
|
| But in this case I think it's just wrong. The entire first 40%
| of the bar is black, and I don't think it should be.
| magicalhippo wrote:
| I've always been curious to what degree, if any, color
| constancy[1] affects color dithering.
|
| Seems that at some level it should, though perhaps not directly
| at the pixel level due to the high frequency of the per-pixel
| differences, but maybe at the more coarse "averaged" level?
|
| One of those things I've wanted to explore but remains on my to-
| do list...
|
| [1]: https://en.wikipedia.org/wiki/Color_constancy
| spacejunkjim wrote:
| When I saw this, I immediately had flashbacks to a little project
| I did for my CS course when I was an undergrad! We were all
| assigned a computer graphics algorithm and were tasked to build
| an animation explaining how it works.
|
| This was nearly eight years ago, but I managed to find it this
| morning and uploaded it to YouTube.
|
| Here was the resulting animation: https://youtu.be/FHrIQOWeerg
|
| I remember I used Processing to build it, and it took so long to
| animate as I had to export it frame-by-frame. Fun days!
| AndrewStephens wrote:
| Dithering is something of a lost art now that our displays can
| handle millions of colors in high definition, but it can be a
| striking artistic effect.
|
| If anyone thinks their websites are too colorful, I made a pure
| JavaScript web component that dithers images on the client in
| real time, taking into account the real pixel size of the
| current display.
|
| https://sheep.horse/2023/1/improved_web_component_for_pixel-...
| hooli_gan wrote:
| Very cool, but the image at the bottom of the page flickers
| when scrolling.
| tuyiown wrote:
| Dithering does that!
| AndrewStephens wrote:
| My code can (optionally, since it is often not useful) dither
| all the way down to the physical pixels of your display
| device for that really crisp, old-fashioned look. Most
| dithering projects on the web don't take this into account so
| look slightly soft around the edges of the pixels.
|
| The image at the bottom is an example. On some devices this
| interacts weirdly with the pattern of pixels or even the
| refresh rate when in motion due to scrolling.
| Aardwolf wrote:
| I think dithering should still be considered, since an otherwise
| pretty, highly detailed game engine that has banding in the sky
| is pretty ugly. 32-bit RGBA can still show visible banding,
| which dithering can fix. 256 brightness levels per channel isn't
| all that much when it comes to subtle variations in sky colors;
| the eye is more sensitive than that.
|
| 12-bit per channel color might be enough to never have visible
| banding. Or dithering.
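|
| A minimal sketch of that kind of fix (illustrative values; the
| usual trick is to add roughly one quantisation step of noise
| before rounding):
|
|     import numpy as np
|
|     def quantise_with_dither(channel, bits=8, seed=0):
|         # channel: float intensities in [0,1] at higher precision.
|         levels = (1 << bits) - 1
|         noise = np.random.default_rng(seed).uniform(-0.5, 0.5, channel.shape)
|         return np.clip(np.round(channel * levels + noise), 0, levels).astype(np.uint8)
|
|     # A very gentle sky-like gradient that bands badly without the noise:
|     sky = np.linspace(0.55, 0.60, 1920)
|     banded = np.round(sky * 255).astype(np.uint8)   # a handful of hard steps
|     dithered = quantise_with_dither(sky)            # same levels, broken up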
| AndrewStephens wrote:
| You are right, of course. Imperceptible dithering is still
| technically used all the time. But the harsh dithered look of
| yesteryear, where images were crunched down to 1-bit or maybe 32
| colors if you were lucky, is seldom seen today.
| 01HNNWZ0MV43FF wrote:
| For anyone who hasn't seen it, "Banding in games" by one of
| the Playdead (Limbo, Inside) programmers:
| https://www.loopit.dk/banding_in_games.pdf
|
| Crysis had sky banding... Skyrim has the famous menu smoke
| mentioned in the PDF. All fixable, probably fixable on the
| hardware of the day. (I remember messing with dithering on a
| 2007 DX9 GPU)
| kurthr wrote:
| With 100+Hz displays it's not that hard to do temporal
| dithering as well. Your cones are surprisingly low bandwidth
| (why old color TVs even worked at 30Hz), while your rods
| provide danger/flicker cues outside the fovea.
|
| Getting an extra 2 bits of hue (ab) while maintaining
| luminance (L) is quite doable, except at the chroma and
| brightness extremes where your eye mostly ignores them
| anyway. That could be done pretty high in the display stack.
| I'd also say that the DACs in many displays are capable of
| higher chroma resolution, but gamma non-linearity eats up a bit
| of the dynamic range.
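|
| As a toy illustration of the idea (not a real display pipeline),
| spreading the two extra bits of a 10-bit value across four 8-bit
| frames:
|
|     def temporal_dither_frames(value10, n_frames=4):
|         # 8-bit base plus 2 fractional bits; emit 'frac' frames one level
|         # higher so the temporal average approximates value10 / 4.
|         base, frac = divmod(value10, 4)
|         return [min(base + (1 if i < frac else 0), 255) for i in range(n_frames)]
|
|     temporal_dither_frames(513)   # [129, 128, 128, 128], average 128.25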
| toast0 wrote:
| Old TV wasn't 30Hz. 60i isn't the same as 30p.
|
| Some TVs and monitors do temporal dithering for you...
| accepting 8-bit input and temporally dithering it down to a
| 6-bit panel doesn't look that bad. It probably extends well to
| 10-bit input.
| crazygringo wrote:
| Did the author forget to finish the blog post?
|
| They show a single example of incorrect dithering, explain it's
| wrong, and then don't show a corrected version. There isn't a
| _single_ example of proper color dithering.
|
| And they talk about the distance to the nearest color (RGB) but
| don't explain how to account for black or white -- how to trade
| off between accuracy of hue, brightness, and saturation, for
| example.
|
| This post doesn't explain at all how to actually dither in color.
| I don't understand why this is on the front page with over 50
| votes.
| obrhubr wrote:
| You're right, it kind of isn't finished... I had it done, then
| had an exchange with the author of didder, and I'm still in the
| process of rewriting :)
| danybittel wrote:
| Error diffusion dithering is kind of old fashioned. It is a great
| algorithm in that you only need to go through the image once,
| pixel by pixel. But it doesn't work well with today's hardware,
| especially GPUs. It would be fun to come up with new algorithms
| that are better parallelizable.
| pmarreck wrote:
| Deterministic random-value dithering, where the chance of a
| pixel becoming a given dithered color is based on how close the
| true value is to that color?
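|
| Something like this, I imagine (a per-pixel sketch with a
| position hash standing in for the "deterministic random" part;
| the hash constants are arbitrary):
|
|     def dither_pixel(value, x, y, levels=2):
|         # value in [0,1]; snaps to the lower or upper of the two nearest
|         # levels, with the chance of the upper one equal to how far the
|         # value sits between them.
|         scaled = value * (levels - 1)
|         lo = int(scaled)
|         frac = scaled - lo
|         h = (x * 374761393 + y * 668265263) & 0xFFFFFFFF
|         threshold = ((h * 1274126177) & 0xFFFFFFFF) / 2 ** 32
|         return min(lo + (1 if frac > threshold else 0), levels - 1)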
| badmintonbaseba wrote:
| Blue noise threshold map works really well on GPUs.
| criddell wrote:
| The dithering work Mark Ferrari did by hand on some of the old
| LucasFilm games was really impressive.
|
| https://www.superrune.com/tutorials/loom_ega.php
| alberth wrote:
| A fun website to try out different dithering algorithms and
| settings.
|
| https://doodad.dev/dither-me-this/
| somewhereoutth wrote:
| See also FadeCandy by Micah:
|
| https://scanlime.org/2013/11/fadecandy-easier-tastier-and-mo...
|
| > Firmware that uses unique dithering and color correction
| algorithms to raise the bar for quality while getting out of the
| way of your creativity.
| marcusestes wrote:
| Cool project. Sad README update:
| https://git.approximate.life/fadecandy/file/README.md.html
| ggambetta wrote:
| Didn't try error diffusion, but I had good results with Bayer for
| the ZX Spectrum Raytracer [0]. Bayer only ever looks at the pixel
| it's considering and does no math beyond comparing a value to its
| threshold; it was surprisingly easy to implement and looks nice.
| A great choice for ridiculously underpowered devices :)
|
| https://gabrielgambetta.com/zx-raytracer.html#fourth-iterati...
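|
| For reference, the whole trick is only a few lines (a generic
| sketch, not the ZX Spectrum code):
|
|     # 4x4 Bayer index matrix; (value + 0.5) / 16 gives thresholds in (0,1)
|     BAYER_4X4 = [
|         [ 0,  8,  2, 10],
|         [12,  4, 14,  6],
|         [ 3, 11,  1,  9],
|         [15,  7, 13,  5],
|     ]
|
|     def bayer_dither(value, x, y):
|         # value in [0,1] -> 0 or 1, using only this pixel and its position
|         # in the repeating 4x4 threshold pattern.
|         threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16
|         return 1 if value > threshold else 0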
| vintagedave wrote:
| > If the linearised version looks wrong to you, try opening it on
| a larger monitor in it's original size and check your gamma
| settings.
|
| It looks far too dark, and I'm viewing it on an iPad with a high
| DPI screen. I also strongly suspect I can't change the gamma on
| this device, nor have I ever knowingly done so. Anyone know why
| it looks bad?
| svantana wrote:
| It's the same on my 2024 MBP. Looking at a gamma calibration
| image [1], I estimate the built-in screen to have a gamma of
| ~1.4 (where the stripes and filled areas have the same
| brightness), way below the standard 2.2.
|
| [1] http://www.lagom.nl/lcd-test/gamma_calibration.php
| omoikane wrote:
| See also: "Joel Yliluoma's arbitrary-palette positional dithering
| algorithm"
|
| https://bisqwit.iki.fi/story/howto/dither/jy/
|
| This page focuses on ordered dithering, which tends to work
| better for animations than error-diffusion-based dithering
| schemes (like the one in the linked article).
| JKCalhoun wrote:
| Just a guess -- perhaps Bill Atkinson dropped the two 1/8 error
| terms as a poor man's contrast boost.
___________________________________________________________________
(page generated 2025-03-13 23:01 UTC)