Dithering in Colour

(obrhubr.org)

119 points | by surprisetalk 3 days ago

13 comments

  • ggambetta 19 minutes ago
    Didn't try error diffusion, but I had good results with Bayer for the ZX Spectrum Raytracer [0]. Bayer only ever looks at the pixel it's considering and doesn't do math beyond comparing a value to its threshold; it was surprisingly easy to implement and looks nice. A great choice for ridiculously underpowered devices :)

    https://gabrielgambetta.com/zx-raytracer.html#fourth-iterati...
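
    A minimal Python sketch of that kind of ordered dithering (illustrative only, not code from the raytracer; the 4x4 matrix and names are the standard textbook ones):

        # Ordered (Bayer) dithering to 1 bit: each pixel only compares its own
        # value against a fixed, tiled threshold matrix; no neighbours and no
        # diffused error, so it is cheap and trivially parallel.
        BAYER_4X4 = [
            [ 0,  8,  2, 10],
            [12,  4, 14,  6],
            [ 3, 11,  1,  9],
            [15,  7, 13,  5],
        ]

        def bayer_dither_1bit(gray):
            """gray: 2D list of floats in [0, 1]; returns a 2D list of 0/1."""
            out = []
            for y, row in enumerate(gray):
                out_row = []
                for x, v in enumerate(row):
                    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
                    out_row.append(1 if v > threshold else 0)
                out.append(out_row)
            return out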

  • AndrewStephens 6 hours ago
    Dithering is something of a lost art now that our displays can handle millions of colors in high definition, but it can be a striking artistic effect.

    If anyone thinks their websites are too colorful, I made a pure JavaScript web component to dither images on the client in real time, taking into account the real pixel size of the current display.

    https://sheep.horse/2023/1/improved_web_component_for_pixel-...

    • Aardwolf 4 hours ago
      I think dithering should still be considered, since an otherwise pretty, super-high-detail game engine that has banding in the sky is pretty ugly. 32-bit RGBA can still have visible banding, which dithering can fix. 256 brightness levels per channel isn't all that much when it comes to subtle variations in sky colors; the eye is more sensitive than that.

      12 bits per channel might be enough to never have visible banding. Or dithering.

      • AndrewStephens 4 hours ago
        You are right, of course. Imperceptible dithering is still technically used all the time. But the harsh dithered look of yesteryear, where images were crunched down to 1 bit or maybe 32 colors if you were lucky, is seldom done today.
      • kurthr 2 hours ago
        With 100+Hz displays it's not that hard to do temporal dithering as well. Your cones are surprisingly low-bandwidth (which is why old color TVs even worked at 30Hz), while your rods provide danger/flicker cues outside the fovea.

        Getting an extra 2 bits of hue (ab) while maintaining luminance (L) is quite doable, except at the chroma and brightness extremes, where your eye mostly ignores them anyway. That could be done pretty high in the display stack. I'd also say that the DACs in many displays are capable of higher chroma resolution, but gamma non-linearity eats up a bit of the dynamic range.
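
        A single-channel sketch of the temporal part (illustrative only; the golden-ratio frame phase is just one convenient choice, and a real pipeline would also vary the phase per pixel so the whole screen doesn't flicker in sync): pick between the two nearest displayable levels each frame so the time average matches the target.

            def temporal_dither_value(value, levels, frame):
                """value in [0, 1]; levels: number of displayable levels (e.g. 256);
                frame: frame counter. Returns the level to show this frame."""
                scaled = value * (levels - 1)
                low = int(scaled)
                frac = scaled - low   # desired duty cycle of the brighter level
                # Golden-ratio sequence: equidistributed in [0, 1), so over many
                # frames the brighter level appears with frequency ~frac.
                phase = (frame * 0.6180339887498949) % 1.0
                return min(low + (1 if phase < frac else 0), levels - 1)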

      • 01HNNWZ0MV43FF 3 hours ago
        For anyone who hasn't seen it, "Banding in games" by one of the Playdead (Limbo, Inside) programmers: https://www.loopit.dk/banding_in_games.pdf

        Crysis had sky banding... Skyrim has the famous menu smoke mentioned in the PDF. All fixable, probably even on the hardware of the day. (I remember messing with dithering on a 2007 DX9 GPU.)

    • hooli_gan 6 hours ago
      Very cool, but the image at the bottom of the page flickers when scrolling.
      • AndrewStephens 4 hours ago
        My code can (optionally, since it is often not useful) dither all the way down to the physical pixels of your display device for that really crisp, old-fashioned look. Most dithering projects on the web don't take this into account so look slightly soft around the edges of the pixels.

        The image at the bottom is an example. On some devices this interacts weirdly with the pattern of pixels or even the refresh rate when in motion due to scrolling.

      • tuyiown 6 hours ago
        Dithering does that!
  • crazygringo 6 hours ago
    Did the author forget to finish the blog post?

    They show a single example of incorrect dithering, explain it's wrong, and then don't show a corrected version. There isn't a single example of proper color dithering.

    And they talk about the distance to the nearest color (RGB) but don't explain how to account for black or white -- how to trade off between accuracy of hue, brightness, and saturation, for example.

    This post doesn't explain at all how to actually dither in color. I don't understand why this is on the front page with over 50 votes.

    • obrhubr 3 hours ago
      You’re right, it kind of isn’t finished… I had it done, then had an exchange with the author of didder, and I’m still in the process of rewriting :)
  • mattdesl 8 hours ago
    It might be worth using a lightness estimate like OKLab, OKLrab[1], or CIE Lab instead of the RGB luminance weighting, as it should produce a more perceptually accurate result.

    The other issue with your code right now is that it uses Euclidean distance in RGB space to choose the nearest color, but it would probably also be more accurate to use a perceptual color difference metric; a very simple choice is Euclidean distance on OKLab colors (see the sketch below).

    I think dithering is a pretty interesting area of exploration, especially as a lot of the popular dithering algorithms are quite old and optimized for ancient compute requirements. It would be nice to see some dithering that isn't using 8 bits for errors, is based on perceptual accuracy, and perhaps uses something like a neural net to diffuse things in the best way possible.

    [1] https://bottosson.github.io/posts/colorpicker/
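
    A minimal sketch of that nearest-colour choice in OKLab (illustrative only; the matrix coefficients are the ones published in Ottosson's OKLab post, the function names are assumptions):

        def srgb_to_linear(c):
            # c: sRGB-encoded channel in [0, 1] -> linear light
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

        def linear_srgb_to_oklab(r, g, b):
            # Coefficients from Ottosson's OKLab post; inputs are non-negative
            # for in-gamut sRGB, so the cube roots are safe.
            l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
            m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
            s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
            l_, m_, s_ = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
            return (0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_,
                    1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_,
                    0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_)

        def nearest_palette_color(pixel, palette):
            """pixel and palette entries: (r, g, b) sRGB floats in [0, 1]."""
            def oklab(c):
                return linear_srgb_to_oklab(*(srgb_to_linear(v) for v in c))
            target = oklab(pixel)
            return min(palette,
                       key=lambda p: sum((a - b) ** 2
                                         for a, b in zip(oklab(p), target)))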

  • Clamchop 3 days ago
    They may not want to imply that didder's linearized rabbit is wrong, but I'm comfortable saying so. It's not just a little dark, it's way dark, to the point of hiding detail.

    The linearized RGB palette is similarly awful. It clobbers a whole swath of colors, rendering them as nearly black. Purples are particularly brutalized. Yellows disappeared and became white.

    On my phone, the middle palette doesn't appear too bright to my eyes, either.

    Even the linearized gradient looks worse.

    Maybe linear is not best for perceptual accuracy.

    • badmintonbaseba 8 hours ago
      I agree. I think the problem is a banal missing color transformation somewhere in the pipeline, like converting the palette and image to linear colorspace, doing the dithering there and mistakenly writing the linear color values instead of sRGB color values into the image.

      Others suggest that the error is using the wrong metric for choosing the closest color, but I disagree. That wouldn't cause such drastic, systematic darkening as this, as the palette is probably still pretty dense in the RGB cube.

      Where the linearisation really matters is in the arithmetic for the error diffusion: you definitely want to diffuse the error in a linear colorspace. You are free to choose a good perceptual space for picking the closest color at each pixel, but calculate the error in a linear space (a sketch follows at the end of this comment).

      Visual perception is weird. But when you squint your eyes to blur the image, you are definitely mixing in a linear colorspace, as that's physical mixing of light intensities before the light even reaches your retina. So you have to match that when diffusing the error.

      edit:

      It also doesn't help that most (all?) browsers do color mixing wrong when images are scaled, so if you don't view the dithered images at 100% without DPI scaling, then you might get significantly distorted colors due to that too.

      edit2:

      For comparison this is what imageworsener does:

      https://imgur.com/a/XmJKQnz

      You really need to open the image in a viewer where each image pixel is exactly one device pixel large; otherwise the color arithmetic used for scaling by viewers is of variable quality (often very poor).
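
      To make that split concrete, here is a rough sketch (illustrative only, not badmintonbaseba's or imageworsener's code) of Floyd-Steinberg diffusion that lets the caller pick the output colour with any metric it likes but measures and diffuses the quantisation error in linear light:

          def srgb_to_linear(c):
              return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

          def linear_to_srgb(c):
              c = min(max(c, 0.0), 1.0)  # diffusion can push values out of range
              return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

          def floyd_steinberg_linear(image, palette, pick):
              """image: rows of sRGB (r, g, b) float tuples in [0, 1].
              palette: list of sRGB (r, g, b) tuples.
              pick(srgb_pixel, palette): nearest-colour choice, e.g. in OKLab."""
              h, w = len(image), len(image[0])
              # Work buffer in linear light, so the diffused error models the
              # physical mixing of light your eye does when it blurs the pattern.
              lin = [[[srgb_to_linear(c) for c in px] for px in row] for row in image]
              out = [[None] * w for _ in range(h)]
              for y in range(h):
                  for x in range(w):
                      old = lin[y][x]
                      # Choose the output colour in any (perceptual) space...
                      chosen = pick(tuple(linear_to_srgb(c) for c in old), palette)
                      out[y][x] = chosen
                      # ...but compute and diffuse the error in linear light.
                      err = [o - srgb_to_linear(c) for o, c in zip(old, chosen)]
                      for dx, dy, wgt in ((1, 0, 7/16), (-1, 1, 3/16),
                                          (0, 1, 5/16), (1, 1, 1/16)):
                          nx, ny = x + dx, y + dy
                          if 0 <= nx < w and 0 <= ny < h:
                              lin[ny][nx] = [v + e * wgt
                                             for v, e in zip(lin[ny][nx], err)]
              return out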

    • obrhubr 2 days ago
      Thanks for your comment! I'm glad you're seeing the same thing :) I re-implemented the linearised dithering in Python and got similar results. I checked and rechecked the colour profiles in GIMP, nothing... At this point I can only hope for an expert to appear and tell me what exactly I am doing wrong.
    • nextts 10 hours ago
      > We have just committed a mortal sin of image processing. I didn’t notice it, you might not have noticed either, but colour-space enthusiasts will be knocking on your door shortly.
    • Const-me 3 hours ago
      Yeah, every time I see articles about the importance of linear color space for gradients, and see the images there, I observe the opposite of what’s written in the text of these articles. Gradients in sRGB color space look better.

      I have a suspicion that might be because I usually buy designer-targeted wide-gamut IPS displays. I also set them to low brightness; e.g. right now I’m looking at a BenQ PD2700U display with brightness 10/100 and contrast 50/100. However, the sRGB color space was developed decades ago for CRT displays.

      • obrhubr 59 minutes ago
        Your monitor and your browser 100% affect the appearance. After calibrating your monitor, try opening the image in full resolution and take a few steps back.

        For me, viewing the images on my phone makes them look off.

    • Sesse__ 10 hours ago
      For perceptual color difference, there are much better metrics than “distance in linear RGB”. CIE has several versions of a metric called ΔE*, for instance.

      I don't know if they actually do well in dithering, though. My experience with dithering is that it actually works better in gamma space than trying to linearize anything, since the quantization is fundamentally after gamma.

    • contravariant 7 hours ago
      The linearized gradient does look off, but not because it is linearized. It is simply wrong.

      The dithered gradient shouldn't be pure black halfway through.

  • spacejunkjim 8 hours ago
    When I saw this, I immediately had flashbacks to a little project I did for my CS course when I was an undergrad! We were all assigned a computer graphics algorithm and were tasked to build an animation explaining how it works.

    This was nearly eight years ago, but I managed to find it this morning and uploaded it to YouTube.

    Here was the resulting animation: https://youtu.be/FHrIQOWeerg

    I remember I used Processing to build it, and it took so long to animate as I had to export it frame-by-frame. Fun days!

  • alberth 2 hours ago
    A fun website to try out different dithering algorithms and settings.

    https://doodad.dev/dither-me-this/

  • danybittel 5 hours ago
    Error diffusion dithering is kind of old-fashioned. It is a great algorithm in that you only need to go through the image once, pixel by pixel. But it doesn't work well with today's hardware, especially GPUs. It would be fun to come up with new algorithms that parallelize better.
    • badmintonbaseba 1 hour ago
      Blue noise threshold map works really well on GPUs.
    • pmarreck 5 hours ago
      Deterministic random-value dithering, where the chance of a pixel becoming a given palette color is based on how much of that color is in its true value?
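
      A 1-bit sketch of that idea (illustrative only; the integer hash is an arbitrary stand-in for a proper blue-noise texture): each pixel becomes white with probability equal to its value, and because the "randomness" depends only on the pixel's own coordinates it parallelises trivially.

          def hash01(x, y):
              # Cheap deterministic per-pixel "random" value in [0, 1); any decent
              # integer hash (or a precomputed blue-noise texture) works here.
              h = (x * 374761393 + y * 668265263) & 0xFFFFFFFF
              h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
              return (h ^ (h >> 16)) / 2 ** 32

          def threshold_dither_1bit(gray):
              """gray: 2D list of floats in [0, 1] -> 2D list of 0/1. For
              physically correct averages, feed it linear-light values."""
              return [[1 if v > hash01(x, y) else 0 for x, v in enumerate(row)]
                      for y, row in enumerate(gray)]
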
  • kaoD 8 hours ago
    > Dithering a black-to-white gradient will be wrong without linearising first.

    TBH both look wrong to me. If I squint, neither dithering pattern matches the original gradient... but the non-linearized one looks the most similar.

    What could be causing this?

    • shiandow 7 hours ago
      They seem to be using some kind of error diffusion. And getting error diffusion to play nice with linear colour space is nontrivial.

      I remember I had quite a bit of discussion with madshi when MadVR tried implementing it. You can do something that comes close by modifying the colour space into something that is gamma light in the integer part and linear light in the fractional part.

      If the value of a pixel is x, you then get something like floor(x) + (l - ginv(x)) / (l - u), with l and u the two shades corresponding to floor(x) and ceil(x) in linear light.

      Technically error diffusion will still be incorrect, but it does handle constant shades correctly, and most alternatives are worse somehow.
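
      One reading of that transform as a sketch (assumptions: x is the target value in code units, possibly fractional after diffusion, and code_to_linear is the display transfer function evaluated at such a value; standard error diffusion then runs on the hybrid values, with quantisation being plain rounding):

          import math

          def to_hybrid(x, code_to_linear, max_code):
              """Integer part: the code value just below x (gamma-encoded grid).
              Fractional part: x's position between the two neighbouring codes,
              measured in linear light."""
              x = max(0.0, min(float(x), float(max_code)))
              lo = min(int(math.floor(x)), max_code - 1)
              l, u = code_to_linear(lo), code_to_linear(lo + 1)
              return lo + (code_to_linear(x) - l) / (u - l)

      For a constant shade, rounding then picks the upper code with duty cycle equal to the fractional part, and mixing the two codes in linear light at that ratio reproduces code_to_linear(x), which is the "handles constant shades correctly" property.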

      • obrhubr 1 hour ago
        Thank you for pointing that out. The Atkinson dithering I was using was indeed messing with the results. I'll be updating the post shortly :)
    • TinkersW 7 hours ago
      I don't know where they got the idea that you don't dither in sRGB. The point of dithering is to map to the nearest bit pattern with a random adjustment so that it could go either way (aside from artistic choice). You should dither in sRGB if you are going to display it in sRGB, which is probably why the "not linearized" version looks more accurate.

      See: Dithering should happen in sRGB https://www.shadertoy.com/view/NssBRX

      • bmandale 1 hour ago
        The OP example is clearly wrong, but this doesn't sound right either. The point of dithering is, e.g., if you have a pixel value of .5, to recreate its brightness with black and white pixels. The naive approach would do that with one black and one white pixel. But depending on how the display actually renders .5, it might be better to replicate it with, say, 2 white pixels and 3 black pixels.
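
        For a concrete number (assuming the display decodes standard sRGB): a stored value of 0.5 corresponds to about 0.214 in linear light, so matching its brightness with pure black and white pixels needs roughly one white pixel in five rather than one in two.

            def srgb_to_linear(c):
                # Standard sRGB decode (IEC 61966-2-1)
                return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

            print(srgb_to_linear(0.5))   # ~0.214: fraction of white pixels needed
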
      • robinsonb5 5 hours ago
        I'm far from convinced that shadertoy demonstration is correct: If you set the number of bits to 1, the dithered version is clearly far too light, which is exactly what happens if you dither in gamma-encoded space rather than linear space.

        It gets much worse if you uncomment the SHOW_CORRECT define since the data is then being transformed back to SRGB before being quantised, which quite heavily skews the probability of which code point will be selected in favour of the lighter colour.

        Increasing the number of bits hides the effect somewhat by making more code points available. But because they're distributed in gamma-encoded rather than linear-encoded space, it's still not correct to assume that a 50/50 pixel mix of two adjacent code points will appear the same as the colour numerically halfway between them, unless you're making that judgement in linear space.

        The mistake the shadertoy is making is transforming the data to sRGB before quantising. Both dithering and quantising should be done in linear space (which is non-trivial since in linear space the codepoints aren't linearly distributed any more) - otherwise the dither function's triangular distribution is skewed by the sRGB transform.

    • badmintonbaseba 8 hours ago
      Apart from implementing it incorrectly, an uncalibrated display could also cause this. Check out http://www.lagom.nl/lcd-test/gamma_calibration.php with DPI scaling turned off, at 100% zoom level (how browsers scale images is also horrible, so you want to avoid that).

      edit:

      Reading back, viewing the gradients at anything other than 100% zoom could also itself cause the mismatch, because browsers just suck at image scaling.

    • Retr0id 5 hours ago
      You're looking at a scaled version of the bitmap (potentially re-scaled multiple times) and some or all of those interpolations may not have been done in a linear colour space.

      But in this case I think it's just wrong. The entire first 40% of the bar is black, and I don't think it should be.

    • gus_massa 8 hours ago
      Mac vs PC?

      They have a different default gamma and they may show a different gray level.

      (It bit me a long time ago. I made a GIF that had the same RGB background as a webpage. On my PC it was fine, but on a Mac the border was very visible and the result horrible. My solution was to change the background of the webpage from an RGB number to a 1-pixel GIF, repeated or scaled to fill the page.)

    • hagbard_c 8 hours ago
      > What could be causing this?

      Hypercorrection, in this case over-linearisation.

  • criddell 5 hours ago
    The dithering work Mark Ferrari did by hand on some of the old LucasFilm games was really impressive.

    https://www.superrune.com/tutorials/loom_ega.php

  • magicalhippo 8 hours ago
    I've always been curious to what degree, if any, color constancy[1] affects color dithering.

    Seems that at some level it should, though perhaps not directly at the pixel level due to the high frequency of the per-pixel differences, but maybe at the more coarse "averaged" level?

    One of those things I've wanted to explore but remains on my to-do list...

    [1]: https://en.wikipedia.org/wiki/Color_constancy

  • somewhereoutth 28 minutes ago
    See also FadeCandy by Micah:

    https://scanlime.org/2013/11/fadecandy-easier-tastier-and-mo...

    > Firmware that uses unique dithering and color correction algorithms to raise the bar for quality while getting out of the way of your creativity.

  • yapyap 9 hours ago
    Dithering is so neat.