[HN Gopher] Opacity Precision
       ___________________________________________________________________
        
       Opacity Precision
        
       Author : zdw
       Score  : 31 points
        Date   : 2021-12-06 15:20 UTC (1 day ago)
        
 (HTM) web link (bjango.com)
 (TXT) w3m dump (bjango.com)
        
       | notinty wrote:
       | >For an opacity slider to provide 101 steps, it needs to be at
       | least 101 pixels wide.
       | 
       | There's no reason this needs to be the case: slider elements
       | could trap the cursor and give realtime feedback of input.
       | 
       | The iPad Pro trackpad does this with all UI elements and I
       | believe it's an important step for UX.
        
         | marcedwards wrote:
          | Yep, that's an option as well, and some tools do it. Figma's
          | only layer opacity slider is the one you get by holding a
          | modifier and dragging the text input field.
         | 
         | I personally like having opacity sliders as well. They provide
         | a nice visual representation of the state that's easier to
         | parse than reading a number, and you can also do things like
         | set 100% with a single click. Layer opacity is important enough
         | that it should have a text field, a draggable value with a
         | modifier, and a slider.
         | 
         | The other main point of the article is that these tools don't
         | seem to even realise their precision sucks.
        
           | notinty wrote:
           | Agreed; immediate analog feedback of a slider is important.
           | 
           | Thanks for the great article, the images you created are
           | especially fantastic.
           | 
           | Edit: my friend just checked and Clip Studio Paint also has
           | 0-100 with no decimal support. Crazy that it's ubiquitous.
        
             | marcedwards wrote:
             | Sure is!
             | 
             | Interestingly, Acorn can show the full 0-255 range.
        
       | tshaddox wrote:
       | In CSS you can add 2 more hex digits to an RGB hex code to access
       | 256 possible opacity levels, like _background-color: #RRGGBBAA_.
       | 
        | It's difficult to verify, but it seems that browsers also accept
        | opacity values with arbitrary decimal precision, both as
        | percentages and as real numbers from 0 to 1. I would imagine
        | those get rounded to the nearest of the 256 possible alpha
        | values (0 to 255).
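        | 
        | Here's the kind of rounding I'd expect, sketched in Python (my
        | assumption, not something from the spec):
        | 
        |     def to_8bit_alpha(opacity: float) -> int:
        |         """Map a CSS-style opacity (0.0 to 1.0) to the
        |         nearest 8-bit alpha value (0 to 255)."""
        |         return round(opacity * 255)
        | 
        |     print(to_8bit_alpha(0.5))     # 128 (hex 80)
        |     print(to_8bit_alpha(0.3333))  # 85  (hex 55)
        |     print(to_8bit_alpha(1 / 3))   # 85, extra decimals are
        |                                   # lost once rounded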
        
         | marcedwards wrote:
         | Yep, that's my understanding as well.
        
       | dahart wrote:
       | The topic that this article raises, resolution of opacity, is
       | pretty interesting IMO, but I feel like the argument itself could
       | use some help.
       | 
       | > Figma accepts up to two decimal places in its opacity text
       | fields, providing 10,001 possible steps (0.00 to 100.00). That's
       | plenty for most uses.
       | 
        | That's overkill. Because the author assumed 8-bit color channels
        | at the start, it's worth noting that anything beyond 256 steps
        | (the full 8-bit range) is guaranteed to be unnecessary. Having
        | 10k steps to control the opacity of an 8-bit overlay wastes a
        | minimum of 5 bits.
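        | 
        | Rough arithmetic, just to make the bit counts concrete:
        | 
        |     from math import log2
        | 
        |     ui_steps = 10_001  # 0.00 to 100.00 in 0.01 increments
        |     out_steps = 256    # distinct 8-bit alpha values
        | 
        |     print(log2(ui_steps))   # ~13.3 bits of input resolution
        |     print(log2(out_steps))  # 8.0 bits the output can hold
        |     # roughly 5.3 bits of the input can never show up in
        |     # an 8-bit result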
       | 
       | One thing the author doesn't mention is that when you're blending
       | between two values, you only need as many steps as the difference
       | between them. So if you're putting transparent pink over red, you
       | won't need 255 steps, you might only need 50 steps.
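        | 
        | A little sketch of what I mean (the channel values here are
        | invented):
        | 
        |     fg, bg = 230, 180   # made-up channel values, 50 apart
        |     outs = set()
        |     for a in range(256):
        |         outs.add(round(bg + (fg - bg) * a / 255))
        |     print(len(outs))    # 51: |fg - bg| + 1 distinct results,
        |                         # however many alpha steps you offer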
       | 
        | It would be nice to have more examples where single-bit
        | precision is necessary, and a demonstration of a visible problem
        | that crops up if the precision is too coarse. I think there are
        | some such examples out there. One that comes to mind is color
        | correcting _after_ doing transparency work. Another might be
        | workflows that involve recovering transparent colors from baked
        | colors. (The technical term for this is "un-premultiplying", and
        | it's well known to have horrible precision issues.) Another huge
        | reason to care about precision is High Dynamic Range workflows,
        | where the source colors have high precision and may exceed the
        | range of displayable colors.
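        | 
        | For the un-premultiply case, a quick sketch of the precision
        | problem (Python; the color and alpha values are invented):
        | 
        |     # Premultiply an 8-bit channel by a small alpha, store
        |     # it as 8-bit, then try to recover the original color.
        |     color, alpha = 200, 3                 # alpha out of 255
        |     premul = round(color * alpha / 255)   # 2 after rounding
        |     recovered = round(premul * 255 / alpha)
        |     print(recovered)                      # 170, a long way
        |                                           # from 200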
        
         | marcedwards wrote:
         | >Because the author assumed 8-bit color channels at the start
         | 
          | The tools tested predominantly use 8bit per channel. Photoshop
          | and Affinity Designer can use 16bit int or 32bit float, but
          | those modes are typically more for photo editing.
         | 
         | >Having 10k steps to control the opacity of an 8-bit overlay is
         | wasting a minimum of 5 bits.
         | 
         | The article does allude to why more may be important: "There's
         | no real reason to not support the full range of possible 8bit
         | opacities, or even more steps to cover wide gamut colour spaces
         | and higher bit depths."
         | 
         | CSS, iOS, and Android dev all typically just use floats with a
         | normalised range for opacity. The future is likely using
         | normalised floats for all colour values, like many platforms
         | already do. This also allows for extended ranges beyond 1.0 for
         | wide gamut support while still using sRGB.
         | 
         | Wide gamut displays are prevalent, as is sending 10bit per
         | channel to the display (often with a 16bit float per channel
         | window manager). It just seems weird for our tooling to clip
         | this stuff at the input. That's such an easy thing to fix.
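          | 
          | A rough way to see how much an integer 0-100 input clips,
          | sketched in Python (the bit depths are just the ones mentioned
          | above):
          | 
          |     # How many target levels can a 0-100 integer opacity
          |     # input actually reach at various bit depths?
          |     for bits in (8, 10, 16):
          |         top = 2 ** bits - 1
          |         hit = {round(i / 100 * top) for i in range(101)}
          |         print(bits, len(hit), "of", top + 1)
          |     # prints 101 reachable levels each time, out of
          |     # 256, 1024, and 65536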
         | 
         | >It would be nice to have more examples where single bit
         | precision is necessary, and a demonstration of a visible
         | problem that crops up if the precision is too coarse.
         | 
          | I think there are lots of examples, but some are going to
          | sound contrived.
         | 
         | Here's one that didn't make the cut: Illustrator's blend tool
         | creates intermediate objects by morphing the paths and
         | properties. This is a common way to make gradient-like effects.
          | Object opacities are also interpolated. What's interesting is
          | that Illustrator has 0-255 precision when doing the blend, but
          | if you edit the objects, the values display as 0-100 and edits
          | snap to 0-100. I didn't include it because it got messy
          | explaining the finer details.
        
         | tobr wrote:
         | > Having 10k steps to control the opacity of an 8-bit overlay
         | is wasting a minimum of 5 bits.
         | 
         | It's a user interface. The highest priority is how it helps the
         | user understand what they are doing, not how many bits of data
         | it takes. For example, if you want to express 1/3rd opacity in
         | percent, you want to be able to type 33.33% and have it stay
         | that way. It would be very frustrating if it was rounded to
         | whatever the closest "actual" value was.
         | 
         | I also seem to recall that Figma supports typing mathematical
         | expressions rather than literal values in many fields, which
         | makes it even more important that the value of the expression
         | isn't rounded destructively.
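          | 
          | A tiny illustration of the frustration (Python; the values are
          | invented):
          | 
          |     typed = 33.33    # or an expression like 100 / 3
          | 
          |     # a tool that snaps to whole percents hands back:
          |     print(round(typed))   # 33, the .33 is silently lost
          | 
          |     # a tool that keeps the typed value hands back:
          |     print(typed)          # 33.33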
        
           | dahart wrote:
           | Great point about UI; I'm simply responding to the argument
           | that was made, which was based on the number of steps
           | available between two 8-bit values.
           | 
           | > It would be very frustrating if it was rounded
           | 
            | To your point, 33.33% is already rounded; it isn't 1/3. But
            | at the end of your operation the result is going to be
            | rounded and clamped, so there are many workflows where
            | professionals are going to want to see the rounded "actual"
            | values. Being able to input 1/3 is valuable, and being able
            | to see the precise result is also valuable.
        
           | marcedwards wrote:
            | That's a great example. I wish I'd used it in the article.
        
       | egypturnash wrote:
       | My illustration practice relies on a _ton_ of transparency, and I
       | get by fine with a much lower-precision control of it.
       | 
       | Illustrator lets me set up keyboard shortcuts to
       | 0/10/20/30/40/50/60/70/80/90/100% opacity and this is how I
       | assign the vast majority of my opacity levels. Effectively I'm
       | "losing" a _lot_ more steps than just the difference between 100
       | and 255.
       | 
       | I'm also usually using some kind of blur on my shading, possibly
       | filling them with a gradient that has different opacity levels on
       | its stops, so there's a lot of subtler opacity happening in my
       | work. Sometimes there's an additional modifier created by
       | changing the opacity level of a group of translucent shapes, or
       | an entire layer. Or a complex appearance stack of multiple fills
        | and strokes applied to single paths, each of which has its own
        | opacity level (and mode!) that's modified by the path's overall
        | opacity. But in terms of _my_ interaction with the whole system
        | it's largely just eleven levels spread out across 0%-100%.
       | 
        | More precision would be nice now and then, I suppose; I wouldn't
        | complain if it existed, but in practice it's something I rarely
        | need. Bjango does a bunch of super-fiddly UI work, so maybe they
        | need it more, I dunno.
        
         | marcedwards wrote:
          | I use those shortcuts as well, and often change things in 5%
          | or 10% jumps. This is a really deep topic though, and some of
          | the issues are from cumulative rounding errors, which can
          | happen if you stack lots of layers or stack layer effects.
          | It'll also depend on whether you're just moving sliders around
          | until things look good (which is great!), or doing something
          | that requires exact values (building LUTs to process map tiles
          | etc).
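          | 
          | Here's a rough way to poke at the stacking issue in Python.
          | The opacity and layer count are made up, and real pipelines
          | differ in where and how they round:
          | 
          |     # Composite the same translucent layer over black n
          |     # times, three ways: float math all the way, rounding
          |     # to 8-bit after every layer, and truncating after
          |     # every layer (a common shortcut in integer pipelines).
          |     import math
          | 
          |     a, n = 0.07, 30
          |     f = q = t = 0.0
          |     for _ in range(n):
          |         f = f + (255 - f) * a
          |         q = round(q + (255 - q) * a)
          |         t = math.floor(t + (255 - t) * a)
          | 
          |     # how far the last two land from the first depends on
          |     # the stack depth and on how the pipeline rounds
          |     print(round(f), q, t)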
         | 
         | My main concern is that information is being thrown out at the
         | input UI. As far as I can tell, Illustrator stores opacity as
         | 8bit values internally. This can be tested by using the blend
         | tool on objects with similar opacities.
         | 
         | I just want our tools to be better.
        
           | egypturnash wrote:
           | With Illustrator, getting _any_ change sure is an uphill
           | battle. I wonder how many years it'll be before selection
           | rectangles are drawn properly when the canvas is rotated?
        
       | cmiller1 wrote:
       | Can anyone really tell the difference between these fractional
       | steps of opacity?
        
         | lnanek2 wrote:
          | I can't, but at least the article did add a paragraph claiming
          | designers can: "Does it matter? Quite often, shadows are
          | incredibly sensitive to opacity changes, and many shadows use
          | values from around 5% to 20%. That means there's only 15 or so
          | steps in the usable range, and single step jumps can be quite
          | noticeable. This is not the most pressing issue in the design
          | tools we use, but it is a real problem."
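          | 
          | To put numbers on the quoted range (a quick Python check):
          | 
          |     lo, hi = 0.05, 0.20
          |     pct  = [p for p in range(101) if lo <= p / 100 <= hi]
          |     bits = [a for a in range(256) if lo <= a / 255 <= hi]
          |     print(len(pct))   # 16 whole-percent settings (5 .. 20)
          |     print(len(bits))  # 39 8-bit settings (13/255 .. 51/255)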
        
           | marcedwards wrote:
            | There can also be cumulative rounding issues if you're
            | stacking lots of layers on top of each other. It's a subtle
           | issue, but a strange one. There's no deep technical reason
           | for it -- the inputs are just losing data.
        
       ___________________________________________________________________
       (page generated 2021-12-07 23:03 UTC)