[HN Gopher] Motion blur all the way down (2022)
       ___________________________________________________________________
        
       Motion blur all the way down (2022)
        
       Author : azeemba
       Score  : 304 points
       Date   : 2024-03-04 03:43 UTC (19 hours ago)
        
 (HTM) web link (www.osar.fr)
 (TXT) w3m dump (www.osar.fr)
        
       | pierrec wrote:
       | Ah, I assume this was posted because ambient.garden was just on
       | the front page. Re-reading it now, I think it should really have
       | been 2 articles.
       | 
       | I think I still like the first half, it's a reasonable dive into
       | the detail of what motion blur is, or should be in theory. The
       | second half is a slightly crazy, extremely condensed explanation
       | of how the shader works for this particular "torusphere"
       | animation based on motion blur. I find that part mainly useful
       | because otherwise the code would be impenetrable at this point,
       | at least to me. In retrospect the transition between the 2 parts
       | feels a bit like jumping into a frozen lake, sorry about that!
        
         | chaboud wrote:
         | I read through it and got the sense that it just jumped into
         | one particularly odd special case bag of crazy after providing
         | a clean explanation of motion blur as a function of shutter-
         | angle (exposure window). I'm glad it wasn't just the glass of
         | wine I finished.
         | 
          | I've played with motion blur as a function of projection of 4D
          | extrusions a few times, but the practical context (video,
          | textures) ends up making it impractical compared to sampling
          | in hardware with caching. This write-up leaves me thinking
          | "maybe just one more time".
        
           | pierrec wrote:
           | If you're working with triangle meshes, I think the approach
           | of extruding triangles into prisms looks promising. It allows
           | drawing arbitrarily long trails with OK performance. Sadly it
           | did not match the constraints of cramming the whole scene
           | into a shader, hence the weirder volume-based techniques.
           | 
           | Fast analytical motion blur with transparency: https://www.sc
           | iencedirect.com/science/article/pii/S009784932...
        
       | spondylosaurus wrote:
       | Wow, the live comparison demo is wild. When I was following along
       | up to that point, I felt like I "got it" on an intellectual
       | level, but it wasn't until I could toggle the motion blur on/off
       | that the difference was really apparent!
        
       | Animats wrote:
       | This gets past trying to simulate a film camera and tries to
       | simulate the human visual system. That's useful. Another step
       | towards reality and away from simulating obsolete technology.
       | Shutter-type motion blur may go the way of sepia-toned prints, 16
       | FPS black and white movies, and elliptical wheels from mechanical
       | shutters.
        
         | subb wrote:
         | Let me know when we have the tech to render the real sun's
         | power on your TV screen.
         | 
         | Reproduction of reality is not the goal, because it's
         | unachievable.
        
           | Animats wrote:
           | With enough money...
           | 
           | Larry Ellison used to have a TV projector in his house, with
           | the light output for a drive-in movie theater but aimed at a
           | small screen, so he could watch movies in broad daylight.
           | That was before everyone got bright screens.
        
             | account42 wrote:
             | I don't see how the result would avoid being either
             | uncomfortably bright or still having shitty contrast due to
             | the unavoidably high "black" levels from the ambient light.
        
           | account42 wrote:
           | Well HDR screens are becoming more common so we're moving in
           | that direction.
           | 
           | Also, simulating realistic processes is not incompatible with
           | tonemapping the result to be able to display it on limited
           | screens.
        
           | kuschku wrote:
           | Modern HDR screens already cause the same subconscious
           | perception as the sun would. You flinch, your eyes adjust,
           | you even move a little bit back and feel a kind of warmth
           | that isn't actually there.
        
             | subb wrote:
             | Anyone permanently burned their retina yet?
        
           | astrodust wrote:
           | Watching _Sunshine_ (2007) will be a real experience when
           | that happens.
        
       | geor9e wrote:
       | The tradeoff of rendering or filming motion blur at finite
       | refresh rates is the audience can move their eyes to follow an
       | object moving around the screen. In real life, that causes the
       | object to become sharp. So, you either need to track eye motion
       | and blur according to relative motion, or do no motion blur at an
        | infinite refresh rate. Neither is practical with today's
       | technology, so it's always going to look wrong. A good director
       | or game designer will choose shutter rate or render blur
       | according to how they expect the audience eyes to be moving.
        
         | chaboud wrote:
         | However, the audience is used to shutter angle as part of the
         | visual vernacular (e.g., narrow shutter angle for hyper edgy
         | rap videos, long shutter angle for dreamy retro vibes). If
         | rendered content can't speak the same visual language, a tool
         | is missing, regardless of framerate (up to a point... At 400Hz,
         | I'd be impressed by someone really seeing the difference).
         | 
         | What's interesting about rendered content is that it can extend
         | that (e.g., a shutter angle beyond the duration of a frame),
         | playing with something we thought we had a handle on.
        
           | drc500free wrote:
           | I first learned about shutter angle when reading about its
           | use in Saving Private Ryan; the narrow 45 and 90 degree
           | shutters made the opening scenes much more visceral with so
           | much flying through the air.
           | 
           | https://cinemashock.org/2012/07/30/45-degree-shutter-in-
           | savi...
        
             | arrowsmith wrote:
             | That was a really interesting article, thanks!
        
           | CarVac wrote:
           | It's also used for subject isolation in much the same way as
           | depth of field.
           | 
           | Only objects not moving relative to the frame can be seen
           | sharply.
        
           | orlp wrote:
           | > At 400Hz, I'd be impressed by someone really seeing the
           | difference.
           | 
           | Trivial. Drag your white cursor quickly across the screen
           | against a black background. You will see clear gaps between
           | the individual cursor afterimages on your retina.
           | 
            | Double the FPS and you halve the gaps. On a 240Hz monitor I
            | can see the gaps so clearly that they would still be easily
            | visible if halved. Ergo 400Hz would still easily be
            | distinguishable from continuous motion.
           | 
           | To put numbers on this, consider a vertical line 1 pixel wide
           | moving across a 4K screen in 1 second (that's not even that
           | fast). At 480Hz that's a shift of 8 pixels per frame. So
           | you'd need at least 8x the framerate for motion of this line
           | to be continuous.
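The arithmetic in the comment above can be sketched directly (function names are illustrative; the 3840-pixel width of a 4K screen is assumed, matching the 8-pixels-per-frame figure):

```python
def per_frame_shift_px(screen_width_px: int, traverse_seconds: float,
                       refresh_hz: float) -> float:
    """Pixels an object jumps between consecutive frames when it
    crosses the full screen width in traverse_seconds."""
    speed_px_per_s = screen_width_px / traverse_seconds
    return speed_px_per_s / refresh_hz

# A 1-pixel-wide vertical line crossing a 4K (3840 px) screen in 1 second:
shift_480 = per_frame_shift_px(3840, 1.0, 480)  # 8 px jump per frame
# For the jump to shrink to the line's own 1 px width (i.e. for the
# motion to look continuous), the refresh rate would need to reach:
min_hz = 3840 / 1.0  # 3840 Hz, i.e. 8x the 480 Hz refresh rate
```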
        
             | zozbot234 wrote:
             | > Trivial. Drag your white cursor quickly across the screen
             | against a black background. You will see clear gaps between
             | the individual cursor afterimages on your retina.
             | 
             | That's the outcome of aliasing, not of the FPS limitation
             | itself. You could analytically supersample the motion in
             | the time domain and then blur it just enough to remove the
             | aliasing, and the distinct images would then disappear.
             | Motion blur approximates the same result.
        
               | blauditore wrote:
               | That's not true at all. When moving multiple pixels per
               | frame, there will always be visible jumps. Aliasing only
               | becomes relevant at the pixel level.
        
               | zozbot234 wrote:
               | "Pixels" and "frames" are the exact same thing
               | analytically, only in the spatial vs. time domain. This
               | is very much an instance of aliasing, which is why blur
               | happens to correct it.
        
               | 01HNNWZ0MV43FF wrote:
               | Yeah but if your eyes were tracking it smoothly, it would
               | not appear blurry. You could try to approximate _that_
               | with eye tracking but achieving such low latency might be
               | even harder than cranking up the FPS
        
               | zozbot234 wrote:
               | If your eyes were tracking the motion smoothly, the
               | moving object would not appear blurry, but the static
               | background absolutely would. So you'd need to apply anti-
               | aliasing to the _background_ while keeping the object
                | images sharp. (Similar to how a line segment that's
               | aligned with the pixel grid will still look sharp after
               | applying line anti-aliasing. Motion tracking here amounts
               | to a skewing of the pixel grid.)
        
             | mrob wrote:
             | Another example is multiplexed LED 7-segment displays.
             | These distort in different ways depending on how you move
             | your eyes. Even 400Hz is too low to display a typical one
             | realistically. If you use motion blur you'll lose the eye-
             | movement-dependent distortion of the real thing.
        
             | Ruarl wrote:
             | Some experiments by a colleague a few years ago indicated
             | 700Hz might be the limit. Will take some verification,
             | obvs.
        
               | mathgradthrow wrote:
                | Try watching ping pong under a 1000Hz strobe.
        
               | corysama wrote:
               | I read a study that put people in a dark room with a
               | strobing LED and told them to dart their eyes left and
               | right. 1000Hz was the limit before everyone stopped
               | seeing glowing dashes and saw a solid line streak
               | instead.
        
               | Modified3019 wrote:
                | I'm reminded of an old Microsoft input research video,
                | where a 1ms latency response is what's needed for the
                | most lifelike touch drawing on a screen:
               | https://m.youtube.com/watch?v=vOvQCPLkPt4
        
           | harimau777 wrote:
           | Can you recommend any resources on how shutter angle is used
           | to communicate different messages in film? The Wikipedia
           | article only gives some very basic examples (short shutter
           | angle when you want to capture particles in the air or for
           | action scenes).
        
         | pierrec wrote:
         | Exactly, while doing background research for this I read a
         | really neat paper that describes what you're saying, and offers
         | a solution based on trying to predict the viewer's eye motion:
         | Temporal Video Filtering and Exposure Control for Perceptual
         | Motion Blur (Stengel et al., 2015)
        
         | Veedrac wrote:
         | You can't properly make a moving image sharp at a finite
         | refresh rate (though you can approach it for sufficiently high
         | ones), because the object can only move in a temporally
         | juddered path, which doesn't match the eye's movement.
        
           | planede wrote:
           | You can with a strobed backlight.
        
         | nyanpasu64 wrote:
         | Moving objects can become even sharper if each frame is
         | displayed in a fixed location for a shorter period of time
         | (reduced MPRT), preventing eye-tracking motion from smearing
         | each displayed frame. This can be achieved through CRT/OLED
         | scanout (often rolling) or LCD backlight strobing (usually
         | full-frame by necessity). Unfortunately displaying each frame
         | for a short time is unbearably flickery at 24 Hz (so movie
         | projectors would show film frames 2-3 times), just barely
         | tolerable at 50 (to the point some European CRT televisions
         | frame-doubled 50 Hz video to 100 Hz, causing objects to appear
         | doubled in motion), and ideally needs 70-75 Hz or above for
         | optimally fluid motion and minimum eyestrain (which can't show
         | 60 FPS recorded video without judder and/or tearing).
        
           | s4i wrote:
           | > to the point some European CRT televisions frame-doubled 50
           | Hz video to 100 Hz, causing objects to appear doubled in
           | motion
           | 
           | Hmm, this is not accurate (or I don't understand what you
           | mean). 100Hz CRT TVs available in the 90s/00s did not
           | interpolate frames to get smoother motion - they only existed
           | to reduce flicker. I think such TVs also existed in NTSC
           | markets (120Hz)?
           | 
           | Anyway, ever since the late 00s, pretty much all the TVs you
           | can buy from a store do come with an interpolation algorithm
           | to artificially display a higher frame rate image (e.g.
           | 100Hz/120Hz) from a lower frame rate source (e.g.
           | 23.976/24/29.97/30/50/59.94/60 fps) - which (personal
           | opinion) looks _terrible_ (and can be turned off from the
           | settings - but the default is always on). This is an
           | interesting side tangent when it comes to motion blur,
           | because the blur is prebaked in the input signal and cannot
           | be easily removed. Thus, the end result always has an
           | artificial look.
           | 
           | For instance, if the source material is shot in 24fps with
           | the typical 180 degree shutter angle, each frame spans 41.6ms
           | of which the shutter was open for 20.8ms. Then your TV
            | interpolates that to 96Hz or whatever. However, the
            | individual output frames still look (or, with the added
            | artifacts etc., _mostly_ look) like the shutter was open for
            | 20.8ms per frame. However, each frame now spans only 10.4ms,
            | which is shorter than the shutter-open time!
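The timing in the comment above follows from the standard shutter-angle relationship (a 360-degree shutter is open for the entire frame; the helper name is illustrative):

```python
def frame_and_exposure_ms(fps: float, shutter_angle_deg: float):
    """Frame duration and shutter-open time for a given frame rate
    and shutter angle (360 degrees = open for the whole frame)."""
    frame_ms = 1000.0 / fps
    exposure_ms = frame_ms * (shutter_angle_deg / 360.0)
    return frame_ms, exposure_ms

# 24 fps with the typical 180-degree shutter:
frame, exposure = frame_and_exposure_ms(24, 180)  # ~41.7 ms, ~20.8 ms
# Interpolated up to 96 Hz, each output frame spans:
interp_frame = 1000.0 / 96  # ~10.4 ms
# The ~20.8 ms of baked-in blur now exceeds the frame duration itself.
```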
        
             | cubefox wrote:
             | As far as I understand, you get doubling not because of any
             | interpolation but because you track objects with your eye.
              | That means the tracked object is at rest in your vision.
             | Which means the screen actually moves relative to your
             | vision. Which means the object smears, where the pixel
             | length of the smear is equal to the pixel distance the
             | tracked object moves in the time the frame is visible on
             | screen. On CRT there is less smear, because the frames are
             | only flashed for a short time on the otherwise dark screen.
             | But if the FPS are not equal to the display Hz, e.g. a
             | half, the tracked object appears on multiple, e.g. two,
             | locations on the screen. Because you show the frame twice,
             | and if you track an object, the screen has moved (relative
             | to your vision) for a few pixels by the time you show the
             | second frame. This happens e.g. if you play a 30 FPS game
             | on a 60 Hz CRT, or when you watch 50 FPS content on a 100
             | Hz screen. Though the effect isn't usually as dramatic as
             | it may sound here. Doubling is also not as bad as smearing,
             | e.g. in the case of scrolling text.
        
           | cubefox wrote:
           | Double images are already present in NTSC/PAL due to
           | interlacing. So when doubling the Hz of PAL on a CRT, you get
           | a quadruple image. Though this probably still looks better
           | than non-interlaced content on LCD/OLED, which exhibits
           | smearing instead.
        
             | mrob wrote:
             | Double images are only present in converted film content,
             | which in the case of PAL is done by speeding it up from
             | 24fps to 25fps, and splitting each frame into two fields.
             | In the case of native PAL video content, each interlaced
             | field shows image data from a different time, so you don't
             | see double images.
        
               | cubefox wrote:
               | In case of PAL video we don't see two duplicate frames,
               | but we still see two different frames (fields) at the
               | same time. That's the "double image" I meant.
        
               | mrob wrote:
               | On an interlaced display, the two different fields are
               | not displayed at the same time. If you merge the two
               | fields into a single non-interlaced frame and display it
               | on a progressive scan display you might see a double
               | image, but that's not inherent to interlaced video.
        
         | cubefox wrote:
         | An alternative to infinite refresh rate is to show each frame
         | only a fraction of the normal frame time, i.e. to quickly
         | "flash" the individual frames and showing a black screen
          | otherwise. This reduces the maximum screen brightness, and it
         | requires a minimum frame rate to avoid flicker, but it reduces
         | the type of "eye tracking blur" (persistence blur) which isn't
         | present in real life. To be precise, to completely remove the
         | tracking blur you would need to flash each frame only for an
         | infinitesimal time period, which of course isn't realistic. But
         | VR headsets do use this flashing/strobing approach.
         | 
         | By the way, this is also the reason why CRT and Plasma screens
         | had much better motion clarity than LCD or OLED. The former
         | flash each frame for a short time, while the latter "sample and
         | hold" the frame for the entire frame time (e.g. 1/60th of a
          | second for 60 Hz). 60 FPS on a CRT probably looks more fluid
         | than 120 FPS on an OLED.
         | 
         | Another option for games is to indeed add a lot of frames by
         | using reprojection techniques. This can approximate the real
         | camera movements without needing to render a ton of expensive
         | frames in the engine. This also is already used in VR, just
         | currently not at overly high frame rates. This great article
         | goes into more detail:
         | 
         | https://blurbusters.com/frame-generation-essentials-interpol...
         | 
          | Something like 1000 FPS with reprojection is apparently quite
         | realistic, which should solve the problem of tracking blur
         | without reducing screen brightness.
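The "eye tracking blur" (persistence blur) discussed above is easy to quantify: on a sample-and-hold display, an eye-tracked object smears by roughly its on-screen speed times the time each frame is held. A minimal sketch with illustrative numbers:

```python
def persistence_blur_px(speed_px_per_s: float, hold_time_s: float) -> float:
    """Smear width, in pixels, of an eye-tracked object on a display
    that holds each frame statically for hold_time_s."""
    return speed_px_per_s * hold_time_s

speed = 1000.0  # object moving at 1000 px/s across the screen

full_hold_60 = persistence_blur_px(speed, 1 / 60)   # ~16.7 px of smear
strobed_2ms = persistence_blur_px(speed, 0.002)     # ~2 px with a 2 ms flash
hold_1000hz = persistence_blur_px(speed, 1 / 1000)  # ~1 px at 1000 FPS
```

This is why a short strobe at 60 Hz and sample-and-hold at 1000 FPS end up in the same ballpark for motion clarity, differing mainly in brightness and input latency, as the comment notes.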
        
           | account42 wrote:
            | > Something like 1000 FPS with reprojection is apparently
           | quite realistic, which should solve the problem of tracking
           | blur without reducing screen brightness.
           | 
           | Is a 1000 FPS screen more realistic than a screen capable of
           | the higher maximum brightness needed to compensate for black
           | frame insertion though? HDR screens are already a thing and
           | you could already gain persistence improvements there for LDR
           | scenes without needing any new hardware by always driving
            | pixels at max brightness but only for a reduced time depending
           | on the target luminance.
           | 
            | Or just reduce the ambient light enough - I even run my LDR
            | monitor at 10% brightness.
        
             | cubefox wrote:
             | As far as I understand, the only thing that matters for
             | reducing unrealistic "tracking blur" is indeed how short a
             | frame is displayed on screen. Which could be achieved by
             | strobing / black frame insertion or by increasing frame
             | rate. Or even both, as in some VR applications. So the
             | effect should be the same, except for the brightness thing,
             | which may not be a problem if there is a lot of screen
             | brightness headroom anyway. For HDR LCD TVs the LED
             | backlights get quite bright actually. OLED not so much.
             | 
             | One advantage of higher frame rates (as opposed to
             | strobing) would be quicker input responses to, e.g., moving
             | the camera. That's not overly important on a normal screen
             | but quite significant on a VR headset where we expect head
             | movements to be represented very quickly.
        
             | mrob wrote:
             | Low persistence only gives perfect motion when your eye
             | motion perfectly matches the object motion[0]. For all
             | other motion, low persistence causes false images to be
             | generated on the retina, which can be seen as "phantom
             | array effect"[1].
             | 
             | I think interpolation up to several kilohertz is the best
              | solution, preferably starting from a moderately high frame
             | rate (e.g. 200fps) to minimize the latency penalty and
             | artifacts.
             | 
             | [0] https://en.wikipedia.org/wiki/Smooth_pursuit
             | 
             | [1] https://en.wikipedia.org/wiki/Flicker_fusion_threshold#
             | Visua...
        
               | cubefox wrote:
               | I think the second effect is quite small compared to the
               | first because CRTs are generally good at the first but
               | don't suffer much from the second, as far as I'm aware.
        
               | mrob wrote:
               | CRTs suffer strongly from phantom array effect, but
               | individual sensitivity to this effect varies. The people
               | who notice it are the same ones who complain about PWM
               | lighting.
        
           | terribleperson wrote:
           | Many gaming monitors offer a 'blur reduction' feature using a
           | strobing backlight.
        
       | nuclearsugar wrote:
       | Related: Gotta love the ReelSmart Motion Blur plugin for adding
       | motion blur based on the automatic tracking of every pixel from
       | one frame to the next. It's not always perfect but still does an
       | amazing job. Sometimes I get experimental and crank the
       | calculated motion blur way past the default value and it looks
       | surreal!
        
         | CyberDildonics wrote:
         | That's not related at all. That is optical flow, which is a 2d
         | filter technique that has been around for 28 years. It can't do
         | this and has nothing to do with anything here except for the
         | term 'motion blur'.
        
       | modeless wrote:
       | The torus demo is really neat. I think it's interesting how high
       | frame rate changes the perception of blur. I'm using a 240 Hz
       | display and on figure 5 I can't see the discrete circles until
       | around 12 rad/s. And even at 40 rad/s I can't see any difference
       | between the traditional and sine shutter options (in motion).
       | 
       | I highly recommend 240 Hz if only for the smoothness and low
       | latency of mouse movements. A 60 Hz mouse pointer is really,
       | really hard to go back to.
        
         | iforgotpassword wrote:
          | I've used both, a 4k 24" display ("once you've used retina you
          | can't go back") and a 144Hz display, and I can't be arsed to
         | upgrade my main setup which is three old 24" 1920x1200 screens
         | at 60Hz. Some people really don't care. Heck I don't get those
         | discussions about input latency and terminal emulator holy
         | wars, I can't tell the difference between working on a
         | framebuffer console and working via ssh with an additional
         | 50ms.
         | 
         | I can tell the difference between a 30 and 60fps game, but only
         | if I pay attention to it. As long as it's steady and not
         | bouncing between the two I just don't care.
        
         | Etherlord87 wrote:
          | I agree that the 144 Hz cursor (in my case) is nicely smooth,
          | but I have no problem going back to a 60 FPS cursor.
         | 
         | I wonder, though, if maybe an interface with smoother
         | transitions is more soothing and maybe having a higher refresh
         | rate does affect one's mental state over time...
        
       | bjano wrote:
       | It seems to me that the demonstration is calculated in sRGB
       | space, so with non-linear brightness values and I suspect that
       | most of the unnaturalness of the smear is due to that. To
       | simulate the physics this would need to be done with linear
       | brightness values and only at the end converted to sRGB.
       | 
       | (unless some non-linear effects in human visual perception cancel
       | all that out, but it should at least be mentioned if so)
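The point about blending in the wrong space can be checked numerically: averaging exposures in gamma-encoded sRGB underestimates the physical result, while the correct pipeline decodes to linear light, averages, and re-encodes. The transfer functions below are the standard sRGB piecewise curves:

```python
def srgb_to_linear(c: float) -> float:
    """Decode an sRGB value in [0, 1] to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Encode linear light back to sRGB."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Motion blur is an average of samples over the exposure window.
# Blend a pixel that spends half the exposure black, half white:
naive = (0.0 + 1.0) / 2  # 0.5 if averaged directly in sRGB
correct = linear_to_srgb((srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2)
# correct is ~0.735: the physically right smear is noticeably brighter,
# which is the "unnaturalness" the comment above attributes to the demo.
```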
        
         | WithinReason wrote:
         | Yes I noticed that the gamma is all wrong, it actually defeats
         | the purpose of the article since you don't get smooth perceived
         | motion blur with the figures. This could have been so much
         | better.
        
         | Solvency wrote:
         | It's wild to me the author would overlook such a crucial thing
         | (ignoring linear space). But then again, even Adobe barely
         | supports linear and it's 2024.
        
         | pierrec wrote:
         | Thanks for pointing that out. I've done color space conversion
         | in other graphics applications but clearly haven't learned my
         | lesson. I'll double check and update the interactive figures
         | (and text) if it makes sense. The main "torusphere" shader
         | should be fine because the motion blur is non-realistic and
         | hand tuned, but the first couple of interactive figures are a
         | direct application of theory so what you're saying applies to
         | those. Overall I don't think it invalidates the main ideas in
         | the text though.
        
       | zubairq wrote:
       | Wow, watching motion blur all the way down really made me think
       | about string theory, atoms, and how the universe is made for some
       | reason!
        
         | cjdell wrote:
         | It's an interesting metaphor. Electron shells are sometimes
         | described as a cloud of probabilistic density, as in the shade
         | is an indication of the probability of finding an electron at
         | that point if one were to freeze time. Obviously it gets
         | weirder the deeper you get.
        
       | Anotheroneagain wrote:
        | It was noted early in the development of cinema that the
        | required framerate could probably be much lower if (IIRC) the
        | shutter could somehow be replaced by blending neighboring
        | frames together, i.e. if the exposure would gradually shift
        | from one frame to the next.
        
       | Rexxar wrote:
        | Shouldn't the torus and the latter sphere be partially
        | transparent if they are built from motion blur? They seem to
        | become opaque again at some point, and it feels strange to me.
        
         | johanvts wrote:
         | Is light discrete or continuous?
        
           | account42 wrote:
           | Yes.
        
         | xeyownt wrote:
         | I guess it cannot become transparent because you cannot see
         | through an infinitely fast moving / bouncing object, because
         | necessarily the object would intercept any ray of light
         | crossing its course.
         | 
         | However the light emitted by the moving object should be lower
         | than the static object. So as the distance increases, the
         | object should become darker and darker.
        
           | Rexxar wrote:
           | Here is a real life similar example:
           | https://mitxela.com/projects/candle. It's clearly transparent
           | (except in middle).
        
             | vultour wrote:
             | Yes but that one is not moving infinitely fast. If it kept
             | accelerating it would eventually become opaque.
        
               | willis936 wrote:
               | If the ball takes up 45 degrees in the torus then it
               | would be 7/8 transparent when moving infinitely fast. It
               | would be another 1/8 transparent when spinning the torus
               | into a sphere.
               | 
               | "Infinitely fast" being a stand-in for "sufficiently
               | fast" and gated by the speed of light, of course.
        
               | CyberDildonics wrote:
               | There is no such thing as "moving infinitely fast",
               | that's something the other person made up. What if light
               | didn't fly straight and made loops and flowed like water
               | and clustered up and stuck together?
               | 
               | If you're going to make up nonsense, you're not talking
               | about physical reality you're talking about cartoons or
               | magic and fantasy.
               | 
               | When something moves fast in photography it has motion
               | blur and that spreads its coverage over a larger area. In
               | this demo the motion blurred 'torus' and 'sphere' both
               | get made artificially more opaque to make the animation
               | work. You can see it happen.
        
         | drewtato wrote:
         | It's definitely blending between sphere and torus, and then
         | torus and sphere. Otherwise it would keep getting fainter as
         | time went on.
        
         | pierrec wrote:
         | Yes, the torusphere concept is physically impossible. Because
         | the whole thing is a loop, the object doesn't even have any
         | theoretical substance, it's all made of (artificially
         | thickened) motion blur. Hence the title, motion blur all the
         | way down.
        
       | dinobones wrote:
       | Alternative title: Visual proof of the Rutherford model
        
       | virtualritz wrote:
       | A good overview is in [1].
       | 
       | Curiously, before the classic paper on modeling shutter
       | efficiency [2] came out in 2005, all renderers used in VFX
       | production used box shutters. I.e. the shutter opens instantly,
       | stays open for the specified duration and then closes instantly.
       | 
       | When you watch movies like "Jurassic Park" or "The Mask" and see
       | scenes with extreme motion blur, that's PhotoRealistic RenderMan
       | with a box shutter.
       | 
       | The first in-the-wild 1:1 implementation of the parameterization
       | in [2] was done in [3] the same year the paper came out (and it
       | hasn't changed to this day). It was first used on "Charlotte's
       | Web" (only the spider character, done by Rising Sun Pictures,
       | has it, as they used [3] for it).
       | 
       | Pixar added it a few years later too and went a tad overboard
       | with theirs in [4]. Most offline renderers nowadays have this
       | feature and call it a "shutter curve".
       | 
       | [1] O. Navarro et al.: Motion Blur Rendering: State of the Art
       | (https://citeseerx.ist.psu.edu/doc_view/pid/fc23fb525cafa8fe6...)
       | 
       | [2] Stephenson, Ian: Improving Motion Blur: Shutter Efficiency
       | and Temporal Sampling
       | (https://staffprofiles.bournemouth.ac.uk/display/journal-arti...)
       | 
       | [3] https://www.3delight.com/, see specifically
       | https://nsi.readthedocs.io/en/latest/nodes.html
       | 
       | [4]
       | https://renderman.pixar.com/resources/RenderMan_20/cameramod...
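The box-shutter vs. shutter-curve distinction above can be sketched in a few lines. This is an illustrative sketch only, not the parameterization from [2]; the function names and the triangular curve are my own assumptions:

```python
def render_motion_blur(sample_scene, shutter_open, shutter_close,
                       shutter_curve=None, n_samples=64):
    """Average scene samples over the shutter interval.

    sample_scene(t) returns a scalar pixel value at time t. With no
    shutter_curve this is a "box shutter": the shutter opens instantly,
    every instant contributes equally, and it closes instantly. Passing
    a shutter_curve (a weight over normalized shutter time in [0, 1])
    models a shutter that opens and closes gradually.
    """
    total = 0.0
    weight_sum = 0.0
    for i in range(n_samples):
        u = (i + 0.5) / n_samples  # stratified position in [0, 1)
        t = shutter_open + u * (shutter_close - shutter_open)
        w = 1.0 if shutter_curve is None else shutter_curve(u)
        total += w * sample_scene(t)
        weight_sum += w
    return total / weight_sum

# A simple triangular curve: the shutter is fully open mid-exposure.
tri = lambda u: 1.0 - abs(2.0 * u - 1.0)
```

With a box shutter, the leading and trailing edges of a blur streak are equally hard; a curve like `tri` weights the middle of the exposure more, softening the ends of the streak.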
        
       | Etherlord87 wrote:
       | At first I was wondering why the shading of the ball is off -
       | turns out it's because it's not a ball but a rotating "torus"
       | (orbiting ball)... The ball probably would look better if there
       | was a longer period between the last and first phases, but that
       | could reduce the power of the surprise.
        
       | yeknoda wrote:
       | Motion is an invisibility cloak.
        
       | causi wrote:
       | Motion blur makes sense for films. I have no idea why anyone
       | wants it in a game. The limitations of display technology and of
       | the human eye introduce all the blur I could ever want.
        
         | JKCalhoun wrote:
         | I added motion blur in a slot-machine like game I wrote.
         | 
         | http://www.softdorothy.com/Blog/Entries/2011/10/26_Making_Th...
        
           | jabroni_salad wrote:
           | You only have it on a moving element so that's okay.
           | 
            | What gamers really don't like is when you move the camera
           | slightly and the entire scene turns into a deep fried oil
           | painting.
        
         | virtualritz wrote:
         | > Motion blur makes sense for films. I have no idea why anyone
         | wants it in a game.
         | 
         | TL;DR: motion blur always makes sense atm, and no less so for
         | games.
         | 
         | For games the difference it makes depends on the FPS and the
         | distance an object moves within the FOV.
         | 
         | Estimates say humans can see max. 60FPS. Even if your game runs
         | at 120FPS and something moves across the entire FOV -- that is
         | two discrete samples of that object at most. You will get
         | strobing.
         | 
         | Only if you render the geometry with motion blur will this look
         | natural.
         | 
         | The alternative is probably to render at multiple 10k FPS, but
         | that is wasteful.
         | 
         | If your biological limit is around 60FPS, it should be cheaper
         | to generate motion-blurred frames at this rate (where more
         | samples are used in areas with more motion blur) than rendering
         | every pixel of a frame at multiple 10k FPS.
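The cost argument above can be illustrated with a toy 1D rasterizer (everything here, including the function name and the 1D "framebuffer", is a made-up sketch, not any engine's API): instead of rendering thousands of instantaneous frames, one motion-blurred frame accumulates many sub-frame samples across its exposure window.

```python
def motion_blurred_frame(position_at, width, frame_start, exposure,
                         subsamples=32):
    """Accumulate a moving 1-pixel object into a 1D 'framebuffer'.

    Rather than rendering one instant (which strobes at high object
    speeds), sample the object's position at many times inside the
    exposure window and average. A fast object becomes a faint streak,
    which is what the eye expects.
    """
    frame = [0.0] * width
    for i in range(subsamples):
        t = frame_start + (i + 0.5) / subsamples * exposure
        x = int(position_at(t)) % width
        frame[x] += 1.0 / subsamples
    return frame
```

A static object stays a single bright pixel; an object crossing the whole frame during one exposure leaves its energy spread thinly across every pixel it touched, at a fraction of the cost of rendering each instant as its own full frame.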
        
           | causi wrote:
           | Human eyes do not work based on discrete frames at all. The
           | entire "sampling" paradigm is completely wrong.
           | 
           |  _Estimates say humans can see max. 60FPS._
           | 
           | This is a ridiculous statement. You know that's wrong if
           | you've ever switched your phone between 60 and 120hz. Even
           | without personal experience, most HN readers would be aware
           | that, for example, when the Oculus Rift folks were developing
           | their headset they found 90hz to be the minimum to avoid
           | visual disturbances and nausea in an immersive environment.
        
             | virtualritz wrote:
              | Ofc the human visual system does not work in discrete
              | frames.
             | 
             | However, there is a threshold on how much information your
             | brain can process and tests indicate this maxes out at the
             | temporal detail you can fit into 60 'discrete' frames per
             | sec. For many people it is actually much less.
             | 
             | That is: in general.
             | 
              | Specifically, this threshold differs based on certain
              | variables, the most important one being available light
              | (number of photons per unit of time).
             | 
             | I.e. the less light, the more motion blur there is because
             | there is more temporal integration happening in your
             | brain's visual system. Estimates say your brain goes down
             | to a temporal information density of less than 10 FPS in
             | very low light circumstances [1].
             | 
             | Your brain does perceive motion blurred frames differently.
             | How do I know? I worked in VFX on many blockbuster movies.
             | We do preview stuff using realtime graphics (aka what game
              | engines do) w/o motion blur at the same frame rates as the
             | final frames (which are with motion blur, ofc).
             | 
             | Even on projects where we used 60 fps there is visible
             | strobing w/o motion blur. And as I said: even when you
             | preview stuff at 120 fps or more there is visible strobing
             | with geometry that covers big distances in your FOV.
             | 
              | Motion from game engines w/o motion blur looks like game
              | engine motion not least because of this. There is also a
             | night and day difference between post-process motion blur,
             | based on 'smearing' the image based on a motion vector pass
             | and true 3D motion blur.
             | 
              | "The Hobbit" trilogy was done at 60FPS. But it still had
             | motion blur, for obvious reasons.
             | 
              | [1] https://www.sciencedirect.com/science/article/pii/S004269890...
             | 
             | EDIT: typos/grammar
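The "smearing" post-process mentioned above, which blurs each pixel along a motion-vector pass, can be sketched on a 1D image (the names and tap count here are my own assumptions; real implementations work on 2D velocity buffers in a shader):

```python
def velocity_smear(image, velocity, n_taps=8):
    """Post-process motion blur: average taps along each pixel's
    screen-space motion vector. This only smears what was already
    rendered, which is one reason it looks different from true 3D
    motion blur, where the scene itself is sampled at many times
    within the shutter interval."""
    w = len(image)
    out = []
    for x in range(w):
        v = velocity[x]  # signed pixel displacement over the exposure
        acc = 0.0
        for i in range(n_taps):
            # taps span -v/2 .. +v/2 around the pixel, clamped at edges
            offset = v * ((i + 0.5) / n_taps - 0.5)
            acc += image[max(0, min(w - 1, int(round(x + offset))))]
        out.append(acc / n_taps)
    return out
```

The key limitation shows up at occlusion boundaries: a smeared foreground object can only reuse pixels that were visible at the single rendered instant, whereas true 3D motion blur reveals what was behind it.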
        
               | causi wrote:
               | The Hobbit was recorded at 48fps, not 60.
               | 
               |  _And as I said: even when you preview stuff at 120 fps
               | or more there is visible strobing with geometry that
               | covers big distances in your FOV._
               | 
                | Exactly. The game has _no idea_ what is and isn't moving
               | in my FoV because it doesn't know what I'm looking at,
               | only what the camera is looking at. Subjectively, in my
               | experience, games do a terrible job at predicting what
               | I'm looking at and I'd prefer they stop messing up my
               | experience by trying. Strobing is better than smeary
               | mess.
        
             | virtualritz wrote:
             | See also https://news.ycombinator.com/item?id=39589268 for
             | a test you can do yourself, if you have a 120fps screen.
             | 
             | If that screen had e.g. 12,000fps, you would see the mouse
              | cursor as a motion-blurred streak, not individual images of
             | the cursor next to each other.
        
           | kroltan wrote:
           | For (the vast majority of) games the player can be looking at
           | anything on screen, not just a specific point.
           | 
           | In real life, eyes track objects in order to minimise blur.
           | 
           | But realtime motion-blur implementations just blur everything
            | that moves _relative to the camera_, which is not correct,
           | and makes the human looking at the screen lose track of the
           | objects they were looking at.
           | 
           | If we had pervasive perfect low latency eye tracking then a
           | game could pull off something that actually looks plausible,
           | but that is not the case, so the only games that can get away
           | with it are the ones that are largely a movie and can take
           | advantage of the motion blur cinematographically.
        
         | ranger207 wrote:
         | Yes, motion blur gets turned off immediately on every game that
         | I have. I consider games that don't allow you to turn it off
         | overproduced and likely to have other directorial choices I
         | won't agree with. I would like to see the game environment, not
         | a smeared mess of random colors
        
       | dukeofdoom wrote:
       | I want to try this in my pygame game. Is there a Python version
       | of this?
        
       | de6u99er wrote:
       | I love these interactive tutorials. Since the method and its
       | parameters can be changed, it would be interesting to see some
       | indication of the performance impact.
        
       | jvanderbot wrote:
       | Despite the promise of this making visual media feel more real, I
       | can't help but feel like, for games, this makes the video games
       | look like a cheap approximation of over-edited movies. For things
       | moving real fast, or real close, or especially with non-ego
       | motion, this all makes sense. But it is over-used for things like
       | "the character turns quickly".
       | 
       | When I snap my head (or eyes) around, I don't see a blurry image,
       | I see the new image, and my brain drops the intermediate data.
       | You can test this by looking at yourself in a mirror, focusing on
       | one eye, then another. Do you see your eyes or face or anything
       | blur?
       | 
       | Adding blur just delays the presentation of the new view when
       | moving your POV in games. It's distracting and unrealistic.
        
         | throwaway8481 wrote:
         | I feel like motion blur should be used like mouse acceleration.
         | Like you said: If I whip my head around quickly that image
         | should present clearly and immediately. However, if I'm in a
         | car or flying through the air (in a game) this may distort the
         | image as I'm glancing. This is the kind of drag-as-you-
         | accelerate blur I'm expecting, and it can really make a fast-
         | paced moment look that much more exciting. Motion blur is
         | poorly executed everywhere.
        
         | sxp wrote:
         | > You can test this by looking at yourself in a mirror,
         | focusing on one eye, then another. Do you see your eyes or face
         | or anything blur?
         | 
         | This isn't the best test since your brain compensates for
         | saccades by censoring input for a while. And it has built-in
         | motion stabilization which allows you to read a sign while
         | walking down the street.
         | 
         | A better test would be to look at your finger or hand while
         | waving it very quickly. You'll see motion blur as your finger
         | moves. In certain cases, you can see individual "frames" as an
         | afterimage. E.g, if you look at a modern car tail light, the
         | LED isn't on consistently but uses PWM to blink really quickly.
         | So if you move your eyes while looking at a tail light at
         | night, you'll see a series of dots rather than a blurry image.
         | Once you learn this trick, you can distinguish between analog
         | lights and PWMs by noticing if they are blurry or discrete.
        
           | jvanderbot wrote:
           | I'm more referring to FPS obsession with blurring things when
           | _the character_ moves, for that, turn your head quickly: Do
           | you see a blurry room? I do not. Because your eyes must also
           | move, and as you say, the brain censors because a  "turn and
           | look" includes an eye movement (I hypothesize).
           | 
            | For fast-moving things _outside of ego motion_, I can see a
           | reason to use blur (e.g., your hand waving).
        
             | SamBam wrote:
             | That's a good point, but how would turning your head in a
             | first-person perspective ever look good in a game? As you
             | say, when we turn our own heads our eyes naturally saccade.
             | But I'm not sure having this done unnaturally -- e.g. by
             | having a couple static intermediary images -- would look
             | right at all. But smooth isn't right either.
        
               | jvanderbot wrote:
               | It would rely on the eye's natural saccade. My brain
               | drops the intermediate frames already and I mostly just
               | see the new perspective. _just like real life_. Adding
               | blur makes the motion _stand out_.
        
               | SamBam wrote:
               | Except that your natural saccade is also wired to your
               | motor neurons in your head and neck. For example, there's
               | a strong connection between the neurons that fire to turn
                | your head 10° to the right, and the neurons that fire to
                | turn your eyes 10° to the left, which allows your eyes to
               | fixate on something even as your head moves.
               | 
               | Indeed, it's almost impossible to turn your head slowly
               | without your eyes fixing on points in your field of view,
               | and then saccading to the next point. Try it. This is a
               | hard-wired response. (You can perform some tricks to do
               | it, like completely blurring your eyes, but that's not
               | really the point.)
               | 
               | Meanwhile, if we watch a video with the camera turning,
               | it's easy to keep our eyes staring straight ahead.
               | 
               | This shows that there is going to be a difference between
               | a head naturally turning and watching a video of a head
               | turning.
               | 
               | I think you're describing saccades as just being the
               | "dropping out" of frames, and not including the
               | extremely-relevant motion that our eyes do during a
               | saccade.
        
               | jvanderbot wrote:
               | I appreciate the attention to detail in this thread.
               | 
               | I think the only thing I can say here is - I dont see
               | anything weird when I turn my head, and when FPS games
               | added blur to "character moves his own body", it felt
                | much less natural, not more. So, to answer the question
                | a few posts up, "how would we do it?", I'd just say "The
                | way we always have, before we got enough GPU to make
                | everything look like a movie."
        
           | robertsdionne wrote:
           | The point is that ego-motion-blur is dumb. (It can also make
           | players motion-sick.)
           | 
           | You say it yourself: "your brain compensates for saccades by
           | censoring input for a while. And it has built-in motion
           | stabilization which allows you to read a sign while walking
           | down the street."
        
             | robertsdionne wrote:
             | I think the best implementation (without eye-tracking)
             | would not motion-blur the entire scene when the player
             | moves themselves (ego-motion-blur) but would blur moving
             | objects. However, if the player's view motion happens to
             | coincide with the motion of objects in the scene, for
             | instance if the player is tracking the object (maybe
             | another player so they can shoot it in an FPS) then the
             | object-motion-blur should be reduced according to the
             | degree of tracking. I'm sure some game engines already do
             | this.
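A minimal sketch of that idea (the names and the linear model are assumptions, not any engine's actual API): blur each object by its screen-space velocity _relative_ to the motion the camera is tracking, so a perfectly tracked object gets no blur at all.

```python
import math

def blur_length(object_velocity, tracking_velocity, exposure):
    """Screen-space blur length (in pixels) for one object.

    object_velocity and tracking_velocity are (x, y) screen-space
    velocities in pixels/second; tracking_velocity stands in for the
    motion the player's gaze is following (approximated here by the
    camera's motion). Perfect tracking cancels the blur entirely.
    """
    rel_x = object_velocity[0] - tracking_velocity[0]
    rel_y = object_velocity[1] - tracking_velocity[1]
    return math.hypot(rel_x, rel_y) * exposure
```

An object crossing the screen at 600 px/s under a static camera gets a 10-pixel streak at a 1/60 s exposure; the same object, perfectly tracked, gets none.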
        
           | devilbunny wrote:
           | This was one of the interesting side effects of a small
           | retinal hemorrhage. I had a fixed black spot (well, it
           | actually looked a little like a thread) and the fact that it
           | didn't move in my visual field made me realize just _how
           | many_ saccades you do.
        
         | Log_out_ wrote:
         | Actually, you do not see a new image; you see an amount of
         | detail, and your brain rewrites the history of what you saw so
         | there is continuity.
        
         | Wowfunhappy wrote:
         | In video games, I find motion blur helps to compensate for
         | lower framerates. It's particularly necessary at 30 fps, but
         | nice even at 60 fps.
         | 
         | On the rare occasion I've been able to play a video game at 120
         | fps--my projector is limited to 60hz, sadly--I've found that I
         | prefer to play without motion blur.
        
       ___________________________________________________________________
       (page generated 2024-03-04 23:01 UTC)