[HN Gopher] Ever: Exact Volumetric Ellipsoid Rendering for Real-...
       ___________________________________________________________________
        
       Ever: Exact Volumetric Ellipsoid Rendering for Real-Time View
       Synthesis
        
       Author : alphabetting
       Score  : 68 points
       Date   : 2024-10-03 08:07 UTC (14 hours ago)
        
 (HTM) web link (half-potato.gitlab.io)
 (TXT) w3m dump (half-potato.gitlab.io)
        
       | alphabetting wrote:
       | Impressive video:
       | https://twitter.com/alexandertmai/status/1841739387400552826
        
       | xeonmc wrote:
       | Neat, the concept is so elegantly obvious:
       | 
        | - your dataset is already an analytic representation of stretched
        | spheres. Just treat each one as a hard-edged blob of plasma with
        | uniform density throughout its volume.
       | 
        | - for each pixel, perform a ray intersection against each
        | spheroid; the thickness of material the ray passes through is
        | precisely how much light the spheroid contributes to your pixel
        | (obviously also multiplied by solid angle)
       | 
       | - since it's light, there is no occlusion, so just stack the
       | contribution from all the ellipsoids together and you're done.
       | 
        | - since you are rendering a well-defined shape with no
        | self-occlusion, there is no random popping in and out no matter
        | the viewing angle.
       | 
       | The computation is practically equivalent to rendering CSG
       | shapes, except even easier since you only ever add and not
       | occlude/subtract. It also scales with rat racing hardware
       | directly.
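
The per-pixel intersection step described above can be sketched directly: transform the ray into the ellipsoid's unit-sphere space and solve a quadratic for the entry/exit parameters. A minimal sketch (not the paper's code; the helper name is mine):

```python
import numpy as np

def ray_ellipsoid_chord(o, d, center, axes):
    """Entry/exit parameters of ray o + t*d (d unit-length) through an
    ellipsoid. `axes` is a 3x3 matrix mapping the unit sphere to the
    ellipsoid (columns = principal semi-axes). Returns (t_enter, t_exit)
    or None if the ray misses."""
    M = np.linalg.inv(axes)
    p = M @ (o - center)   # ray origin in unit-sphere space
    q = M @ d              # ray direction (no longer unit) in sphere space
    # Solve |p + t*q|^2 = 1, a quadratic in t.
    a = q @ q
    b = 2.0 * p @ q
    c = p @ p - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None        # ray misses the ellipsoid
    s = np.sqrt(disc)
    t0 = (-b - s) / (2.0 * a)
    t1 = (-b + s) / (2.0 * a)
    return t0, t1          # chord length along the ray is t1 - t0
```

With `d` normalized, `t1 - t0` is exactly the world-space thickness the comment refers to.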
        
         | lufasz wrote:
         | I'm actually in the market for some rat racing hardware.
         | Anything you'd recommend that won't be obsolete in six months?
        
         | amluto wrote:
         | I read this:
         | 
         | > since it's light, there is no occlusion, so just stack the
         | contribution from all the ellipsoids together and you're done.
         | 
         | and then I scratched my head at how you can possibly do a
         | credible rendering of any real scene without occlusion,
         | contemplated that the images in the paper absolutely had
         | occluded objects, and then read a bit more and figured it out:
         | 
         | Each ellipsoid has a "density," which is a single number
         | indicating the degree to which it absorbs light coming from
         | behind it. And this formulation allows the integral along a
         | path from infinity to the camera to be exactly evaluated. So
         | there _is_ occlusion! It just happens to work correctly even
         | when ellipsoids overlap.
         | 
          | [0] It's slightly more complicated, but not much. The raw
          | density scales a term in the integral, but that results in a
          | poorly behaved gradient, so the trained parameter is more or
          | less the opacity when looking through the center of the
          | ellipsoid along its shortest axis.
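
The exact-occlusion formulation this comment describes can be sketched as piecewise-constant integration: between the breakpoints where the set of active ellipsoids changes, the total density along the ray is constant, so each segment of the transmittance integral has a closed form and no quadrature is needed. A hedged sketch under those assumptions (function and variable names are mine; colors are scalars for brevity):

```python
import math

def composite_exact(segments):
    """Exactly integrate radiance along a ray through constant-density
    primitives. `segments` is a list of (t_in, t_out, sigma, color)
    tuples, one per primitive the ray hits."""
    # Breakpoints where the set of overlapping primitives changes.
    events = sorted({t for s in segments for t in (s[0], s[1])})
    radiance, transmittance = 0.0, 1.0
    for a, b in zip(events, events[1:]):
        mid = 0.5 * (a + b)
        active = [s for s in segments if s[0] <= mid < s[1]]
        sigma = sum(s[2] for s in active)   # constant on this interval
        if sigma <= 0.0:
            continue
        # Density-weighted average emission over the interval.
        c = sum(s[2] * s[3] for s in active) / sigma
        alpha = 1.0 - math.exp(-sigma * (b - a))  # exact, no step error
        radiance += transmittance * alpha * c
        transmittance *= 1.0 - alpha
    return radiance, transmittance
```

Overlapping ellipsoids are handled correctly because their densities simply add on the shared interval, which is the point the comment makes about occlusion falling out of the formulation.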
        
       | pavlov wrote:
       | It's great how the field of realtime volumetric rendering is
       | alive with so many options and new yet simple approaches are
       | being invented.
       | 
       | Reminds me of the mid-1990s before everyone basically agreed that
       | lots of triangles of any size are the right model representation.
        | There were NURBS and metaballs and Reyes micropolygons and who
       | knows what else... Even Nvidia's first hardware accelerator chip
       | used quads instead of triangles (just before Microsoft made
       | triangles the standard in Direct3D).
       | 
       | Looking forward to seeing where this settles in a couple of
       | years! The intersection of video and 3D is about to get super
       | interesting creatively.
        
         | WithinReason wrote:
         | maybe something like this:
         | 
         | https://arxiv.org/pdf/2405.16237
        
         | GistNoesis wrote:
         | We all know where it's going since 1873. What matters is the
         | fun along the way.
        
           | cpldcpu wrote:
           | what are you referencing?
        
             | GistNoesis wrote:
             | Genesis 1:3 (Unknighted James Clerk Version).
        
             | pacaro wrote:
             | As @GistNoesis is being somewhat gnomic, I believe that
             | they are referencing
             | 
             | https://en.wikipedia.org/wiki/A_Treatise_on_Electricity_and
             | _...
             | 
             | Written in 1873 by James Clerk Maxwell
             | 
             | They also reference Genesis 1:3 "Let there be light"
        
       | yklcs wrote:
       | The main selling point of Gaussian Splatting over NeRF-based
       | methods is rendering (and training) speed and efficiency. This
       | does come at the cost of physical correctness as it uses
       | splatting instead of a real radiance field.
       | 
       | This method tries to move back to radiance fields, but with a
       | primitive-based representation instead of neural nets. Rendering
       | performance seems to be quite poor (30fps on a 4090), and
       | rendering quality improvements seem to be marginal.
       | 
       | I'm not quite sure I understand where this fits in when NeRFs and
       | 3DGS already exist at the opposite ends of the correctness-speed
       | tradeoff spectrum. Maybe somewhere in the middle?
        
          | jampekka wrote:
          | Primitive-based representations are a lot easier to manipulate
          | (e.g. animate) than NeRFs. They can also be far more efficient
          | and scalable when a scene contains, e.g., a lot of empty space.
        
       | desdenova wrote:
       | I always assumed this was already how those were rendered,
       | because that's kinda obvious, and raymarching is a standard
       | technique for real-time volumetric rendering.
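
For contrast, the standard raymarching the comment mentions approximates the transmittance integral exp(-∫σ dt) by fixed-step quadrature, which introduces step-size error that a closed-form evaluation (as in this paper) avoids. A minimal sketch (hypothetical helper name):

```python
import math

def transmittance_raymarch(sigma_fn, t0, t1, steps=64):
    """Approximate transmittance exp(-integral of sigma over [t0, t1])
    by midpoint-rule quadrature, the usual raymarching estimate."""
    dt = (t1 - t0) / steps
    acc = 0.0
    for i in range(steps):
        t = t0 + (i + 0.5) * dt   # sample density at interval midpoints
        acc += sigma_fn(t) * dt
    return math.exp(-acc)
```

For piecewise-constant densities the exact answer needs no stepping at all, which is one way to see why the "exact" in the paper's title is a meaningful distinction.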
        
       ___________________________________________________________________
       (page generated 2024-10-03 23:01 UTC)