[HN Gopher] Physically Based Rendering in Filament
       ___________________________________________________________________
        
       Physically Based Rendering in Filament
        
       Author : jasim
       Score  : 51 points
       Date   : 2021-07-15 11:50 UTC (1 day ago)
        
 (HTM) web link (google.github.io)
 (TXT) w3m dump (google.github.io)
        
       | mncharity wrote:
       | > Figure 44: The Planckian locus visualized on a CIE 1931
       | chromaticity diagram
       | 
       | Visualized... using broken code. Note the Planck curve isn't
       | going through white. Imagine your surprise at setting your
       | monitor to a 6500K or 5000K white point, and finding it green or
       | yellow.
       | 
       | > (source: Wikipedia)
       | 
       | Sigh. WP has Article and Talk pages, but neither serves as a
       | "writer's notebook" for long-term memory. So when there's lots
       | of brokenness out in the world (and there's been lots of broken
       | color code over the years), WP has difficulty remembering to
       | avoid it.
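       | 
       | For the record, a quick sanity check of the locus-through-white
       | claim, using McCamy's xy -> CCT approximation (accurate near
       | the Planckian locus; plain Python, naming is mine):
       | 
       |     def mccamy_cct(x, y):
       |         # McCamy (1992): cubic fit, valid close to the locus.
       |         n = (x - 0.3320) / (0.1858 - y)
       |         return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33
       | 
       |     # D65 white comes out at ~6504 K, so a correctly drawn
       |     # Planckian locus must pass essentially through white.
       |     print(mccamy_cct(0.3127, 0.3290))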
        
         | jacobolus wrote:
         | It's impossible to plot colors on a chromaticity diagram in a
         | way that isn't grossly misleading. This particular picture is
         | one attempt at a compromise that gives readers the right
         | conceptual impression; it isn't accidentally "broken".
         | 
         | The best compromise if you want color might instead be to only
         | color a narrow strip around the triangle forming the gamut of
         | your trichromatic additive display (or, say, sRGB), with
         | intensity of the 2-primary mixture at each point adjusted so
         | that lightness doesn't vary too sharply.
         | 
         | A large portion of the horseshoe is outside your display's
         | gamut; common pictures either color the outside of the gamut
         | gray, or try to clip out-of-gamut chromaticities to the nearest
         | in-gamut chromaticity. The former is confusing for viewers who
         | don't already know what they are looking at; the latter gives a
         | false impression. For the colors that are shown, one typical
         | way to plot a 2-dimensional picture of the gamut is "top down"
         | with the most intense available color for each chromaticity,
         | but this introduces substantially misleading lightness
         | artifacts that depend on the display, not on the
         | chromaticities per se.
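         | 
         | To make "outside your display's gamut" concrete, a minimal
         | sketch (plain Python; the matrix is the standard XYZ ->
         | linear sRGB one, function names are mine):
         | 
         |     def xy_to_XYZ(x, y, Y=1.0):
         |         # Promote a chromaticity to tristimulus at luminance Y.
         |         return (x * Y / y, Y, (1.0 - x - y) * Y / y)
         | 
         |     def XYZ_to_linear_srgb(X, Y, Z):
         |         r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
         |         g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
         |         b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
         |         return (r, g, b)
         | 
         |     def in_srgb_gamut(x, y):
         |         # Displayable iff linear RGB is non-negative; the test
         |         # is scale-invariant, so checking at Y = 1 suffices.
         |         return all(c >= 0.0
         |                    for c in XYZ_to_linear_srgb(*xy_to_XYZ(x, y)))
         | 
         |     print(in_srgb_gamut(0.3127, 0.3290))  # D65 white: True
         |     print(in_srgb_gamut(0.17, 0.80))      # outside triangle: False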
         | 
         | Beyond that, the xy chromaticity diagram should not be used.
         | Stick to the u'v' chromaticity diagram.
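         | 
         | For reference, u'v' is just a projective remap of xy (CIE
         | 1976 UCS), so switching diagrams is cheap:
         | 
         |     def xy_to_uv_prime(x, y):
         |         # CIE 1976 u'v': far closer to perceptually uniform
         |         # than the 1931 xy diagram.
         |         d = -2.0 * x + 12.0 * y + 3.0
         |         return (4.0 * x / d, 9.0 * y / d)
         | 
         |     print(xy_to_uv_prime(0.3127, 0.3290))  # D65: ~(0.1978, 0.4683)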
        
         | kelsolaar wrote:
         | Isn't it a consequence of post-rendering creative editing,
         | e.g. blurring of the generated chromaticity colours, rather
         | than the code itself being broken? No display can reproduce
         | the entirety of the chromaticity diagram, thus there will
         | always be some form of "cheating" required to try to picture
         | it. I do agree with you that blurring is not great, though,
         | and does not really serve anything but aesthetics.
        
       | brundolf wrote:
       | What's the use case for Google building a PBR engine? Is it just
       | a "we're huge so let's have a foot in everything" deal? Or do
       | they have some novel insight that could put it above and beyond
       | the status quo?
        
         | pjmlp wrote:
         | Originally it was presented as part of Sceneform.
         | 
         | https://developers.google.com/sceneform/develop
         | 
         | But like all big plans presented at Google I/O, it eventually
         | joined the Google graveyard, and apparently they kept on
         | working on Filament alone.
        
           | RomainGuy wrote:
           | Filament is used at Google.
        
             | webmaven wrote:
             | _> Filament is used at Google._
             | 
             | What is it being used for?
        
         | msk-lywenn wrote:
         | Google Earth?
        
         | rsp1984 wrote:
         | Having worked at Google myself, quite honestly, I don't think
         | there is a use case. Romain and Mathias are both very senior
         | engineers. They get a lot of freedom to work on whatever
         | they desire.
        
         | JoelEinbinder wrote:
         | > This is not an officially supported Google product.
         | 
         | Basically when Google employees ask permission to work on a
         | side project, Google asks them to release it open source under
         | Google's GitHub but mark it as not an official Google product.
         | This is probably something the authors work on nights/weekends
         | and not anything to do with their work.
        
           | brundolf wrote:
           | Oof, that's a pretty draconian constraint. Glad I don't work
           | at Google.
        
             | dekhn wrote:
             | You can also request that you obtain total control over the
             | project (not hosted by Google, not copyrighted by Google),
             | but those requests are hard to get approved.
        
               | jopsen wrote:
               | > but those requests are hard to get approved.
               | 
               | I had no problems getting approval for a tiny 2D home-
               | trainer game I worked on this winter.
               | 
               | I suppose it varies depending on what you do. But if you
               | work on part of it during 20% time, you probably can't do
               | this.
               | 
               | disclaimer: I work at Google.
        
       | gimmeThaBeet wrote:
       | I was really interested to read about their subsurface scattering
       | model, but alas, it's like the only TODO in this whole thing.
       | 
       | Which is really saying something; there is an incredible wealth
       | of information in here.
        
         | RomainGuy wrote:
         | That subsurface scattering model is not what you would use for
         | skin, etc. It's a fairly simple approximation similar to what
         | Unreal and Frostbite have used (use?) in the past to cheaply
         | approximate somewhat translucent materials. It's mostly still a
         | TODO because it's not that interesting.
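         | 
         | For the curious, the family of tricks being referred to looks
         | roughly like Barre-Brisebois & Bouchard's "cheap translucency"
         | (the Frostbite-era approach): light leaking through the object
         | is faked with a view-dependent back-lighting lobe. A sketch
         | with illustrative parameter values, not Filament's actual
         | implementation:
         | 
         |     import math
         | 
         |     def normalize(v):
         |         m = math.sqrt(sum(c * c for c in v))
         |         return tuple(c / m for c in v)
         | 
         |     def cheap_translucency(n, l, v, distortion=0.2,
         |                            power=4.0, scale=1.0):
         |         # Distort the light direction by the normal, then
         |         # measure how directly the eye looks "through" the
         |         # surface at the light.
         |         lt = normalize(tuple(l[i] + n[i] * distortion
         |                              for i in range(3)))
         |         back = max(0.0, sum(v[i] * -lt[i] for i in range(3)))
         |         return scale * back ** power
         | 
         |     n = (0.0, 0.0, 1.0)   # surface facing the viewer
         |     v = (0.0, 0.0, 1.0)   # view direction, surface -> eye
         |     print(cheap_translucency(n, (0.0, 0.0, -1.0), v))  # backlit: 1.0
         |     print(cheap_translucency(n, (0.0, 0.0, 1.0), v))   # frontlit: 0.0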
        
       | anentropic wrote:
       | Does anyone have any experience of using this system outside of
       | the Android ecosystem?
       | 
       | i.e. the material compiler targeting desktop/vulkan
       | https://google.github.io/filament/Materials.html#compilingma...
        
         | RomainGuy wrote:
         | It's used on desktop and iOS as well.
        
         | pjmlp wrote:
         | On Android this was originally part of Sceneform.
         | 
         | https://developers.google.com/sceneform/develop
         | 
         | I doubt anyone beyond the authors is using it in any form.
        
       | redleader55 wrote:
       | Looking at the Suzanne demo -
       | https://google.github.io/filament/webgl/suzanne.html, you can see
       | some objects being reflected on the back of the head. Is it
       | raytracing?
        
         | [deleted]
        
         | monocasa wrote:
         | No, it's environment mapping.
        
           | [deleted]
        
         | sudosysgen wrote:
         | You can tell it's not raytracing because you can't see the
         | head's ear reflected onto the head itself. Self-reflection,
         | where an object reflects into itself, is the tell-tale sign of
         | raytracing.
        
         | adamdusty wrote:
         | Without looking at the code, I doubt it. Real-time reflections
         | are generally done with offscreen rendering and cube maps. The
         | objects being reflected are still rendered; you just can't see
         | them because they aren't in the projection/view space.
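         | 
         | Concretely, the cube-map lookup amounts to reflecting the view
         | ray about the normal and letting the dominant axis of the
         | result pick the cube face (the GPU does this part in
         | hardware); a plain-Python stand-in:
         | 
         |     def reflect(i, n):
         |         # R = I - 2 (N . I) N, with I the incident direction
         |         # and N a unit surface normal.
         |         d = sum(a * b for a, b in zip(i, n))
         |         return tuple(a - 2.0 * d * b for a, b in zip(i, n))
         | 
         |     def cube_face(r):
         |         # Face = sign + axis of the largest-magnitude component.
         |         axis = max(range(3), key=lambda k: abs(r[k]))
         |         return ('+' if r[axis] > 0 else '-') + 'XYZ'[axis]
         | 
         |     i = (0.0, 0.0, -1.0)        # looking down -Z
         |     n = (0.7071, 0.0, 0.7071)   # 45-degree surface normal
         |     r = reflect(i, n)
         |     print(r, cube_face(r))      # ~(1, 0, 0) -> face '+X'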
        
           | dangerbird2 wrote:
           | The scene being reflected is almost certainly pre-rendered.
           | Fun fact: it's the Cornell Box, a very common placeholder
           | scene in computer graphics.
           | 
           | https://en.wikipedia.org/wiki/Cornell_box
        
         | qayxc wrote:
         | All I can see is a cube map. Looking at the source code of the
         | scene confirms this: it's just a traditional skybox projected
         | onto the mesh.
         | 
         | No raytracing there, just plain old cube-mapped reflections.
        
         | arduinomancer wrote:
         | To nitpick the other answers: it's actually called Image-Based
         | Lighting (IBL), which is a fancier PBR technique than
         | traditional environment mapping.
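         | 
         | The practical difference: for IBL the environment map is
         | prefiltered into mip levels of increasing blur, and the
         | material's roughness selects the level (plus a BRDF term),
         | instead of one sharp mirror lookup. A toy mip selector, with
         | a simplified mapping rather than Filament's exact one:
         | 
         |     def ibl_specular_lod(perceptual_roughness, mip_count):
         |         # Rougher surface -> blurrier prefiltered mip.
         |         return min(mip_count - 1.0,
         |                    perceptual_roughness * (mip_count - 1.0))
         | 
         |     for r in (0.0, 0.25, 0.5, 1.0):
         |         print(r, ibl_specular_lod(r, mip_count=6))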
        
       ___________________________________________________________________
       (page generated 2021-07-16 23:02 UTC)