[HN Gopher] Spectral Ray Tracing
___________________________________________________________________
Spectral Ray Tracing
Author : earslap
Score : 249 points
Date : 2024-04-14 23:45 UTC (23 hours ago)
(HTM) web link (larswander.com)
(TXT) w3m dump (larswander.com)
| spacecadet wrote:
| Lovely. Not sure if the author would agree... There was much to
| love and hate about the nascent "new aesthetic" movement, but
| this demonstrates the best of that genre.
| sudosysgen wrote:
| A great spectral ray tracing engine is LuxRender:
| https://luxcorerender.org/ (the older one, that is - the newer
| LuxCore renderer does not have full spectral support)
|
| Beyond the effects shown here, there are other benefits to
| spectral rendering - if done using light tracing, it allows you
| to change color, spectrum and intensity of light sources after
| the fact. It also makes indirect lighting much more accurate in
| many scenes.
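|
| A minimal sketch of the relighting idea, assuming one linear
| buffer per light rendered with unit emission (the buffer layout
| and names here are mine, not from any particular renderer):
|
|     import numpy as np
|
|     H, W, N = 4, 4, 8   # tiny toy image with 8 spectral bins
|
|     # One buffer per light, each rendered once with a flat
|     # unit-power spectrum; random stand-in data here instead
|     # of an actual render pass.
|     rng = np.random.default_rng(0)
|     buffers = {"key": rng.random((H, W, N)),
|                "fill": rng.random((H, W, N))}
|
|     def relight(new_spectra):
|         # Transport is linear in emission, so a pixel is the
|         # sum of per-light contributions scaled by each light's
|         # new spectrum -- no re-render while geometry is fixed.
|         image = np.zeros((H, W, N))
|         for name, buf in buffers.items():
|             image += buf * np.asarray(new_spectra[name])
|         return image
|
|     # Swap the key light to a warm spectrum after the fact:
|     img = relight({"key": np.linspace(0.2, 1.0, N),
|                    "fill": np.ones(N)})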
| NBJack wrote:
| The beauty of mathematics and physics in action. I wonder if some
| of the tweaks made for the sake of beauty could be useful in
| other means of visualizations.
|
| It also reminds me of a time that I was copying code from a book
| to make polyphonic music on an Apple II. I got something wrong
| for sure when I ran it, but instead of harsh noise, I ended up
| with an eerily beautiful pattern of tones. Whatever happy
| accident I made fascinated me.
| DevOfNull wrote:
| If anyone wants to make their own: The free e-book Ray Tracing
| Gems II [1] covers realtime GPU ray tracing with modern APIs and
| hardware acceleration, and has a chapter about spectral rendering
| (Chapter 42: Efficient spectral rendering on the GPU for
| predictive rendering).
|
| [1] https://www.realtimerendering.com/raytracinggems/rtg2/
| kqr wrote:
| I can't take time to fiddle with raytracing (however much I'd
| want to!), but I skimmed the first half of that book and alias
| sampling is a _very_ elegant and nice technique. Wish I had
| known about it earlier! It is useful in a far broader context
| than graphics.
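|
| For the curious, Vose's variant of it fits in a few lines:
| O(n) setup, then O(1) samples from any discrete distribution
| (a sketch, untuned):
|
|     import random
|
|     def build_alias(weights):
|         n, total = len(weights), sum(weights)
|         prob = [w * n / total for w in weights]  # mean is 1.0
|         alias = [0] * n
|         small = [i for i, p in enumerate(prob) if p < 1.0]
|         large = [i for i, p in enumerate(prob) if p >= 1.0]
|         while small and large:
|             s, l = small.pop(), large.pop()
|             alias[s] = l                 # overflow goes to l
|             prob[l] -= 1.0 - prob[s]
|             (small if prob[l] < 1.0 else large).append(l)
|         for i in small + large:          # float leftovers
|             prob[i] = 1.0
|         return prob, alias
|
|     def sample(prob, alias):
|         i = random.randrange(len(prob))  # pick a column
|         return i if random.random() < prob[i] else alias[i]
|
|     prob, alias = build_alias([0.1, 0.2, 0.3, 0.4])
|     counts = [0] * 4
|     for _ in range(10_000):
|         counts[sample(prob, alias)] += 1
|     print(counts)   # roughly 1000, 2000, 3000, 4000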
| superb_dev wrote:
| I would also recommend Ray Tracing in One Weekend [1] if you
| want a very quick introduction, and Physically Based Rendering
| [2] if you want to then go really in depth.
|
| [1] https://raytracing.github.io/
|
| [2] https://pbrt.org/
| jplusequalt wrote:
| Highly recommend Physically Based Rendering. As a book, Pbrt
| is to the study of path tracers as Gravity is to the field
| of general relativity. Wholly encompassing and rigorous.
| raytopia wrote:
| Does anyone know if someone has attempted real time spectral
| rendering? I've tried finding information before but have never
| had any luck.
| dagmx wrote:
| Real-time is unfortunately a sort of vague term.
|
| If you mean raster rendering pipelines, then I don't believe
| it's possible, because the nature of the GPU pipelines
| precludes it. You'd likely need to make use of compute shaders,
| at which point you've just written a pathtracer anyway.
|
| If you mean a pathtracer, then real-time becomes wholly
| dependent on what your parameters are. With a small enough
| resolution, Mitsuba with Dr.JIT could theoretically start
| rendering frames after the first one in a reasonable time to be
| considered realtime.
|
| However the reality is just that even in film, with offline
| rendering, very few studios find the gains of spectral
| rendering to be worth the effort. Outside of Weta with Manuka,
| nobody else really uses spectral rendering. Animal Logic did
| for The LEGO Movie, but solely for lens flares.
|
| The workflow change to make things work with a spectral
| renderer and the very subtle differences are just not worth the
| high increase in render time.
| zokier wrote:
| Spectral rendering imho is a good example of how ray tracing in
| itself is not the end-game for rendering; it's more of a
| starting point. Occasionally I see the sentiment that with
| real-time ray tracing, rendering is a solved problem, but imho
| that's far from the truth.
|
| Afaik most spectral rendering systems do not do (thin-film)
| interference or other wave-based effects, so that is another
| frontier. Reality has a surprising amount of detail.
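|
| As a taste of what the wave frontier involves, a two-beam
| thin-film reflectance sketch (normal incidence; multiple
| internal reflections and transmission factors dropped, so this
| is only an approximation):
|
|     import numpy as np
|
|     def film_reflectance(lam_nm, n0=1.0, n1=1.38, n2=1.52,
|                          d_nm=300.0):
|         # Film of index n1 and thickness d between air (n0)
|         # and glass (n2). Two reflected waves interfere.
|         r1 = (n0 - n1) / (n0 + n1)  # air/film amplitude
|         r2 = (n1 - n2) / (n1 + n2)  # film/substrate amplitude
|         delta = 4.0 * np.pi * n1 * d_nm / lam_nm  # round trip
|         return np.abs(r1 + r2 * np.exp(1j * delta)) ** 2
|
|     lam = np.linspace(380, 780, 5)
|     print(film_reflectance(lam))  # wavelength-dependent color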
| Klaster_1 wrote:
| The closer rendering comes to the underlying physical
| principles, the more game engines will become world-simulation
| engines. Various engine parts commonly seen today will converge
| toward a common point where, for example, we'll observe less
| distinction between the physics and rendering layers. I wonder
| if this trend can be traced to some degree even today. Several
| orders of compute growth later, we'll look upon current
| abstractions the same way we look at the 30-year-old state of
| the art: shaped by the technical limitations of yesteryear, and
| obvious in hindsight. Love the perspective this puts things
| into.
| zokier wrote:
| I disagree. The goal for most games is not to simulate the
| real world accurately, but to be a piece of entertainment (or
| artwork). That sets different requirements than just "world
| simulation", both from mechanics point of view and from
| graphics point of view. So engines will for a long time be
| able to differentiate on how they facilitate such things;
| expressivity is a tough nut to crack and real-world physics
| gets you only so far.
|
| Even photorealism is a shifting target as it turns out that
| photography itself diverges from reality; there is this trend
| of having games and movies look "cinematic" in a way that is
| not exactly realistic, or at least not how things appear to the
| human eye. But how scenes appear to human eyes is also a tricky
| question, as humans are not just simple mechanical cameras.
| z3phyr wrote:
| Gameplay is directly influenced by the "feel" of the world.
| I am not strictly talking about photo-realism, but (1) how
| the world reacts to input and (2) how consistent it is.
|
| Physics is not about real world accuracy, but about how
| consistently stuff interacts (and its side effects like
| illumination) in the virtual world. There will be a time in
| the future when the physics engine will become the
| rendering engine, just because there are infinite gameplay
| possibilities in such a craft.
| magicalhippo wrote:
| > Reality has a surprising amount of detail.
|
| Another one that few implement, and which can have a quite
| noticeable effect in certain scenes, is polarization of
| light[1].
|
| [1]: https://www.giangrandi.ch/optics/polarizer/polarizer.shtml
| Cthulhu_ wrote:
| It's like a fractal: the closer you look, the more details you
| notice affecting what you see. We keep creeping toward 100%
| physically accurate rendering, but we'll probably never get
| there; we'll just keep adding fractions.
| philsnow wrote:
| > I've been curious what happens when some of the laws dictating
| how light moves are deliberately broken, building cameras out of
| code in a universe just a little unlike our own. Working with the
| richness of the full spectrum of light, spectral ray tracing has
| allowed me to break the rules governing light transport in
| otherworldly ways.
|
| This reminds me of diagnosing bugs while writing my own
| raytracer, and attempting to map the buggy output to
| weird/contrived/silly alternative physics.
| uoaei wrote:
| Are there any good resources for spectral ray tracing for other
| frequencies of light, e.g. radio frequencies?
| magicalhippo wrote:
| What are the applications you have in mind?
|
| I'm no RF guy, but I imagine you will quickly have to care
| about regimes where the wavelike properties of EM radiation
| dominate, in which case ray tracing is not the right tool for
| the job.
| itishappy wrote:
| It's the same thing! What are you trying to do with it?
|
| One thing you'll run into is that there isn't a canonical
| frequency response curve for non-visible light, so you need to
| invent your own frequency-to-RGB mapping (false color).
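|
| For instance, an entirely arbitrary mapping that stretches the
| band of interest across the visible hues (there is no "correct"
| choice here; the band limits are made up):
|
|     import colorsys
|
|     def false_color(freq_hz, f_min, f_max):
|         # Normalize into [0, 1] over the band of interest,
|         # then spread across the HSV hue wheel.
|         t = (freq_hz - f_min) / (f_max - f_min)
|         t = min(max(t, 0.0), 1.0)
|         hue = (1.0 - t) * 0.75   # low -> violet, high -> red
|         return colorsys.hsv_to_rgb(hue, 1.0, 1.0)
|
|     # e.g. 2.4 GHz within a 1-6 GHz band of interest:
|     print(false_color(2.4e9, 1e9, 6e9))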
|
| Another thing is that radio waves have much longer wavelengths
| than visible, so diffractive effects tend to be a lot more
| important, and ray tracing (spectral or otherwise) doesn't do
| this well. Modeling diffraction is typically done using
| something like FDTD.
|
| https://en.wikipedia.org/wiki/Finite-difference_time-domain_...
| DiogenesKynikos wrote:
| If you want to go all the way, you have to track not only the
| wavelength of each ray, but also its polarization and phase. The
| situations in which these properties actually matter for human
| perception are rare (e.g., thin films and diffraction gratings),
| but they exist.
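|
| One hypothetical payload for such a ray (a sketch of the data
| involved, not taken from any particular renderer):
|
|     from dataclasses import dataclass
|
|     @dataclass
|     class SpectralRay:
|         origin: tuple[float, float, float]
|         direction: tuple[float, float, float]
|         wavelength_nm: float  # one sampled wavelength per path
|         # Stokes vector (I, Q, U, V): unpolarized light is
|         # (I, 0, 0, 0); optical elements transform it with
|         # 4x4 Mueller matrices.
|         stokes: tuple[float, float, float, float] = (1, 0, 0, 0)
|         phase: float = 0.0  # accumulated optical-path phase,
|                             # needed only for interference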
| dagmx wrote:
| Some examples of spectral ray tracers:
|
| Mitsuba is an open source research renderer with lots of cool
| features like differentiable rendering.
| https://www.mitsuba-renderer.org/
|
| Maxwell has two spectral modes of varying accuracy. The more
| complex method is often used for optics.
| https://maxwellrender.com/
|
| Manuka by Weta FX is spectral and has been used in several
| feature films https://dl.acm.org/doi/10.1145/3182161 and
| https://www.wetafx.co.nz/research-and-tech/technology/manuka
| alkonaut wrote:
| If you want to look under the hood and find the big production
| ones too complex, I can recommend peeking at this one. It's a
| great example of just the bare minimum of a spectral path
| tracer: https://github.com/TomCrypto/Lambda
| pixelpoet wrote:
| There's also Indigo Renderer and the open source LuxRender:
|
| https://indigorenderer.com
|
| https://luxcorerender.org
| dagmx wrote:
| Note that Lux isn't spectral anymore. The older version is
| though.
| pixelpoet wrote:
| Funny, I added the original implementation of spectral
| rendering to Lux and then they took it out :D
| 6mian wrote:
| If you want to play with a ray tracing implementation, it's
| surprisingly easy to write one yourself. There's a great free
| book
| (https://raytracing.github.io/books/RayTracingInOneWeekend.ht...)
| or, if you know a bit of Unity, a very nice GPU-based tutorial
| (https://medium.com/@jcowles/gpu-ray-tracing-in-one-weekend-3...).
| The Unity version is easier to tinker with, because you have a
| scene preview and other GUI that makes moving the camera around
| so much easier. There are many implementations based on these
| sources if you don't want to write one from
| scratch, although doing so is definitely worth it.
|
| I spent some great time playing with the base implementation:
| making the rays act as particles* that bend their path
| toward/away from objects, making them "remember" the last angle
| of bounce and use it on the next material hit, etc. Most of the
| results looked bad, but I still got some intuition about what I
| was looking at. Moving the camera by a notch was also very
| helpful.
|
| A lot of fun, great for a small recreational programming project.
|
| * Unless the ray intersects an object: cap the maximum length
| of the ray at some small amount, then shoot many rays out from
| that endpoint and, for each hit, apply something similar to the
| gravity equation. Of course this is slow and just an
| approximation, but it's easy, and you can implement a "black
| hole" type of object that bends light in the scene.
| kragen wrote:
| when i wrote _my very first ray tracer_ it didn't take me an
| entire weekend; it's about four pages of c that i wrote in one
| night
|
| http://canonical.org/~kragen/sw/aspmisc/my-very-first-raytra...
|
| since then i've written raytracers in clojure and lua and a
| raymarcher in js; they can be very small and simple
|
| last night i was looking at Spongy by mentor/TBC
| https://www.pouet.net/prod.php?which=53871 which is a fractal
| animation raytracer with fog in 65 machine instructions. the
| ms-dos executable is 128 bytes
|
| i think it's easy to get overwhelmed by how stunning raytraced
| images look and decide that the algorithms and data structures
| to generate them must be very difficult, but actually they're
| very simple, at least if you already know about three-
| dimensional vectors. i feel like sdf raymarching is even
| simpler than the traditional whitted-style raytracer, because
| it replaces most of the hairy math needed to solve for precise
| intersections with scene geometry with very simple successive
| approximation algorithms
|
| the very smallest raytracers like spongy and Oscar Toledo G.'s
| bootsector raytracer https://github.com/nanochess/RayTracer are
| often a bit harder to understand than slightly bigger ones,
| because you have to use a lot of tricks to get that small, and
| the tricks are harder to understand than a dumber piece of code
| would be
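|
| for example, the whole successive-approximation loop for a
| sphere is just this (a sketch, not golfed at all):
|
|     import math
|
|     def sphere_sdf(x, y, z):
|         # distance from a point to a unit sphere at the origin
|         return math.sqrt(x*x + y*y + z*z) - 1.0
|
|     def march(ox, oy, oz, dx, dy, dz, max_steps=64, eps=1e-4):
|         # sphere tracing: the distance-field value is always a
|         # safe step, so walk forward by it until we're within
|         # eps of the surface
|         t = 0.0
|         for _ in range(max_steps):
|             d = sphere_sdf(ox + t*dx, oy + t*dy, oz + t*dz)
|             if d < eps:
|                 return t    # hit
|             t += d
|         return None         # miss
|
|     print(march(0.0, 0.0, -3.0, 0.0, 0.0, 1.0))  # ~2.0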
| dahart wrote:
| > when i wrote my very first ray tracer it didn't take me an
| entire weekend
|
| It's just a catchy title. You can implement the book in an
| hour or two, if you're uncurious, or a month if you like
| reading the research first. Also maybe there are meaningful
| differences in the feature set such that it's better not to
| try to compare the time taken? The Ray Tracing in One Weekend
| book does start the reader off with a pretty strong footing
| in physically based rendering, and includes global
| illumination, dielectric materials, and depth of field. It
| also spends a lot of time building an extensible and robust
| foundation that can scale to a more serious renderer.
| Hammershaft wrote:
| That animated artwork at the end is incredible. Thank you for the
| technical write up and the artwork!
| lwander wrote:
| Author here. Waking up to seeing this on the front page with all
| the wonderful comments made my day! Thank you for sharing and
| reading
| dantondwa wrote:
| I'd love to see more about the artworks the author shares at the
| end. The idea of creating renders of realities where light works
| differently from ours is fascinating.
| geon wrote:
| I was thinking of implementing refraction in my distribution
| raytracer (stochastic, not parallel).
| https://en.m.wikipedia.org/wiki/Distributed_ray_tracing
|
| I would randomly sample a frequency, calculate its color and use
| it to modulate ray color. I would have to scale the result by 3
| to account for the pure refracted color being 1/3 brightness.
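|
| A sketch of that scheme (the wavelength_to_rgb here is a crude
| three-bump stand-in; a real one would use the CIE matching
| functions, and the sampled wavelength would also drive the
| wavelength-dependent index of refraction):
|
|     import random
|
|     def wavelength_to_rgb(lam):
|         # very crude, just for the sketch
|         def bump(center, width):
|             return max(0.0, 1.0 - abs(lam - center) / width)
|         return bump(600, 80), bump(550, 80), bump(450, 80)
|
|     def sample_refraction_tint():
|         # one wavelength per refracted ray; the 3x factor
|         # compensates for each sample carrying roughly 1/3 of
|         # the full spectrum's brightness
|         lam = random.uniform(380.0, 780.0)   # nm
|         r, g, b = wavelength_to_rgb(lam)
|         return lam, (3.0 * r, 3.0 * g, 3.0 * b)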
| dahart wrote:
| I think the term "distribution ray tracing" was a bit of a mid-
| point on the timeline of evolution from Whitted ray tracing to
| today's path tracing? IIRC distribution ray tracing came from
| one of Rob Cook's Siggraph papers. It's probably worth moving
| toward path tracing as a more general and unified concept, and
| also because when googling & researching, it'll be easier to
| find tips & tricks.
|
| Yes when combining spectral rendering with refraction, you'll
| need to pick a frequency by sampling the distribution. This can
| get tricky in general, so it's good to build it in incremental
| steps.
| True of reflections as well, but up to you whether you want to
| have frequency-dependent materials in both cases. There are
| still reasons to use spectral even if you choose to use
| simplified materials.
| geon wrote:
| Yes. My implementation is a bidirectional pathtracer, but
| that tends to just confuse people.
|
| https://geon.github.io/programming/2013/09/01/restructured-c...
| pseudosavant wrote:
| I'd love to understand the performance implications of modeling a
| spectral distribution instead of an RGB pixel for ray tracing.
| dahart wrote:
| There's more than one way to implement spectral rendering, and
| thus multiple different trade-offs you can make. Spectral in
| general is a trade for higher color & material accuracy at the
| cost of compute time.
|
| All else being equal, if you carry a fixed-size power spectrum
| of n > 3 elements along with each ray, instead of an rgb
| triple, then you really might see up to an n/3 perf hit. For
| example, using 6 wavelength channels can be up to twice as slow
| as an rgb renderer. Whether you actually experience the full
| n/3 slowdown depends on how much time you spend shading versus
| the time to trace the ray, i.e., traverse the BVH. Shading will
| be slowed down by spectral, but scene traversal won't, so check
| Amdahl's Law.
|
| Problem is, all else is never equal. Spectral math comes with
| spectral materials that are more compute intensive, and fancier
| color sampling and integration utilities. Plus often additional
| color conversions for input and output.
|
| Another way to implement spectral rendering is to carry only a
| single wavelength per ray path, like a photon does, and ensure
| the lights' wavelengths are sampled adequately. This makes a
| single ray faster than an rgb ray, but it adds a new dimension
| to your integral, which means new/extra noise, and so takes
| longer to converge, probably more than 3x longer.
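|
| A rough sketch of that single-wavelength estimator (trace() is
| a stand-in for a full path trace at one wavelength, and the
| cie_xyz() stand-in crudely fakes the CIE 1931 matching
| functions, which real code would table):
|
|     import math, random
|
|     def cie_xyz(lam):
|         g = lambda c, w: math.exp(-0.5 * ((lam - c) / w) ** 2)
|         return (g(595, 50) + 0.35 * g(445, 25),  # x-bar-ish
|                 g(555, 50),                      # y-bar-ish
|                 1.8 * g(450, 30))                # z-bar-ish
|
|     def trace(x, y, lam):
|         return 1.0   # stand-in: radiance at one wavelength
|
|     def render_pixel(x, y, spp=256, lo=380.0, hi=780.0):
|         # wavelength is one more Monte Carlo dimension: cheaper
|         # per ray than an n-bin spectrum, slower to converge
|         acc = [0.0, 0.0, 0.0]
|         for _ in range(spp):
|             lam = random.uniform(lo, hi)
|             L = trace(x, y, lam) * (hi - lo)  # / uniform pdf
|             for i, m in enumerate(cie_xyz(lam)):
|                 acc[i] += L * m
|         return [c / spp for c in acc]
|
|     print(render_pixel(0, 0))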
| mncharity wrote:
| Perhaps create hyperspectral (>>3 channels) images? I was
| exploring using them to better teach color to kids by
| emphasizing spectra. Showing pixel spectra on image[1]
| mouseover, for example, to reinforce the association between
| colors and their spectra.
| But hyperspectral images are rare, and their cameras
| traditionally[2] expensive. So how about synthetic hyperspectral
| images?
|
| Perhaps a very-low-res in-browser renderer might be fast enough
| for interactively playing with lighting and materials? And
| perhaps do POV for anomalous color vision, "cataract lens removed
| - can see UV" humans, dichromat non-primate mammals (mice/dogs),
| and perhaps tetrachromat zebra fish.
|
| [1] http://www.ok.sc.e.titech.ac.jp/res/MSI/MSIdata31.html [2] an
| inexpensive multispectral camera using time-multiplexed narrow-
| band illumination:
| https://ubicomplab.cs.washington.edu/publications/hypercam/
| sudosysgen wrote:
| It's possible to implement this efficiently using light tracing
| - the final value at each pixel is the (possibly transformed)
| sum of contributions from each light source, and since you have the
| spectrum of the light source you can have the spectrum of the
| pixel.
|
| Until you encounter significant dispersion or thin film
| effects, that is; then you need to sample wavelengths for each
| path, so it becomes (even more of) an approximation.
___________________________________________________________________
(page generated 2024-04-15 23:01 UTC)