[HN Gopher] Simulating fluids, fire, and smoke in real-time
___________________________________________________________________
Simulating fluids, fire, and smoke in real-time
Author : ibobev
Score : 277 points
Date : 2023-12-19 17:51 UTC (5 hours ago)
(HTM) web link (andrewkchan.dev)
(TXT) w3m dump (andrewkchan.dev)
| kragen wrote:
| it blows my mind that cfd is a thing we can do in real time on a
| pc
| s-macke wrote:
 | Yes, we have been able to for a long time. Real-time fluids
 | were already a thing 14 years ago on the PS3, in the game
 | "PixelJunk Shooter" [0].
|
 | But real-time simulations often use massive simplifications.
| They aim to look real, not to match exact solutions of the
| Navier-Stokes equations.
|
| [0] https://www.youtube.com/watch?v=qQkvlxLV6sI
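 |
 | (A classic example of such a simplification is Jos Stam's
 | "stable fluids" scheme, whose semi-Lagrangian advection step is
 | unconditionally stable however large the time step, at the cost
 | of smearing out detail. A rough numpy sketch of just that step,
 | as an illustration rather than what any particular game does:)
 |
 |     import numpy as np
 |
 |     def advect(q, u, v, dt):
 |         # Trace each cell backwards along the velocity and sample
 |         # the old field there (bilinear interpolation).
 |         n, m = q.shape
 |         ys, xs = np.meshgrid(np.arange(n), np.arange(m),
 |                              indexing="ij")
 |         x0 = np.clip(xs - dt * u, 0, m - 1)
 |         y0 = np.clip(ys - dt * v, 0, n - 1)
 |         i0 = np.floor(y0).astype(int)
 |         j0 = np.floor(x0).astype(int)
 |         i1 = np.minimum(i0 + 1, n - 1)
 |         j1 = np.minimum(j0 + 1, m - 1)
 |         fy, fx = y0 - i0, x0 - j0
 |         return ((1 - fy) * ((1 - fx) * q[i0, j0] + fx * q[i0, j1])
 |                 + fy * ((1 - fx) * q[i1, j0] + fx * q[i1, j1]))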
| mcphage wrote:
| Heck, the Wii had Fluidity around the same time period
| (2010), and that's a lot weaker than the PS3. Fluidity was
| pretty neat--you played as a volume of water, changing states
| as you moved through a level that looked like a classic
 | science textbook:
 | https://youtu.be/j7IooyXp3Pc?si=E79rCrq2mdyZSKoF&t=120
| kragen wrote:
| thanks, i had no idea about pixeljunk shooter!
|
| btw, almost unrelatedly, you have no idea how much i
| appreciate your exegesis of the voxelspace algorithm
| cherryteastain wrote:
| As a person who did a PhD in CFD, I must admit I never
| encountered the vorticity confinement method and curl-noise
| turbulence. I guess you learn something new every day!
|
 | Also, in industrial CFD, where the Reynolds numbers are higher,
 | you'd never want to counteract the artificial dissipation of
 | the numerical method by applying noise.
| In fact, quite often people want artificial dissipation to
| stabilize high Re simulations! Guess the requirements in computer
| graphics are more inclined towards making something that looks
| right instead of getting the physics right.
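 |
 | (For reference, vorticity confinement -- popularized in graphics
 | by Fedkiw, Stam and Jensen's 2001 "Visual Simulation of Smoke" --
 | adds a force that amplifies the swirls a dissipative solver would
 | otherwise smear out. A rough illustrative 2D numpy sketch, not
 | code from the article:)
 |
 |     import numpy as np
 |
 |     def vorticity_confinement(u, v, eps, h):
 |         # 2D vorticity w = dv/dx - du/dy (arrays indexed [y, x])
 |         w = np.gradient(v, h, axis=1) - np.gradient(u, h, axis=0)
 |         # unit vectors N pointing toward local maxima of |w|
 |         gy, gx = np.gradient(np.abs(w), h)
 |         mag = np.sqrt(gx**2 + gy**2) + 1e-10
 |         nx, ny = gx / mag, gy / mag
 |         # confinement force f = eps * h * (N x w); scale by dt
 |         # and add to the velocity field
 |         return eps * h * ny * w, -eps * h * nx * w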
| ChuckMcM wrote:
| > ... the requirements in computer graphics are more inclined
| towards making something that looks right instead of getting
| the physics right.
|
 | That is exactly correct. That said, as something of a physics
 | nerd (it was a big part of the EE curriculum in school), I often
 | got chuckled at in the arcade for pointing out things that
 | violated various principles of physics :-). And one of
| the fun things was listening to a translated interview with the
| physicist who worked on Mario World at Nintendo who stressed
| that while the physics of Mario's world were not the same as
| "real" physics, they were _consistent_ and had rules just like
| real physics did, and how that was important for players to
| understand what they could and could not do in the game (and
| how they might solve a puzzle in the game).
| jwoq9118 wrote:
| Was the PhD worth it in your opinion?
| cherryteastain wrote:
| From a purely economic standpoint, difficult to say. I was
| able to build skills that are very in demand in certain
| technical areas, and breaking into these areas is notoriously
| difficult otherwise. On the other hand, I earned peanuts for
| many years. It'll probably take some time for it to pay off.
|
| That said, never do a PhD for economic reasons alone. It's a
| period in your life where you are given an opportunity to
| build up any idea you would like. I enjoyed that aspect very
| much and hence do not regret it one bit.
|
| On the other hand, I also found out that academia sucks so I
| now work in a completely different field. Unfortunately it's
| very difficult to find out whether you'll like academia short
| of doing a PhD, so you should always go into it with a plan B
| in the back of your head.
| mckn1ght wrote:
| I feel this comment. I did a masters with a thesis option
| because I was not hurting for money with TA and side
| business income, so figured I could take the extra year (it
| was an accelerated masters). Loved being able to work in
| that heady material, but disliked some parts of the
| academic environment. Was glad I could see it with less
| time and stress than a PhD. Even so, I still never say
| never for a PhD, but it'd have to be a perfect confluence
| of conditions.
| jgeada wrote:
| Hard to say whether economically a PhD always makes sense,
| but it certainly can open doors that are otherwise firmly
| closed.
| smcameron wrote:
| I think the curl noise paper is from 2007:
| https://www.cs.ubc.ca/~rbridson/docs/bridson-siggraph2007-cu...
|
| I've used the basic idea from that paper to make a surprisingly
| decent program to create gas-giant planet textures:
| https://github.com/smcameron/gaseous-giganticus
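 |
 | (The core trick, roughly: take the curl of a noise potential,
 | which gives you a divergence-free velocity field by construction.
 | A toy 2D numpy sketch of the idea -- not gaseous-giganticus's
 | actual code:)
 |
 |     import numpy as np
 |
 |     def curl_noise_2d(psi, h=1.0):
 |         # velocity = curl of scalar potential psi (indexed [y, x]):
 |         #   u = d(psi)/dy, v = -d(psi)/dx  =>  div(u, v) = 0
 |         u = np.gradient(psi, h, axis=0)
 |         v = -np.gradient(psi, h, axis=1)
 |         return u, v
 |
 |     # psi would normally be layered Perlin/simplex noise; plain
 |     # random values stand in here just so the code runs
 |     psi = np.random.default_rng(0).standard_normal((128, 128))
 |     u, v = curl_noise_2d(psi)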
| shahar2k wrote:
| reminds me of this -
| https://www.taron.de/forum/viewtopic.php?f=4&t=4
|
| it's a painting program where the paint can be moved around
| with a similar fluid simulation
| Sharlin wrote:
| > Guess the requirements in computer graphics are more inclined
| towards making something that looks right instead of getting
| the physics right.
|
| The first rule of real-time computer graphics has essentially
| always been "Cheat as much as you can get away with (and
| usually, even if you can't)." Also, it doesn't even have to
| look _right_ , it just has to look cool! =)
| FirmwareBurner wrote:
| Why not cheat? I'm not looking for realism in games, I'm
| looking for escapism and to have fun.
| digging wrote:
 | I don't think this is a great argument, because everybody is
 | looking for some level of realism in games; you may just want
 | less of it than many others do. Without any, you'd have no
 | intuitive behaviors and the controls would make no sense.
|
| I'm not saying this just to be pedantic - my point is that
| some people do want some games to have very high levels of
| realism.
| JaggerJo wrote:
 | Does not work on iOS for me.
| bee_rider wrote:
 | It works fine on iOS/Safari for me, in the sense that the text
 | is all readable, but the simulations don't seem to all execute.
 | Given that other folks are reporting that the site slows to a
 | crawl, I guess because it is running the simulations, I'll take
 | the iOS experience.
| askonomm wrote:
| Interesting. I have 64GB of RAM and yet this page managed to kill
| the tab entirely.
| Fervicus wrote:
 | Weird. I only have 16GB but it seems to run fine for me?
| askonomm wrote:
| Maybe it's Windows? Seems on MacOS it runs fine with much
| less RAM.
| Fervicus wrote:
| I am on Windows.
| ghawkescs wrote:
| FWIW, works great and animates smoothly on a Surface Laptop 4
| with 16 GB RAM (Firefox).
| CaptainOfCoit wrote:
| EmberGen is absolutely crazy software that does simulation of
| fire and smoke in real-time on consumer GPUs, and supports a
| node-based workflow which makes it so easy to create new effects.
|
 | Seriously, things that used to take me hours to get right now
 | take minutes.
|
| https://jangafx.com/software/embergen/
|
| I was sure that this submission would be about EmberGen and I'm
| gonna be honest, I'm a bit sad EmberGen never really got traction
| on HN
| (https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...)
|
| (Not affiliated with EmberGen/JangaFX, just a happy customer)
| PKop wrote:
| Written in Odin
|
| https://odin-lang.org/showcase/embergen/
| imiric wrote:
| Odin is such a neat language.
|
| I was equally impressed by Spall: https://odin-
| lang.org/showcase/spall/
| llm_nerd wrote:
 | Page is basically unusable on my Intel MBP, and still a beastly
 | resource consumer on my AS Mac. I presume simulations are running
 | somewhere on the page; it would be a good idea to make them
 | toggleable. Ideally via user interaction, but if one insists on
 | auto-play, at least only when scrolled into the viewport.
| kqr wrote:
| Same thing on a modern flagship smartphone.
| andygeorge wrote:
| Pixel 8 Pro here, using DDG browser, smooth for me.
| chefandy wrote:
| What I can use works great on FF on my S22 Ultra, but it's
| not interactive because the shader fields won't register
| taps as click events.
| pdabbadabba wrote:
| Interestingly, no noticeable performance issues on my M2
| MacBook Air. Maybe he's already made some of the changes you
| recommended?
| whalesalad wrote:
| Testament to the insane power and efficiency of Apple silicon
| aeyes wrote:
| Seems to work only in Safari, using Chrome I can't scroll past
| the first quarter of the page and CPU is at 100%.
| pudquick wrote:
| Runs buttery smooth on my M2 here in Safari on macOS 14.2.1
|
| Tried them out in Chrome and they're mostly all the same though
| I do notice a slight jitter to the rendering in the smoke
| example.
| seb1204 wrote:
 | Maybe it is one of the many other tabs you have open
| dplavery92 wrote:
 | I was encountering the same problem on my Intel MBP, and per
 | another comment here, found that switching from Chrome to
 | Safari lets me view the whole page, and view it smoothly,
 | without my CPU utilization spiking or my fans spinning up.
| lelandbatey wrote:
| Works well for me on my Intel MBP, though I'm using Firefox
| inside of Ubuntu running inside of Parallels; maybe I'm not
| hitting the same issue that you are. The author may have put in
| a new "only runs when you click" setting since you wrote your
| original comment.
| andygeorge wrote:
| Same hardware(s) as you, smooth and 0 issues. Pebkac
| jhallenworld wrote:
| Runs fine in Ubuntu Linux using Firefox, but not in Google
| Chrome on my Lenovo Core i7-12700H 32GB laptop with Nvidia
| T600.
| bee_rider wrote:
| They mention simulating fire and smoke for games, and doing fluid
| simulations on the GPU. Something I've never understood, if these
| effects are to run in a game, isn't the GPU already busy? It
| seems like running a CFD problem and rendering at the same time
| is a lot.
|
| Can this stuff run on an iGPU while the dGPU is doing more
| rendering-related tasks? Or are iGPUs just too weak, better to
| fall all the way down to the CPU.
| photoGrant wrote:
 | Welp. PhysX math used to run on a dedicated GPU of your choice.
 | I remember assigning it (or realising this was happening) while
 | playing Red Faction, with its forever-destructing walls.
 |
 | Almost Minecraft, but with rocket launchers on Mars
| bee_rider wrote:
| I remember PhysX, but my feeling at the time was "yeah I
| guess NVIDIA would love for me to buy two graphics cards." On
| the other hand, processors without any iGPUs are pretty rare
| by now.
| dexwiz wrote:
 | You don't have to be gaming to use a GPU. Plenty of rendering
 | software has a GPU mode now. But writing a GPU algorithm is
 | often quite different from writing a CPU simulation algorithm,
 | because it has to be highly parallelized.
| Valgrim wrote:
| Now that LLMs run on GPU too, future GPUs will need to juggle
| between the graphics, the physics and the AI for NPCs. Fun
| times trying to balance all that.
|
| My guess is that the load will become more and more shared
| between local and remote computing resources.
| teh_infallible wrote:
| Maybe in the future, personal computers will have more than
| one GPU, one for graphics and one for AI?
| MaxBarraclough wrote:
| Many computers already have 2 GPUs, one integrated into the
| CPU die, and one external (and typically enormously more
| powerful).
|
| To my knowledge though it's very rare for software to take
| advantage of this.
| softfalcon wrote:
| In high performance scenarios, the GPU is running full
| blast while the CPU is running at full blast just feeding
| data and pre-process work to the GPU.
|
| The GPU is the steam engine hurtling forward, the CPU is
| just the person shoveling coal into the furnace.
|
 | Using the integrated GPU heats up the main die where the
 | CPU is, because they live together on the same chip. The
 | die heats up, the CPU thermal-throttles, the CPU stops
 | efficiently feeding data to the GPU at max speed, and the
 | GPU slows down from underutilization.
|
| In high performance scenarios, the integrated GPU is
| often a waste of thermal budget.
| MaxBarraclough wrote:
| Doesn't this assume inadequate cooling? A quick google
| indicates AMD's _X3D_ CPUs begin throttling around
 | 89degC, and that it's not overly challenging to keep
| them below 80 even under intense CPU load, although
| that's presumably without any activity on the integrated
| GPU.
|
| Assuming cooling really is inadequate for running both
| the CPU cores and the integrated GPU: for GPU-friendly
| workloads (i.e. no GPU-unfriendly preprocessing
| operations for the CPU) it would surely make more sense
| to use the integrated GPU rather than spend the thermal
| budget having the CPU cores do that work.
| softfalcon wrote:
| What the specs say they'll do and what they actually do
| are often very different realities in my experience.
|
 | I've seen thermal throttling happening at 60degC because the
 | chip is cool overall but one or two cores are maxed out.
 | Which is common in game dev, with your primary thread feeding
 | a GPU with command buffer queues and another scheduling the
 | main game loop.
|
 | Even with water cooling or the high-end air cooling on my
 | server blades, I see that, long term, the system just hits
 | a trade-off point of ~60-70degC and ~85% max CPU clock,
 | even when the cooling system is industry grade, loud as
 | hell, and has an HVAC unit backing it. Probably part of
 | why scale-out is so popular for distributing load.
|
 | When I give real work to any iGPUs on these systems, I
| see the temps bump 5-10degC and clocks on the CPU cores
| drop a bit. Could be drivers, could be temp curves, I
| would think these fancy cooling systems I'm running are
| performing well though. _shrug_
| softfalcon wrote:
| > Something I've never understood, if these effects are to run
| in a game, isn't the GPU already busy?
|
 | Short answer: No, it's not "already busy". GPUs are so
 | powerful now that you can do physics, fancy render passes,
 | fluid sims, "Game AI" unit pathing, and more, at 100+ FPS.
 |
 | Long answer: You have a "frame budget", which is the amount of
 | time between rendering the super fast "slide show" of frames.
 | Depending on the target frame rate, that gives you roughly 7 ms
 | (144 FPS) to 33 ms (30 FPS) of computation to update state and
 | render the next frame.
|
| That could be moving units around a map, calculating fire
| physics, blitting terrain textures, rendering verts with
| materials. In many game engines, you will see a GPU doing
| dozens of these separate computations per frame.
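 |
 | Concretely (rough, made-up numbers just to illustrate the
 | budget):
 |
 |     # time available per frame at a few target frame rates
 |     for fps in (30, 60, 144):
 |         print(f"{fps:>3} FPS -> {1000.0 / fps:5.1f} ms per frame")
 |
 |     # e.g. 4 ms unit pathing + 3 ms fluid sim + 8 ms rendering
 |     # ~= 15 ms, which still fits a 16.7 ms (60 FPS) budget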
|
 | GPUs are basically just a secondary computer attached to your
| main computer. You give it a bunch of jobs to do every frame
| and it outputs the results. You combine results into something
| that looks like a game.
|
| > Can this stuff run on an iGPU while the dGPU is doing more
| rendering-related tasks?
|
| Almost no one is using the iGPU for anything. It's completely
| ignored because it's usually completely useless compared to
| your main discrete GPU.
| bee_rider wrote:
| It looks like the GPU is doing most of the work... from that
| point of view when do we start to wonder if the GPU can
| "offload" anything to the whole computer that is hanging off
| of it, haha.
| softfalcon wrote:
| > It looks like the GPU is doing most of the work
|
| Yes. The GPU is doing most of the work in a lot of modern
| games.
|
| It isn't great at everything though, and there are
| limitations due to its architecture being structured almost
| solely for the purpose of computing massively parallel
| instructions.
|
| > when do we start to wonder if the GPU can "offload"
| anything to the whole computer that is hanging off of it
|
| The main bottleneck for speed on most teams is not having
| enough "GPU devs" to move stuff off the CPU and onto the
| GPU. Many games suffer in performance due to folks not
| knowing how to use the GPU properly.
|
| Because of this, nVidia/AMD invest heavily in making
| general purpose compute easier and easier on the GPU. The
| successes they have had in doing this over the last decade
| are nothing less than staggering.
|
 | Ultimately, the way it's looking, GPUs are trying to
| become good at everything the CPU does and then some. We
| already have modern cloud server architectures that are 90%
| GPU and 10% CPU as a complete SoC.
|
| Eventually, the CPU may cease to exist entirely as its
| fundamental design becomes obsolete. This is usually called
| a GPGPU in modern server infrastructure.
| vlovich123 wrote:
| I'm pretty sure CPUs destroy GPUs at sequential
| programming and most programs are written in a sequential
| style. Not sure where the 90/10 claim comes from but
| there's plenty of cloud servers with no GPU installed
| whatsoever and 0 servers without a CPU.
| softfalcon wrote:
| Yup, and until we get a truly general purpose compute GPU
| that can handle both styles of instruction with automated
| multi-threading and state management, this will continue.
|
| What I've seen shows me that nVidia is working very hard
| to eliminate this gap though. General purpose computing
| on the GPU has never been easier, and it gets better
| every year.
|
| In my opinion, it's only a matter of time before we can
| run anything we want on the GPU and realize various speed
| gains.
|
| As for where the 90/10 comes from, it's from the emerging
| architectures for advanced AI/graphics compute like the
| DGX H100 [0].
|
| [0] https://www.nvidia.com/en-us/data-center/dgx-h100/
| kqr wrote:
| This is somewhat reassuring. A decade ago when clock
| frequencies had stopped increasing and core count started
| to increase I predicted that the future was massively
| multicore.
|
 | Then the core count stopped increasing too -- or so it seems
 | if you look in the wrong place! It stalled in CPUs, but the
 | cores moved to GPUs.
| TillE wrote:
| A game is more than just rendering, and modern games will
| absolutely get bottlenecked on lower-end CPUs well before
| you reach say 144 fps. GamersNexus has done a bunch of
| videos on the topic.
| softfalcon wrote:
| You are not wrong that there are many games that are
 | bottlenecked on lower-end CPUs.
|
| I would argue that for many CPU bound games, they could
| find better ways to utilize the GPU for computation and
| it is likely they just didn't have the knowledge, time,
| or budget to do so.
|
| It's easier to write CPU code, every programmer can do
| it, so it's the most often reached for tool.
|
| Also, at high frame rates, the bottleneck is frequently
| the CPU due to it not feeding the GPU fast enough, so you
| lose frames. There is definitely a real world requirement
| of having a fast enough CPU to properly utilize a high
| end video card, even if it's just for shoving command
| buffers and nothing else.
| vlovich123 wrote:
| > No one is using the iGPU for anything. It's completely
| ignored because it's usually completely useless compared to
| your main discrete GPU.
|
 | My understanding is that modern iGPUs are actually quite
 | powerful.
| I think the reason no one does this is that the software
| model isn't actually there/standardized/able to work cross
| vendor since the iGPU and the discrete card are going to be
| different vendors typically. There's also little motivation
| to do this because not everyone has an iGPU which dilutes the
| economy of scale of using it.
|
| It would be a neat idea to try to run lighter weight things
| on the iGPU to free up rendering time on the dGPU and make
| frame rates more consistent, but the incentives aren't there.
| softfalcon wrote:
| I agree the incentives aren't there. Also agree that it is
| possible to use the integrated GPU for light tasks, but
| only light tasks.
|
 | In the high performance scenarios where all three are present
 | (discrete GPU, integrated GPU, and CPU) and we try to use
 | the integrated GPU alongside the CPU, it often causes
 | thermal throttling on the shared die between iGPU and CPU.
 |
 | This slows the CPU down from executing well, keeping up
 | with state changes, and sending the data needed to keep the
 | discrete GPU utilized. In short: don't warm up the CPU, we
 | want it to stay cool; if that means not doing iGPU stuff,
 | don't do it.
|
 | When we have multiple discrete GPUs available (render
 | farm), this on-die thermal bottleneck goes away, and there
 | are many render pipelines that are made to handle hundreds,
 | even thousands of simultaneous GPUs working on a shared
 | problem set of diverse tasks, similar to trying to utilize
 | both iGPU and dGPU on the same machine but bigger.
|
| Whether or not to use the iGPU is less about scheduling and
| more about thermal throttling.
| johnnyanmac wrote:
| In theory it's perfectly possible to do all you describe in
| 8ms (i.e. VR render times). In reality we're
|
 | 1. still far from properly utilizing modern graphics APIs as
 | it is. Some of the largest studios are close, but knowledge is
 | tight-lipped in the industry.
|
| 2. even when those top studios can/do, they choose to focus
| more of the budget on higher render resolution over adding
| more logic or simulation. Makes for superficially better
| looking games to help sell.
|
| 3. and of course there are other expensive factors right now
| with more attention like Ray traced lighting which can only
| be optimized so much on current hardware.
|
| I'd really love to see what the AA or maybe even indie market
| can do with such techniques one day. I don't have much faith
| that AAA studios will ever prioritize simulation.
| pbowyer wrote:
| I'm really impressed by the output of the distill.pub template
| and page building system used for this article. It's a shame it
| was abandoned in 2021 and is no longer maintained.
| jvans wrote:
| Does anyone have any recommendations for a former math major
| turned SWE to get into CFD simulations? I find this material
| fascinating but it's been a while since I've done any vector
| calculus or PDEs so my math is very rusty.
| chefandy wrote:
| If you're more interested in the physics simulation for
| research, I can't help ya. However, SideFX Houdini is tough to
| beat if you're going more for entertainment-focused
| simulations.
|
| https://www.youtube.com/watch?v=zxiqA8_CiC4
|
| Their free non-commercial "Apprentice" version is only limited
| in its rendering and collaboration capabilities. It's pretty...
| uh... deep though. Coming from software and moving into this
| industry, the workflow for learning these sorts of tools is
| totally different. Lots of people say Houdini is more like an
| IDE than a 3D modelling program, and I agree in many ways.
 | Rather than using the visual tools like in, say, Blender, it's
 | almost entirely based on creating a network of nodes and
 | modifying attributes and parameters. You can do most stuff in
 | Python more cleanly than in other packages like 3ds Max, though
 | it isn't compiled, so performance suffers in big sims. Their own
 | C-like language, VEX, is competent, and there's even a more
 | granular version of their node system for the finer work with
 | more complex math and such. It's almost entirely a data-
 | oriented workflow on the technical side.
|
| However, if you're a "learn by reading the docs" type, you're
| going to have to learn to love tutorials rather quickly. It's
| very, very different from any environment or paradigm I've
| worked with, and the community at large, while generally
| friendly, suffers from the curse of expertise big-time.
| jbverschoor wrote:
| Good explanation why CG explosions suck:
| https://www.youtube.com/watch?v=fPb7yUPcKhk
| randyrand wrote:
| this is so useful thank you!
| riidom wrote:
| While not the main point of the article, the introductory premise
 | is a bit off IMO. When you choose simulation, you trade artistic
 | control for painful negotiation via an (often overwhelming)
 | number of controls.
 |
 | For a key scene like the Balrog, you will probably never opt for
 | simulation; you'll want full control over every frame instead.
 |
 | To stay with the Tolkien fantasy example: a landscape scene with
 | a nice river taking a lot of bends, some rocks in it, the
 | occasional happy fish jumping out of the water - that would suit
 | a simulation much better.
| johnnyanmac wrote:
| Not impossible to do both, but very few tools are built where
| simulation comes for "free". At least not free and real-time as
| of now. Maybe one day.
|
| But I agree with you. You'd ideally reserve key moments for
| full control and leave simulation for the background polish
| (assuming your work isn't bound by a physically realistic
| world).
| holoduke wrote:
 | Which will be quicker: real-time ray-traced, volumetric,
 | super-realistic particle effects, or some kind of
 | diffusion-model, shader-like implementation?
| Phelinofist wrote:
| I recently watched this video about implementing a simple fluid
| simulation and found it quite interesting:
| https://www.youtube.com/watch?v=rSKMYc1CQHE
| nox100 wrote:
| This is very nice! Another person explaining this stuff is "10
| Minute Physics"
|
| https://matthias-research.github.io/pages/tenMinutePhysics/i...
___________________________________________________________________
(page generated 2023-12-19 23:00 UTC)