[HN Gopher] A trip through the Graphics Pipeline
___________________________________________________________________
A trip through the Graphics Pipeline
Author : shkhuz
Score : 119 points
Date : 2022-12-13 13:58 UTC (1 day ago)
(HTM) web link (fgiesen.wordpress.com)
(TXT) w3m dump (fgiesen.wordpress.com)
| billfruit wrote:
| Why do graphics APIs, especially 3D ones, still remain very much
| low level? I think 3D graphics is still not as ubiquitous as it
| could be, perhaps due to complex and low-level APIs.
| pjmlp wrote:
| Anyone who wants to explore graphics programming should start
| with middleware; 20 years ago we would use something like Open
| Inventor or VTK for that purpose.
|
| No need to always start by learning on how to render a triangle
| or a rotating cube from scratch.
| corysama wrote:
| Low-level APIs are used to build lots of varied high-level
| APIs.
|
| They are also used to build stuff like
| https://www.adriancourreges.com/blog/2016/09/09/doom-2016-gr...
| and https://mamoniem.com/behind-the-pretty-frames-death-
| strandin... which really cannot be built with non-specific
| high-level APIs.
| Agentlien wrote:
| More specifically: This type of article is typically based on
| render captures using graphics debuggers such as RenderDoc.
|
| Graphics debuggers themselves are a great example of programs
| written using low level graphics APIs.
| dahart wrote:
| It's not just remaining low level, the trend has been moving
| steadily toward lower level. The primary reason is performance.
| It's hard to squeeze the best performance out of the high level
| APIs. Another reason is that the people who care the most
| and get involved in the API and standards committees are people
| who know the APIs and their limitations inside and out, and
| care deeply about performance; people like game engine render
| devs, browser developers, OS developers, and hardware
| developers. And so newcomers and the ease of learning these
| APIs aren't well represented. On one hand, there's a valid
| argument to be made that people can write wrapper libraries if
| they want easy-to-use libraries. On the other hand, some of us
| really miss how easy and fun using SGI's GL library was before
| it turned into OpenGL.
| rwbt wrote:
| Earlier versions of GL were indeed a lot of fun and easy to
| get started with, without much of a learning curve, but it
| was also very easy to get into a pickle and blow your foot
| off once the project got reasonably complex, with everything
| being state in the context.
| Agentlien wrote:
| OpenGL 4 was a huge shift and felt like such an improvement
| once you got used to it. It was a bit more work to set
| things up, but it was so much faster and had a lot less
| messy global state.
| rwbt wrote:
| Yep, GL4 with DSA was such a big improvement.
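|
| For anyone who hasn't seen it, a rough sketch of the
| difference (assuming an OpenGL 4.5 context with a function
| loader already set up; w, h and pixels are placeholders):
|
|     // Pre-DSA: bind-to-edit, mutating global context state.
|     GLuint tex;
|     glGenTextures(1, &tex);
|     glBindTexture(GL_TEXTURE_2D, tex);  // changes context state
|     glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
|                     GL_LINEAR);
|     glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
|                  GL_RGBA, GL_UNSIGNED_BYTE, pixels);
|
|     // GL 4.5 DSA: operate on the object directly, no binding.
|     GLuint tex2;
|     glCreateTextures(GL_TEXTURE_2D, 1, &tex2);
|     glTextureParameteri(tex2, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
|     glTextureStorage2D(tex2, 1, GL_RGBA8, w, h);
|     glTextureSubImage2D(tex2, 0, 0, 0, w, h,
|                         GL_RGBA, GL_UNSIGNED_BYTE, pixels);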
| pjmlp wrote:
| Still is much easier than Vulkan.
| Sharlin wrote:
| Classic OpenGL is anything but low level, and quite fun to
| learn. But it turns out it's a poor model for modern hardware
| (as opposed to early 90s SGI graphics workstations with
| architectures very different from modern GPUs), which resulted
| in impedance mismatches that made writing performant drivers
| (and performant client code) quite difficult. So modern
| graphics APIs went down an abstraction layer or two to make it
| easier to write code that corresponds to how GPUs actually
| work. But programming against something like raw Vulkan
| requires a vast amount of boilerplate code (that used to be
| handled by the driver) to get anything on screen, so a
| reasonable middle ground probably looks something like OpenGL
| 4.
| trap_goes_hot wrote:
| Products in the game space heavily differentiate themselves
| from one another by "showing more" data (textures, models,
| detail, etc.), "doing more" with data (massively multiplayer,
| more physics, AI, sound), or "doing it faster" (more FPS, lower
| latency, etc.).
|
| Local compute resources are finite, and if you want to target
| the widest market, you need to fully utilize mainstream
| hardware, which means anyone willing to do the grunt work
| ends up with a higher-quality product.
| zokier wrote:
| The high-level apis are stuff like Unity/Unreal/Godot.
| mkl95 wrote:
| > This series is intended for graphics programmers that know a
| modern 3D API (at least OpenGL 2.0+ or D3D9+)
|
| In other words, it should still be relevant today.
| nicopappl wrote:
| I've seen this still being recommended in 2021. I personally
| read it last year and it was definitely helpful.
|
| It's just excellent if you want to understand what is going on
| under the hood, which is necessary if you are writing anything
| more than a toy graphics program, or want to understand the
| output of GPU debugging/profiling tools from Nvidia and AMD.
| Agentlien wrote:
| For anyone interested in computer graphics this is an absolutely
| wonderful series which is still very relevant despite being 11
| years old.
|
| As an aside, I mentioned this series earlier today on the
| Graphics Programming Discord server and now it's on the front
| page of HN?
|
| _edit: on the other hand, searching the server it seems this
| article is linked there every few days, so it might just be
| coincidence_
| psychphysic wrote:
| Possibly the wrong place/person to ask, but is there a resource
| on how to MacGyver calculations out of a graphics card?
| Agentlien wrote:
| Definitely not the wrong place or person, but what do you
| mean by MacGyver in this context?
|
| If you want to use it for general purpose computation not
| tied to rendering I would suggest CUDA. If you want to play
| with rendering try shadertoy.com
| psychphysic wrote:
| Thanks for replying!
|
| I'll try shadertoy.com
|
| Basically, how would one use a GPU to compute something
| without CUDA/OpenCL? Mostly out of curiosity, but say I
| wanted to use a PS2's graphics chip to perform a
| calculation. How was/is that actually done?
| 33985868 wrote:
| TL;DR yes and no (thanks to DSPs, technically some were
| an integral part of the GPU):
|
| In the era of semi-programmable graphics pipelines, which
| includes the N64, PS2 and GameCube, you would not be able
| to solve general-purpose problems using only that pipeline:
| given a handful of colour-mixing formulas, registers,
| textures and vertices to control the pixels written back to
| memory, I fail to come up with any tangible problem you
| could hack them into solving.
|
| However, what the 3D generation of video game consoles
| had to bundle with the hardware in order to process
| complex scenes at interactive frame rates were digital
| signal processors (DSPs). Think of them as regular
| processors, but with an extra vector unit, allowing you
| to crunch enormous amounts of data in parallel (single
| instruction, multiple data, or SIMD). This was essential
| for performing geometry transformations, which benefit
| greatly from parallelism, in order to build the graphics
| commands passed down to the rasteriser.
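|
| As a rough modern analogy (plain C++ with SSE intrinsics
| rather than console DSP microcode, just to illustrate
| transforming vertices four at a time; transform4 is a
| made-up name):
|
|     #include <xmmintrin.h>  // SSE intrinsics
|
|     // Transform 4 vertices at once (structure-of-arrays
|     // layout) by a row-major 4x4 matrix; w is assumed to
|     // be 1. out[r] points to the 4 results for row r.
|     void transform4(const float m[4][4],
|                     const float* xs, const float* ys,
|                     const float* zs, float* out[4])
|     {
|         __m128 x = _mm_loadu_ps(xs);
|         __m128 y = _mm_loadu_ps(ys);
|         __m128 z = _mm_loadu_ps(zs);
|         for (int r = 0; r < 4; ++r) {
|             __m128 acc = _mm_set1_ps(m[r][3]);  // * w (== 1)
|             acc = _mm_add_ps(acc,
|                 _mm_mul_ps(_mm_set1_ps(m[r][0]), x));
|             acc = _mm_add_ps(acc,
|                 _mm_mul_ps(_mm_set1_ps(m[r][1]), y));
|             acc = _mm_add_ps(acc,
|                 _mm_mul_ps(_mm_set1_ps(m[r][2]), z));
|             _mm_storeu_ps(out[r], acc);
|         }
|     }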
|
| I am unsure if the SEGA Saturn's and PS1's vector-
| accelerated instructions are sufficiently general-purpose
| to allow them to perform any computation you wish to do,
| but at least starting with the Nintendo 64, you could
| write microcode to accelerate any task you'd like,
| provided you were brave enough to deal with the harsh
| memory constraints of microcode and the obscure
| documentation.
|
| To give concrete examples, the PS1 came with a dedicated
| chip (the MDEC, or motion decoder) to decode MPEG video
| frames from the CD directly into the VRAM to display on
| screen.
|
| While the N64 doesn't have dedicated video decoding
| chips, its signal processor was designed in such a way to
| allow microcode to do an equivalent job (not to mention
| YUV texture decoding on the rasteriser's side).
|
| So this did not stop a talented Resident Evil 2 developer,
| fresh out of school, from writing MPEG video decoder
| microcode to cram two CDs' worth of video data (around 1
| gigabyte) onto a 64-megabyte cartridge, of course with
| other smart decisions like data deduplication and further
| compression/interlacing.
|
| Another example was Rare's developers writing MP3 audio
| decoder microcode to store large amounts of dialogue.
|
| And finally, in recent times, a developer managed to
| write H.264 video decoder microcode for a machine that
| was released almost a decade prior to this codec's birth.
|
| You can also accelerate physics and anything else that
| would benefit from SIMD, really. In fact, while more
| modern CPUs integrate SIMD instructions into their tool
| chest, which compilers can take advantage of, the PS3's
| Cell processor was a brief throwback to the old hardcore
| ways before GPGPU became king.
|
| You could almost treat the DSPs as the compute units of
| modern GPU architectures: they definitely processed the
| vertices, and nothing stops you from adding a "vertex
| shader" pass. However, because a DSP is not directly
| integrated into the graphics pipeline, it's harder to
| emulate a true pixel shader; you might be limited to
| full-screen pixel shading, since there's no feedback from
| the rasteriser about which pixels exactly get written to
| memory.
| psychphysic wrote:
| Amazing answer thank you so much!
| Agentlien wrote:
| Thanks for giving such a thorough explanation with
| examples.
|
| I know most of this stuff anecdotally but I didn't start
| working with 3D until programmable shaders were already
| becoming commonplace.
| 33985868 wrote:
| You're very welcome.
|
| Computer graphics has a very interesting history, and
| like many things, there's a lot to learn from studying
| it. Surprisingly, many techniques and principles are
| still relevant to this day.
|
| Thankfully, the Internet is full of documentation, post-
| mortem accounts and reference implementations to learn
| more about all this.
|
| Here's a bunch of random noteworthy things and references
| I forgot to link (sorry for potentially digging rabbit
| holes):
|
| The best information tends to come from official
| documentation detailing almost everything you want to
| know about the inner workings of systems. Additionally,
| unofficial community documentation can also be of great
| quality and complement official sources.
|
| The architecture overview posts from Rodrigo Copetti
| (https://www.copetti.org/writings/consoles/) pack a lot
| of accurate information at a glance, and are a great
| starting point if a specific topic piques your interest.
|
| The Resident Evil 2 microcode post-mortem:
| https://ultra64.ca/files/other/Game-Developer-
| Magazine/GDM_S...
|
| Improved lighting, reverberation, Dolby surround and MP3
| microcodes from Rare: https://youtu.be/jcIupBUAy98?t=41
|
| H.264 decoding microcode: https://www.reddit.com/r/n64/co
| mments/gpwxx0/my_dragons_lair...
|
| The Reality Signal Processor's programming guide:
| https://ultra64.ca/files/documentation/silicon-
| graphics/SGI_...
|
| Emulators' progress reports can reveal a lot about the
| details and intricacies required for accurately
| replicating these systems' features: https://dolphin-
| emu.org/blog/ is one such amazing source of information,
| but all other major emulation efforts have equally
| interesting content.
|
| Transparency sorting of surfaces
| (https://en.wikipedia.org/wiki/Order-
| independent_transparency) is a hard problem to solve for
| traditional scanline renderers (most PC/console GPUs), to
| the point that even today's releases can ship with
| somewhat obvious rendering errors.
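|
| The usual workaround on such hardware is to draw opaque
| geometry first, then sort transparent surfaces back to
| front each frame and hope they don't interpenetrate.
| Roughly (Draw is a made-up struct, not any engine's API):
|
|     #include <algorithm>
|     #include <vector>
|
|     struct Draw { float view_depth; /* mesh, material... */ };
|
|     // Typical per-frame transparency handling on immediate-
|     // mode GPUs: sort transparent draws back to front.
|     void submit_transparent(std::vector<Draw>& transparent) {
|         std::sort(transparent.begin(), transparent.end(),
|                   [](const Draw& a, const Draw& b) {
|                       return a.view_depth > b.view_depth;
|                   });
|         // ...issue the draws in this order with blending
|         // on and depth writes off.
|     }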
|
| On the other hand, tiled renderers (used in the
| Dreamcast, mobile hardware, and Apple silicon) are able
| to solve this problem due to their very nature, albeit in
| exchange for a different set of drawbacks; the completely
| different hardware approach is a nice read
| (https://en.wikipedia.org/wiki/Tiled_rendering).
|
| Someone once shared a video demonstrating a 3D package
| from ancient Lisp machines
| (https://youtu.be/gV5obrYaogU?t=30), and it was almost
| shocking to see how many things were familiar and done
| right from a modern perspective.
|
| There's also reimplementing new techniques on old
| hardware: for example, someone implemented real-time
| shadows and toon shading on the N64
| (https://youtu.be/VqDAxcWnq3g).
|
| For fun, you can grab RenderDoc (https://renderdoc.org/)
| and copies of your favourite games to analyse their
| frames (even via emulation): see how developers implement
| or fake visual effects and generally how these games
| render their world.
|
| For instance, Dolphin emulates the GameCube's semi-
| programmable texture environment unit (TEV) via a pixel
| shader, and its shader source code is directly visible
| and editable, with the resulting output shown in the
| texture viewer. With the aid of textures, you can
| implement refraction, reflection and heat haze, among
| other effects.
|
| Retro Game Mechanics Explained talks about both hardware
| and software concepts:
| https://www.youtube.com/@RGMechEx/videos
| Agentlien wrote:
| The PS2 was before the GPGPU age and its GPU is fixed-
| function, so it's not suitable for general calculations.
| Your problem would have to align perfectly with the
| computations done for regular rendering.
|
| If we instead look at something more modern, like an
| early GeForce with programmable shaders, what was
| typically done was to set up regular rendering of a
| full-screen quad and write a pixel shader which performs
| your desired computations instead of the actual lighting
| calculations.
|
| APIs such as CUDA and OpenCL were introduced to allow
| people who weren't making games to do GPU computations
| without pretending that they were colour calculations.
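|
| To make that concrete, here's a condensed sketch of the
| "GPGPU via graphics" trick: element-wise multiply of two
| float arrays by drawing a full-screen quad into an
| offscreen framebuffer. It's written against modern-ish
| OpenGL rather than the register combiners of that era;
| create_float_texture, compile_program and
| draw_fullscreen_quad are hypothetical helpers (texture
| upload, shader compilation with a pass-through vertex
| shader, and a two-triangle quad), and error handling is
| omitted:
|
|     // Fragment shader: one output "pixel" = one result.
|     static const char* kFragSrc = R"(
|         #version 330 core
|         uniform sampler2D a, b;  // inputs packed as textures
|         in vec2 uv;
|         out vec4 result;
|         void main() { result = texture(a, uv) * texture(b, uv); }
|     )";
|
|     void gpgpu_multiply(int w, int h, const float* A,
|                         const float* B, float* out) {
|         GLuint texA = create_float_texture(w, h, A);  // hypothetical
|         GLuint texB = create_float_texture(w, h, B);
|         GLuint texC = create_float_texture(w, h, nullptr);
|
|         // Render target: an FBO backed by the output texture.
|         GLuint fbo;
|         glGenFramebuffers(1, &fbo);
|         glBindFramebuffer(GL_FRAMEBUFFER, fbo);
|         glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
|                                GL_TEXTURE_2D, texC, 0);
|
|         GLuint prog = compile_program(kFragSrc);  // hypothetical
|         glUseProgram(prog);
|         glActiveTexture(GL_TEXTURE0);
|         glBindTexture(GL_TEXTURE_2D, texA);
|         glActiveTexture(GL_TEXTURE1);
|         glBindTexture(GL_TEXTURE_2D, texB);
|         glUniform1i(glGetUniformLocation(prog, "a"), 0);
|         glUniform1i(glGetUniformLocation(prog, "b"), 1);
|
|         glViewport(0, 0, w, h);
|         draw_fullscreen_quad();  // hypothetical: two triangles
|
|         // "Reading the image back" really fetches the results.
|         glReadPixels(0, 0, w, h, GL_RED, GL_FLOAT, out);
|     }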
| psychphysic wrote:
| Thanks for the detailed answer!
| joenot443 wrote:
| Can you share a link or some other info about said Discord?
| Sounds like a comfy place.
| pixelpoet wrote:
| I left the server after 2 years or something because I
| couldn't stand the abysmal signal:noise ratio (of many
| kinds), and that's as someone who dedicated their life to CG.
| Just my 2c.
| Agentlien wrote:
| It is. Come see for yourself: https://discord.gg/XpMH8Dn5
| bmurphy1976 wrote:
| Given that this is from 2011 and that is an eternity in computer
| graphics, how applicable is this to the modern era?
| 4RealFreedom wrote:
| Reminds me of when I first started learning about GPUs. They've
| come a long way. I think back then it was basically
| rasterization and culling. It's been a long time, though.
| antegamisou wrote:
| The fundamental math is always the same so it's still very
| relevant.
| sharpneli wrote:
| It's very relevant. The parts shown here haven't changed. The
| biggest things we've gotten are just more control over command
| submission, dependencies, etc. Otherwise the rasterization
| pipeline still chugs along exactly as shown here. (Yeah, you
| might want to use mesh shaders, but they're not always
| supported. And there is RT stuff too. But what's in this
| article also works fine.)
| superjan wrote:
| Agreed, but don't forget about compute shaders: their
| programmer audience extends way beyond 3D, whereas RT with
| all its current limitations is only interesting for AAA game
| studios.
| wg0 wrote:
| Interesting. I came across this previously too.
|
| Not nitpicking, genuinely asking: how much of this is still
| relevant in the context of modern GPUs such as the RTX 3090,
| and how much has changed?
|
| I'd guess ray tracing etc. would be a new step somewhere in
| between?
| dahart wrote:
| OptiX dev here. Ray tracing is more or less completely separate
| from the raster pipeline, which is what this article is
| discussing. There's some conceptual overlap when it comes to
| shading, and both pipelines share compute resources, but the
| "pipeline" part doesn't overlap much (and personally I think
| the reasons why are interesting). The RTX cards have a separate
| ray tracing core in addition to the raster processing hardware,
| so both pipelines still exist. What this means is Fabian's
| article is still absolutely relevant to the raster pipeline,
| it's just missing information about today's ray tracing
| pipeline.
| Agentlien wrote:
| Most of this is still perfectly relevant. Looking over this
| again recently I was surprised how little had changed since.
| superjan wrote:
| It's nitpicking.
| sharpneli wrote:
| RT is not a different step, just like compute shaders are not
| a different step in the graphics pipeline; they're an
| independent thing. RT dispatches are their own concept and
| have their own rules.
| ryandrake wrote:
| The title is likely a nod to Jim Blinn's highly influential book
| (more like a collection of articles), _A Trip Down The Graphics
| Pipeline_ [1]. That was the first book I read on 3D graphics that
| helped me to actually intuitively understand fundamental 3D
| graphics concepts.
|
| 1: https://www.amazon.com/Jim-Blinns-Corner-Graphics-
| Pipeline/d...
| monocasa wrote:
| (2011)
| jcalabro wrote:
| This is from 2011. Fabian is incredible also, definitely give it
| a read!
___________________________________________________________________
(page generated 2022-12-14 23:01 UTC)