[HN Gopher] Raytracing on Meteor Lake's iGPU
       ___________________________________________________________________
        
       Raytracing on Meteor Lake's iGPU
        
       Author : rbanffy
       Score  : 123 points
       Date   : 2024-04-16 09:34 UTC (1 day ago)
        
 (HTM) web link (chipsandcheese.com)
 (TXT) w3m dump (chipsandcheese.com)
        
       | lynguist wrote:
       | - How does this compare to the raytracing units in Apple A17
        | Pro/M3 series? They also provide additional eye candy at a
        | relatively large cost, I would say.
       | 
       | - Why are relatively expensive and large GPUs like in the RTX
       | 3070 commonly called "midrange" online? The meaning of "midrange"
       | seems to be creeping upward.
        
         | josephg wrote:
         | > Why are relatively expensive and large GPUs like in the RTX
         | 3070 commonly called "midrange"
         | 
         | Because the 30xx generation of graphics cards includes the
         | 3050, 3060, 3070, 3080 and 3090. The 3070 is right in the
         | middle of the range. Hence, midrange.
        
           | henriquecm8 wrote:
           | And also because it's a generation old. A card like 1080 was
           | high-end during its time, and still works well, but it isn't
           | high-end anymore.
           | 
            | The high-end GPUs right now are the 4090 and 4080.
        
         | jrk wrote:
         | _Within_ the market of discrete gaming GPUs, there has been a
          | hierarchy from "low-" to "high-end" for decades -- since
          | before integrated GPUs even existed. For almost 20 years,
          | things with 192- or 256-bit memory buses have been "mid-range"
          | (vs. 384- or occasionally 512-bit memory buses at the high
         | end, and smaller at the low-end). NVIDIA's "7"-tier GPUs have
         | historically been the top of the mid-range in this world.
         | 
         | Within this world "midrange" has been creeping upward not via
         | big shifts in these fundamental characteristics but via:
         | 
          | 1. Prices increasing steadily across the board, due to
          | shortages and market power
          | 
          | 2. Power budgets (and corresponding board/cooler sizes)
          | increasing across the board
         | 
         | The fundamentals (memory bus width -- still 256-bit; die size
         | and performance relative to the top of the line) remain "mid-
         | range" in exactly this same sense.
        
           | MrBuddyCasino wrote:
            | Expressed differently: the memory bus and chip size of a 4070
            | are equivalent to those of a 3060, NOT a 3070.
           | 
           | Nvidia has shifted its portfolio to the left, so they can
           | charge more for a smaller chip. I suspect this is mostly due
           | to increasing manufacturing costs, not (just) pricing power.
           | 
           | This is also a reason why they are banking on AI upscalers to
           | drive improvements in the future.
        
             | xmodem wrote:
              | Put another way, they took the gains from technology and
              | process improvements and banked them for themselves,
              | releasing a new generation that achieves performance
              | similar to (in some cases worse than) last gen, albeit at
              | an improved power envelope, at the same price as last gen,
              | but with higher margins for themselves.
             | 
             | The fact they can do this speaks to the lack of competition
             | in the GPU market at the midrange and up. Compare this to
             | the CPU market, where we now have Intel and AMD giving it
             | everything to leapfrog each other every 6-9 months.
             | 
             | I don't begrudge nvidia wanting to spend a generation
             | consolidating their market position - that's their right -
             | just as it's mine to look at a GPU that performs roughly
             | the same as one I bought 7 years ago at almost the same
             | price and say "no thanks".
        
               | MrBuddyCasino wrote:
               | > _releasing a new generation that achieves similar
               | performance (in some cases worse) to last gen_
               | 
                | AFAIK the new gen still manages to improve over the old
                | one, albeit modestly. Do you know of an example where
                | that is not the case?
        
               | xmodem wrote:
               | See this GN review:
               | https://youtu.be/WS0sfOb_sVM?t=666&si=Xt62b_2BfQM-fnuH
               | 
               | Specifically, at 4K, the 3060 outperforms the 4060 in
               | Cyberpunk. Most of the charts do show a gain, but I'd
               | describe it as marginal rather than modest.
        
               | MrBuddyCasino wrote:
                | Yeah, the smaller memory interface hurts at 4K; that is
                | unfortunately to be expected. Raw rasterizer/raytracer
                | power shouldn't regress, I think.
        
               | dahart wrote:
               | > a GPU that performs roughly the same as one I bought 7
               | years ago
               | 
               | Out of curiosity, which GPUs are you referring to, and
               | how are you measuring or comparing perf?
        
               | xmodem wrote:
               | 1080Ti vs 4060Ti. Factoring in nvidia's price creep and a
               | worse exchange rate, the 4060Ti costs today roughly what
               | I paid for my 1080Ti 7 years ago.
               | 
                | I'm comparing perf by looking at sites like
                | UserBenchmark and reading/watching reviews of newer
               | cards. I'm running into games I can't play at a level
               | that I'm happy with, but based on reviews I don't think a
               | 4060 or 4060Ti would make a meaningful difference.
        
               | dahart wrote:
                | That's a very tricky comparison at best: 3 gens up but 2
                | product lines down. Comparing 1060 to 4060, or 1080 to
               | 4080 might be more fair, even accounting for the skew in
               | product lines.
               | 
               | I'm not sure how you're doing the price comparison.
               | 1080ti launched at $699 (with inflation that's $870
               | today). The 4060ti launched at $399. Why does that seem
               | the same to you?
               | 
               | We're in a ray tracing thread and for ray tracing,
               | there's no contest, the 4060ti wins there by a long way.
               | 
                | UserBench says today the 4060ti is 17% faster for 20%
                | lower price, which is about 33% better perf per
               | dollar... which interestingly tends to match their user
               | score and sentiment.
               | 
               | https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4060-Ti-
               | vs-...
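                | 
                | Working those numbers through (a rough sketch using the
                | quoted 17%/20% figures): 1.17 / 0.80 ~= 1.46, i.e. ~46%
                | more performance per dollar, or equivalently
                | 1 - 0.80 / 1.17 ~= 0.32, i.e. ~32% fewer dollars per
                | unit of performance - the "about 33%" reading.
                | 
                |     # Rough sketch of the perf-per-dollar arithmetic
                |     # (illustration only; 17%/20% are the quoted figures)
                |     perf_ratio = 1.17     # 4060 Ti ~17% faster...
                |     price_ratio = 0.80    # ...at ~20% lower price
                |     print(perf_ratio / price_ratio)      # ~1.46x perf per dollar
                |     print(1 - price_ratio / perf_ratio)  # ~0.32, i.e. ~32% fewer
                |                                          # dollars per unit of perf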
               | 
               | Also worth noting that some old games are bottlenecked on
               | the software, not on the GPU, so it's not surprising to
               | find a game that was around for the 1080 or before and
               | doesn't get any better with more modern GPUs. Comparing
               | the 1080 to the 4080, fp32 perf went up ~5x, and due to
               | Amdahl's law, game benchmarks tend to see more like ~3x
               | in overall system perf.
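                | 
                | A minimal sketch of that Amdahl's-law estimate (the ~83%
                | fp32-bound fraction below is an assumed figure chosen to
                | make the numbers line up, not a measurement):
                | 
                |     # Amdahl's law: only the fraction p of frame time that
                |     # scales with fp32 throughput benefits from the uplift s.
                |     def overall_speedup(p, s):
                |         return 1.0 / ((1.0 - p) + p / s)
                | 
                |     # If ~83% of frame time is fp32-bound, a 5x fp32 gain
                |     # yields roughly a 3x overall speedup.
                |     print(overall_speedup(0.83, 5.0))  # ~2.98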
        
               | neogodless wrote:
               | Guessing this mostly comes down to exchange rate for you.
               | 
               | As sibling pointed out, the $699 1080 Ti would be $870
               | today in U.S. dollars.
               | 
               | The 4060 Ti is also a remarkably bad product (for the
               | price). If you swap in the 4070, you're looking at a
                | launch MSRP of $599, which is less than the 1080 Ti even
                | before inflation, and about 50% faster (and over 30%
                | faster than the 4060 Ti).
               | 
               | Still not mind-blowing given the nearly 7 years between
               | launches. There's a big delta going up to the 4080, both
               | in price and performance. On UserBenchmark it's "135%"
               | faster than a 1080 Ti, but at a mind-boggling $1200.
        
         | rbanffy wrote:
         | > The meaning of "midrange" seems to be creeping upward.
         | 
         | That has been the case since before I touched my first
         | computer. The first mainframe I touched had 16 megabytes of
         | memory, and my first desktop had 48 kilobytes. Both were
         | midrange in their respective categories (although the mainframe
          | was near the top in memory, it was only average in processing
         | power - the Apple II+ was better than a VIC-20 and performed a
         | little better than a C-64, but was slower than most of the
         | "professional" personal computers).
         | 
         | My work laptop is midrange today and is many, many orders of
         | magnitude more powerful than the supercomputers of that period.
         | 
         | To put things into perspective, my phone runs Unix on a RISC
         | CPU closely coupled with an array processor.
         | 
         | That's Moore's law. It has slowed down a bit, in part from the
          | difficulty of doubling density every couple of years, but also
          | from the fact that even a 5th generation Intel i3 is vastly more
         | capable than what the average user needs. In GPUs it's a little
         | bit different, with games requiring increasingly ludicrous
         | amounts of compute power, pushing the "good enough" range
         | upwards every year.
         | 
         | If you really like to play Pac-Man, an 8-bit CPU and a simple
         | CRTC should suffice.
        
       | kjkjadksj wrote:
       | Do gamers actually want ray tracing? Or is this something like
        | bloom/post effects/motion blur, which is computationally
        | expensive and which gamers who care about their k/d ratio shut
        | off anyway to see the other team more easily?
        
         | jackling wrote:
          | For single player games it's probably still desirable.
         | Cyberpunk 2077 with Ray Tracing looks amazing and enhances the
         | gaming experience. Other than framerate issues, there's no
         | drawback to having it on. For single player games, immersion is
         | important.
        
         | ascagnel_ wrote:
         | The better question is if developers want ray tracing; for
         | them, it can represent a massive time and cost savings in terms
         | of lighting games.
        
           | itishappy wrote:
            | Raytracing is currently slow and not universally supported,
            | so it's used to supplement existing pipelines rather than
            | replace them. It's more work, not less.
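            | 
            | A minimal sketch of why "supplement" means extra work
            | (hypothetical function names, not any real engine API): the
            | raster path still has to ship for hardware without RT
            | support, so an RT pass is an additional, optional layer on
            | top of it.
            | 
            |     # Both paths must exist and be maintained: raster is the
            |     # baseline, RT is layered on top only when available.
            |     def rasterize(scene):
            |         return f"raster({scene})"  # stand-in raster pipeline
            | 
            |     def add_rt_reflections(image, scene):
            |         return image + "+rt"       # stand-in supplemental RT pass
            | 
            |     def render_frame(scene, hw_supports_rt, rt_enabled):
            |         image = rasterize(scene)   # always needed as fallback
            |         if hw_supports_rt and rt_enabled:
            |             image = add_rt_reflections(image, scene)
            |         return image
            | 
            |     print(render_frame("level1", True, True))  # raster(level1)+rt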
        
             | corysama wrote:
             | That's the story with all new tech. Difficult transition to
             | a better tomorrow. Rinse and repeat.
        
               | itishappy wrote:
               | Totally agree! We're currently quite early in the
                | transition, and the tech is not ready to stand on its
                | own, but the future is bright!
        
         | bhewes wrote:
          | For someone who played outside growing up, ray tracing is a
          | nice addition; the baked lighting always drove me crazy in my
          | games. I put it up there with the moment I noticed physics
          | engines stopped feeling like I was floating through a game.
        
         | 127 wrote:
         | Do I want global illumination? Yes, absolutely. Will it make
         | games substantially better? Only marginally. The core
         | experience will still be the gameplay.
         | 
         | Still, the visual spectacle of what some of these products can
         | create is a fantastic experience in itself. It can enhance
         | immersion in a drastic way, and especially for story heavy
         | games, that can act as a direct multiplier on an already good
         | story and structure. Bad games will still never go past being
         | just a tech demo.
        
         | itishappy wrote:
          | I think people who kneecap their graphics settings to eke out
          | the last few milliseconds of latency are a small minority.
        
         | kimixa wrote:
          | From my experience it's the thing gamers enable _after_ getting
          | 120fps "ultra" in rasterization already. Rasterization-based
         | estimations of many of the visual features are _very_ good now,
         | and most people I know tend to have a better end experience
         | with higher frame rate than fixing some of the less-noticeable
         | issues with them.
         | 
         | So it's probably useful if you've got a 4090, but beyond that
          | most try it once, go "Huh, that's neat", and then disable it.
         | 
          | But there's always a chicken-and-egg problem for every new
         | feature - it may make sense to push support even if it's not
         | particularly useful right now, as once it's ubiquitous there
         | may be more use cases for lower total performance cards -
         | "light" ray tracing may be a better solution than some
         | rasterization tricks for things like GI, or even things like AI
         | line of sight or other non-specifically-graphics tasks.
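          | 
          | As a rough sketch of that line-of-sight idea (illustration
          | only, with a made-up scene: axis-aligned boxes as occluders
          | and the standard "slab" ray-vs-box test):
          | 
          |     # Cast a ray from the agent to the target; if any occluder
          |     # box intersects the segment first, line of sight is blocked.
          |     def ray_hits_aabb(origin, direction, box_min, box_max):
          |         t_near, t_far = 0.0, 1.0  # segment parameter range [0, 1]
          |         for o, d, lo, hi in zip(origin, direction, box_min, box_max):
          |             if abs(d) < 1e-9:     # parallel to this slab pair
          |                 if o < lo or o > hi:
          |                     return False
          |             else:
          |                 t0, t1 = (lo - o) / d, (hi - o) / d
          |                 if t0 > t1:
          |                     t0, t1 = t1, t0
          |                 t_near, t_far = max(t_near, t0), min(t_far, t1)
          |                 if t_near > t_far:
          |                     return False
          |         return True
          | 
          |     def has_line_of_sight(agent, target, occluders):
          |         direction = tuple(t - a for a, t in zip(agent, target))
          |         return not any(ray_hits_aabb(agent, direction, bmin, bmax)
          |                        for bmin, bmax in occluders)
          | 
          |     # A wall between the agent and the target blocks visibility:
          |     wall = ((4.0, -1.0, -1.0), (5.0, 1.0, 1.0))
          |     print(has_line_of_sight((0.0, 0.0, 0.0), (10.0, 0.0, 0.0),
          |                             [wall]))  # False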
         | 
         | But as gamedevs have to support the non-RT path anyway, it
         | probably doesn't make sense to develop two separate paths. So
         | it's relegated to "optional" visual features only.
        
         | ErneX wrote:
         | Not every game (or gamer) is of the competitive type.
         | 
          | RT is pretty nice, and more and more games are featuring it.
          | The problem is that, at least on consoles, they cannot go
          | crazy with it, so some games only do reflections, for example,
          | or just shadows, or just global illumination.
         | 
         | Sony is supposedly releasing their Pro model of the PS5 later
         | this year with improved support for RT.
        
       ___________________________________________________________________
       (page generated 2024-04-17 23:02 UTC)