[HN Gopher] AMD Reveals Next-Gen Desktop Processors for Extreme ...
       ___________________________________________________________________
        
       AMD Reveals Next-Gen Desktop Processors for Extreme PC Gaming
        
       Author : doener
       Score  : 120 points
       Date   : 2024-01-10 10:02 UTC (1 day ago)
        
 (HTM) web link (www.amd.com)
 (TXT) w3m dump (www.amd.com)
        
       | dralley wrote:
       | These aren't Zen 5 CPUs, they're Zen 4 chips with better
       | integrated graphics (which gamers likely won't care about anyway)
       | and some AI accelerators (which probably won't see wide adoption
       | for at least another year).
        
         | kkzz99 wrote:
         | Do these AI accelerators even support common implementations
         | for LLM inference or stable diffusion models?
         | 
         | Last time I checked, AMD's support was terrible.
        
           | diggan wrote:
           | I'm guessing no, as these newly announced processors are the
           | first with those accelerators.
           | 
           | > AMD is also bringing the power of a dedicated AI neural
           | processing unit (NPU) to desktop PC processors for the first
           | time with the introduction of Ryzen(tm) AI
        
             | wtallis wrote:
             | They're merely the first _desktop_ processors from AMD with
             | these NPUs. This is last year's laptop silicon repackaged
             | for their desktop socket. They also re-branded the laptop
             | version with new model numbers, because laptop OEMs
             | basically demand new model numbers on an annual cadence.
             | This year their marketing for these chips is heavily
             | emphasizing the NPUs, because they actually have some
             | tooling for them now, but they were basically dead weight
             | when the silicon first shipped.
        
           | zamalek wrote:
           | AMD's support is terrible, if you believe their
           | documentation. ROCm/HIP works just fine (well, with Torch -
           | TensorFlow is shamelessly an NVIDIA shill) on many
           | "unsupported" GPUs if you enable an override envar.
        
             | NekkoDroid wrote:
             | To add a bit more context: I remember reading somewhere
             | (may have been in the Phoronix forum) from an official AMD
             | engineer that "supported" means validated and actively
             | tested in their pipeline, while "unsupported" generally
             | just means "we don't do any or minimal testing on these,
             | they should work and we don't explicitly prevent them from
             | working but we don't guarantee anything" (at least when it
             | comes to same die/gen cards).
             | 
             | In the same post they also wrote that they are gonna look
             | at integrating more of those "unsupported" cards into their
             | suite.
             | 
             | Honestly, I hope they change their wording for this to
             | something like "validated", "supported" and "unsupported"
             | with actual explanations of what each of these means (fully
             | tested, works theoretically, does not work even
             | theoretically).
             | 
             | Edit: I actually found the post I was talking about
             | https://www.phoronix.com/forums/forum/linux-graphics-x-
             | org-d...
        
               | makomk wrote:
               | ROCm is also incredibly fragile and buggy at the best of
               | times, so anything not actively tested by them stands a
               | good chance of not working. Hell, I remember that a while
               | back people were having problems with machine learning
               | code giving garbage results on one of the few consumer
               | GPUs that was officially listed as supported and AMD
               | eventually replying to the bug report and declaring that
               | actually, no they weren't going to support it and they'd
               | remove it from the list rather than try to fix the
               | issue. I think this was back when newer consumer GPUs
               | were genuinely unsupported, as in the code simply wasn't
               | there either. Integrated GPUs have also always had a lot of
               | problems.
               | 
               | There's also questionable OS compatibility. ROCm is Linux-
               | based and has extremely limited and rather experimental
               | Windows support. Their fancy new neural processing unit
               | is Windows-only, tied in with a Microsoft framework, and
               | they don't seem to have any kind of definite plan for
               | supporting it elsewhere. So there's quite possibly no
               | single OS where all the hardware in these chips that could
               | theoretically be used for machine learning and AI actually
               | has even vaguely functioning code.
        
             | wmf wrote:
             | Does ROCm even work on XDNA? I don't think it does?
        
             | jiggawatts wrote:
             | Tensorflow is open source on GitHub. What's stopping AMD
             | engineers from contributing with Pull Requests? A casual
             | search through the open PRs shows nothing of interest being
             | submitted by team red.
        
             | antx wrote:
             | ah yes, the good old HSA_OVERRIDE_GFX_VERSION=10.3.0
             | switcheroo
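For the curious, the switcheroo joked about here is a single environment variable read by the ROCm runtime; a minimal sketch, assuming a same-generation RDNA2 card (e.g. gfx1031/gfx1032) being presented as the validated gfx1030 target:

```shell
# The ROCm runtime picks kernels by GFX target ID. This forces it to treat
# the local GPU as gfx1030 (RX 6800/6900 class). Assumption: the card is a
# same-die-family RDNA2 part; on a genuinely different ISA this can crash
# or silently miscompute, so sanity-check your results.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Then launch the workload from the same shell, e.g.:
#   python train.py
```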
        
           | caycep wrote:
           | as long as PyTorch supports it, does it matter?
        
             | antx wrote:
             | for as long as I'll need to recompile PyTorch for it to
             | adequately support my card, yes, it matters.
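Whether a given torch build actually targets ROCm can be checked at runtime; a small sketch, assuming only that ROCm wheels set `torch.version.hip` to a version string while CUDA/CPU wheels leave it `None`:

```python
import importlib.util

def torch_rocm_info():
    """Return (torch_installed, is_rocm_build).

    Assumption: ROCm builds of PyTorch set torch.version.hip, while
    CUDA/CPU builds leave it as None. ROCm builds still expose devices
    through the torch.cuda API, so code rarely needs to branch further.
    """
    if importlib.util.find_spec("torch") is None:
        return (False, False)
    import torch
    return (True, getattr(torch.version, "hip", None) is not None)

installed, is_rocm = torch_rocm_info()
```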
        
         | coffeebeqn wrote:
         | Extreme integrated GPUs? Marketing department going wild
        
           | asmor wrote:
           | They are enough to run the Ally, and they're closely related
           | to the one in the Steam Deck. So maybe they're extreme in the
           | segment of integrated graphics??
        
           | bonton89 wrote:
           | They're just trying to compete with Intel Extreme Graphics
           | from 2002. Must be they're reaching performance parity.
           | 
           | https://en.wikipedia.org/wiki/List_of_Intel_graphics_process.
           | ..
        
             | seabrookmx wrote:
             | Parity? AMD integrated GPU's have outperformed Intel's for
             | many years.
        
               | dathinab wrote:
                | only APUs like the 5700G, 8700G or the recent Steam Deck
                | APU (among reasonable comparison targets)
                | 
                | The integrated GPUs of the 7000 series (not APUs) are
                | about as minimal as they could be; I think they couldn't
                | have shrunk them more without running into unusual issues
                | (assuming no change to the architecture/design). They are
                | only suited for web browsing, office use-cases and
                | debugging (but then, they were made only for those use-
                | cases). Well, I guess Dead Cells and similar "low
                | requirement/highly optimized" games still run nicely on
                | them at 1080p.
                | 
                | Still nice to have them if you only have such use-cases;
                | saves a ton of money.
        
           | ahoka wrote:
           | They did not say which of the extremes.
        
         | Dalewyn wrote:
         | >they're Zen 4 chips with better integrated graphics (which
         | gamers likely won't care about anyway)
         | 
         | Integrated GPUs are becoming far more appealing to the general
         | consumer, including (especially?) gamers, with how fucking
         | expensive discrete GPUs are getting these days (AMD is part of
         | that problem).
         | 
         | We might just witness dGPUs becoming the sound cards of this
         | decade.
        
           | mirsadm wrote:
           | Extremely unlikely. The types of GPUs these compete against
           | don't cost a lot of money.
        
             | Dalewyn wrote:
             | An RTX 4060 (with a pitiful 8GB vram) is about $300.
             | Something a bit more practical like an RTX 4070 (with 12GB
             | vram) is about $550.
             | 
             | That is bullshit expensive. It's still nowhere near as bad
             | as during the cryptomining craze, but remember a GTX 1080
             | MSRP'd for about $430 back in the old days. An RTX 4080 for
             | context is around $1,500 to $2,000 right now.
             | 
             | If I'm just looking to game and only have a reasonable
             | budget like most people, I'll just grab some AMD APU and be
             | done with it. Better yet just go and buy a console or game
             | on my phone.
        
               | BenjiWiebe wrote:
               | How do the best integrated graphics compare performance-
               | wise though? If they are way behind in performance, you
               | might be able to buy a cheap 2GB VRAM GPU for less than a
               | cutting edge integrated graphics CPU.
        
               | justsomehnguy wrote:
               | > you might be able to buy a cheap 2GB VRAM GPU
               | 
               | Lol?
               | 
                | It's another ~$100 that would not yield you any
                | performance gains.
                | 
                | The older 5700G was ~$300, so it was hardly a part of a
                | budget build, but the Ryzen 5 5500GT is _$125_ (and would
                | probably drop below $100 within half a year), would
                | clearly offer at least 30fps at FullHD[0], and you _would
                | buy it anyway because it's a CPU_.
               | 
               | https://www.tomshardware.com/reviews/amd-
               | ryzen-5-5600g-revie...
        
               | Dalewyn wrote:
               | >and you would buy it anyway because it's a CPU.
               | 
                | This is the part that needs to be emphasized more: the
                | iGPU is practically _free_ because it comes with the CPU
                | we're buying anyway.
                | 
                | It's already hard enough to compete with _free_, but on
                | top of that dGPUs are asking for Keksimus Maximus monies.
                | The value proposition for gamers, let alone most consumer
                | users, very strongly favors iGPUs.
        
               | sofixa wrote:
               | > How do the best integrated graphics compare
               | performance-wise though
               | 
                | A bunch of handhelds (Steam Deck, Asus ROG Ally) that
               | can comfortably run even AAA games with 30fps+ on small
               | screens (as an example Red Dead Redemption 2 runs with
               | 50fps on my LCD Deck) with battery and thermal
               | limitations would imply that desktop versions would be
               | totally acceptable, unless you're looking at 4K on the
               | latest big titles.
        
               | BirAdam wrote:
               | The best integrated graphics are about equivalent to a
               | 1650. This assumes that the machine also has decently
               | speedy RAM.
        
               | whatwhaaaaat wrote:
                | Your last sentence clearly shows you are not a PC gamer,
                | which is fine, but you cannot use the opinion of a person
                | who is not a PC gamer about... PC gamers.
        
               | bryanlarsen wrote:
               | Lots of PC gamers are very happy with "slightly better
               | than Steam Deck" levels of performance.
        
               | Arrath wrote:
               | Yo! I tolerate relatively poor performance out of my
               | laptop (a several year old model with a Ryzen and iGPU)
               | while having a fairly kickass home desktop[0]. So chalk
               | me in that group.
               | 
               | [0]Which currently sees me playing a whole bunch of Rule
               | the Waves 3..... poor underutilized 3080.
        
               | Dalewyn wrote:
               | Don't worry, the 3080 Ti in my Kickass Desktop(tm) also
               | spends most of its gaming time playing _Princess Connect!
               | Re:Dive_ and _Uma Musume_. :V
        
               | wiseowise wrote:
                | PC gamer as in the stereotypical Reddit "le gaming
                | master race"? Because most gamers I know don't give a
                | shit about this stuff. They care about games.
        
               | wmf wrote:
               | The 4060 is really bad though. I would guess that a
               | 12100F + A750 or 6600 XT would be faster than the 8700G
               | with its "extreme" graphics.
        
               | zokier wrote:
               | More appropriate comparison point would be something like
               | Arc A580 which retails around $180 and is already
               | dramatically faster than this igpu.
        
             | toast0 wrote:
             | If you want to buy new (which I don't think is
             | unreasonable), there's not much in the market that doesn't
             | cost a lot of money.
             | 
              | Maybe a Radeon 6500 XT or an Arc A380; GeForce 1650s are
              | still kind of a lot of money, and you're looking at a
              | GeForce 1030 if you want something for not a lot of money.
              | But why buy a GeForce 1030, unless you really just need
              | something low profile?
        
         | 0x457 wrote:
         | Well, look at the steam hardware survey, you're going to be
         | surprised.
        
           | sva_ wrote:
            | People playing on laptops and the Steam Deck? All legit uses
            | for the iGPU (I have one myself).
           | 
           | https://store.steampowered.com/hwsurvey/videocard/
        
             | dathinab wrote:
              | I think they meant people playing on really old dedicated
              | GPUs, which are likely slower than the 8700G, to the point
              | where an 8700G might be a good replacement if their setup
              | breaks or similar.
        
               | wiseowise wrote:
               | That's obviously not-legit. /s
        
         | shmerl wrote:
         | Steam Deck users would disagree about integrated graphics.
         | 
         | Besides, APUs really got good for gaming unless you push
         | resolution high.
        
           | sva_ wrote:
           | I think they were hinting at the fact that the iGPU is more
           | interesting to users of the mobile CPU (Steamdeck/Laptops),
            | since this is a desktop CPU, and desktops usually have
            | dedicated graphics.
        
         | jabroni_salad wrote:
         | I used to play WoW on a Llano APU. Plenty of gamers are
         | addicted to something like that and just want a serviceable
         | cheap machine. Pretty sure my current 3070 cost more than the
         | entire computer I was killing Deathwing with.
        
           | Arrath wrote:
           | I played healer so I could just point my camera at the floor
           | zoomed in as far as possible and suffer the least bad
           | performance possible. And scramble to get back in-game after
            | guaranteed disconnects on big events, e.g. every time Nefarian
            | spawned a wave of adds. What a struggle.
        
         | dathinab wrote:
          | The 5700G had for some time been one of the best options for
          | budget gaming.
          | 
          | The 8700G should be seen as its successor, I think.
          | 
          | Enough to play a huge number of games nicely at 1080p.
          | 
          | But in a certain way it's at the lowest end of desktop gaming.
          | 
          | Still better than the low-mid range of gaming on a laptop, tho.
          | 
          | But for a lot of people that is exactly the right balance of
          | what they get to what they pay. Whether that's because they
          | don't have a lot of money, or they don't game a lot and don't
          | want to waste money.
          | 
          | The main "problem" the 8700G might run into is, I guess, the
          | cost of AM5 motherboards and RAM.
          | 
          | Also, AMD (and I think Intel, too) has in recent years slowly
          | worked to take advantage of integrated graphics even if you
          | also have dedicated graphics (of the same vendor). If that sees
          | some more improvements, the 8700G could also become interesting
          | in some use-cases you wouldn't expect today. We will see.
        
           | NohatCoder wrote:
           | A quick check puts a Ryzen 5 5600 and a Radeon RX 6600 at
            | $360 combined, where the 8700G is set to $330. And an RX 6600
            | will deliver around double the graphics performance. So even
           | without factoring in motherboard and memory the 8700G is hard
           | to justify for a cheap gaming rig.
        
             | chx wrote:
             | Source:
             | 
             | https://www.newegg.com/amd-
             | ryzen-5-5600-ryzen-5-5000-series/... $150
             | 
             | https://www.newegg.com/p/pl?N=100007709%20601394871&Order=1
             | starts at $210 indeed.
             | 
              | People might argue the 5600 is Zen 3 whereas the 8700G is
              | Zen 4.
        
       | kube-system wrote:
       | > New AMD Ryzen(tm) 5000 Series Desktop Processors Bring More
       | Performance to Legacy Socket AM4 Platforms
       | 
       | Wait, what? New chips for the old platform? I wonder what the
       | driving factor even is to compel them to produce these?
        
         | diggan wrote:
         | > Wait, what? New chips for the old platform? I wonder what the
         | driving factor even is to compel them to produce these?
         | 
         | A lot of people are already using AM4 socket and might not want
         | to do a socket upgrade, but want a new CPU.
         | 
         | Also, I seem to remember AMD made a commitment before/at AM4
         | launch that they would support it for at least N years, but
         | don't remember the details about that. Maybe that could be
         | related?
        
           | borissk wrote:
           | Nope, AMD promised to support AM4 socket until 2020.
           | 
            | AM4 came out in 2016; supporting it with new CPUs for over 7
            | years is unheard of in the PC industry.
        
             | zamalek wrote:
             | And it earns them a lot of good will, and sales.
        
               | MPSimmons wrote:
               | I might buy one for one of my old desktops that's still
               | kicking around as a server. 16 cores isn't something to
               | shake a stick at!
        
             | FirmwareBurner wrote:
             | _> 7 years is unheard of in the PC industry_
             | 
              | LOL. Intel's LGA 775, which spanned support for CPUs from
              | the single-core Pentium 4, to the dual-core Core 2 Duo, to
              | the quad-core Core 2 Quad, would like to have a word with
              | you.
              | 
              | And 7 years back in those days was equivalent to an
              | eternity at today's pace of tech progress.
             | 
             | https://en.wikipedia.org/wiki/LGA_775
        
               | borissk wrote:
               | Kind of - early socket 775 motherboards don't support
               | later Core CPUs.
        
               | universa1 wrote:
                | Not sure if the mainboard chipsets initially released
                | with the Pentium 4 supported the later-released Core 2
                | Quads... Which is actually the case for AM4, where even
                | the cheapest, oldest chipset (A320) can support the
                | newest AM4 CPU, as long as the MB maker provides a BIOS
                | update.
        
               | FirmwareBurner wrote:
               | Sure, but the technological leaps from Pentium 4 to Core
               | 2 Duo and then to Quad that Intel made in those days over
               | 15 years ago, were massive enough to justify the
               | limitation of the same chipset from the Pentium 4 era not
               | supporting the later multi-core Core CPUs, compared to
               | the Ryzen 2-3-4 jumps that AMD made in a similar
               | timeframe which aren't as radically different to each
               | other.
        
               | Arrath wrote:
                | And since then, has an Intel socket ever supported more
                | than one full tick-tock CPU cycle?
                | 
                | Genuine question, because I've now built several desktops
                | in the Intel 'Core iX' era and have never truly had an
                | opportunity to reuse a mobo.
        
               | FirmwareBurner wrote:
               | Of course not, but I was only pointing out that 7 years
               | of socket support is not unheard of as grandparent
               | claimed.
        
         | noobface wrote:
          | Motherboard prices have increased pretty significantly. First
          | comment on this hardforum post, entitled "AMD and Intel
          | motherboard prices skyrocket past surging inflation rates
          | thanks to 35-40% ASP increases"[1]:
         | 
         | "Motherboard prices is why I haven't upgraded my CPU. More
         | important things going on right now."
         | 
         | [1]https://hardforum.com/threads/amd-and-intel-motherboard-
         | pric...
        
           | treprinum wrote:
           | AM5 mobos are more expensive than Threadripper x399 mobos
           | when they were new...
        
             | kvemkon wrote:
              | Hope AM5 is the last such obsolete 2-channel memory
              | platform. Was it 2004-2005 when we got 2-channel memory
              | with single-core or the first dual-core CPUs? Now we have
              | 16 cores with... still 2-channel memory. Looking forward to
              | a 4-channel AM6 socket.
              | 
              | Memory-bound software like OpenFOAM already saturates
              | memory bandwidth on 6-8 cores with 2-channel memory [1]. So
              | we really should have gotten 4-channel with the first
              | 16-core CPUs long ago.
             | 
             | [1] https://www.cfd-
             | online.com/Forums/hardware/198378-openfoam-b...
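The arithmetic behind the channel complaint is straightforward; a back-of-the-envelope sketch, where DDR5-6000 is an assumed typical AM5 speed and "channel" means the colloquial 64-bit channel:

```python
def peak_dram_bandwidth_gbs(channels, mt_per_s, bus_bytes=8):
    """Peak theoretical DRAM bandwidth in GB/s.

    Each (64-bit) channel moves bus_bytes per transfer, and the DDR
    rating counts millions of transfers per second.
    """
    return channels * bus_bytes * mt_per_s / 1000

dual = peak_dram_bandwidth_gbs(2, 6000)  # today's AM5: 96.0 GB/s
quad = peak_dram_bandwidth_gbs(4, 6000)  # hypothetical 4-channel AM6: 192.0 GB/s
```

Doubling the channels doubles the ceiling, which is why memory-bound codes stop scaling well past a handful of cores on a 2-channel platform.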
        
         | wtallis wrote:
         | Same chips, different binning. Zero design effort and almost
         | zero QA effort went into those new products, and it lets them
         | fine tune their margins.
        
         | dotnet00 wrote:
         | AM5 is a more expensive upgrade, not only are the motherboards
         | more expensive, they also take DDR5 and the overall performance
         | improvement is not so meaningful. So, for many people who
         | already have an AM4 PC, AM5 is not yet a justifiable upgrade.
         | 
         | There may also be some hesitance since while AMD technically
         | did support AM4 for longer than the promised duration, they
         | tried to pull out of it halfway through and overall it was kind
          | of a mess for a while in terms of compatibility across chipset
          | revisions.
        
         | phkahler wrote:
         | >> New chips for the old platform? I wonder what the driving
         | factor even is to compel them to produce these?
         | 
         | I might upgrade my 2400G, but I'd want 8 cores in 65W which is
         | sadly not an option.
        
           | toast0 wrote:
           | > I might upgrade my 2400G, but I'd want 8 cores in 65W which
           | is sadly not an option.
           | 
           | 5700G is 8 cores, with graphics, 65w tdp, released in 2021.
           | This release also includes the 5700 (no letters) with no
           | graphics at 65w tdp. Looks like 5700X is also 65w tdp? Also,
           | you can usually set a 65w power limit on a cpu with a higher
           | tdp and get most of the performance.
        
         | NohatCoder wrote:
         | They are produced on an older processing node than the AM5
         | lineup. So if AMD switched to only produce AM5 CPUs and Radeon
         | 7000 graphics chips, the old factories would lack something
         | worthwhile to produce. Therefore AMD can negotiate much lower
         | rates for using those older facilities, making it viable to
         | continue production as long as the products sell. There aren't
         | any truly new chips, it is just new binning and marketing of
         | the old models.
        
         | flumpcakes wrote:
         | AMD historically supported their sockets a lot longer than
         | Intel's two years. Recently they moved to AM5 and only had a
          | single generation on a Threadripper socket. This greatly
         | angered tech media and consumers, despite it still being better
         | than Intel support.
         | 
         | Perhaps releasing new SKUs on the "old" platform is a way to
         | bump up the numbers even more: "see we supported AM4 for 5+
         | generations!"
        
       | bloopernova wrote:
       | Having just recently built a Ryzen 7 7800X3D CPU based desktop, I
       | was worried I had bought just before a new line came out, but
       | from my reading of the press release, none of these CPUs are
       | considered a replacement for that chip. I think?
        
         | dawnerd wrote:
         | Literally did the same last month but heard rumblings about
         | what was coming and felt much better. The 7800x3d is really
         | awesome though.
        
           | slowmovintarget wrote:
           | Same with the Ryzen 9 7900X3D. I got a System76 machine, and
           | it is really good. Nvidia card for GPU / local inference.
        
             | asmor wrote:
             | The 7900X3D is by far the worst value AMD currently offers.
              | It has only 6 cores per CCX, but two of them, so not only
              | do
             | you get the scheduling issues of the 7950X3D where
             | sometimes games don't use the X3D CCX, you also only get 6
             | cores if it works. And you get a "7600X3D" (not a real
             | thing) if you disable the other CCX, something 7950X3D
             | owners would sometimes do to benchmark the scheduling
             | difference to get what's essentially a 7800X3D.
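The manual workaround for that scheduling issue on Linux is CPU pinning; a hedged sketch in which `./mygame` and the `0-5` core range are placeholders, so first check which cores actually share the large L3 on your part:

```shell
# Discover which logical CPUs share the L3 slice; on a dual-CCD X3D part
# one CCD reports the big V-Cache L3. Guarded, since the sysfs path may
# be absent in VMs/containers:
if [ -r /sys/devices/system/cpu/cpu0/cache/index3/shared_cpu_list ]; then
    cat /sys/devices/system/cpu/cpu0/cache/index3/shared_cpu_list
fi

# Pin a process to a chosen core set with taskset (util-linux).
# Placeholder example (include SMT siblings if desired, e.g. 0-5,12-17):
#   taskset -c 0-5 ./mygame
# Runnable demo: pin a trivial command to CPU 0.
taskset -c 0 echo pinned-ok
```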
        
               | theogravity wrote:
               | This is the reason why I went with the 7950X and not the
               | 3D variant. Too much additional work involved to get the
               | other cores to play well.
               | 
                | Also, I read that only one set of 6 cores has full
                | access to the cache while the other doesn't, or has only
                | partial access.
        
               | asmor wrote:
               | The cache is on a single CCX, yes. Which is why the
               | 7800X3D is so good: There is no second CCX.
               | 
               | Another big advantage is that to get them to work, they
                | need to use higher-bin chips. The CCX overheats easily
                | because the stacked cache acts like a heat shield, so you
                | get the most power-efficient ones.
        
               | kvemkon wrote:
               | The disadvantage of single CCD is only half of available
               | write memory bandwidth (using at least 1 computing thread
               | on each CCD) [1].
               | 
               | [1] 16 B/cycle write vs 32 B/cycle read per CCD
               | https://www.servethehome.com/amd-ryzen-7000-series-
               | platform-...
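Those per-CCD link widths translate into bandwidth as bytes per cycle times fabric clock; a quick sketch, assuming a round 2000 MHz FCLK for illustration:

```python
def ccd_link_bandwidth_gbs(bytes_per_cycle, fclk_mhz):
    """Per-CCD Infinity Fabric bandwidth in GB/s:
    bytes/cycle * clock (MHz) / 1000."""
    return bytes_per_cycle * fclk_mhz / 1000

read_bw = ccd_link_bandwidth_gbs(32, 2000)   # 64.0 GB/s read per CCD
write_bw = ccd_link_bandwidth_gbs(16, 2000)  # 32.0 GB/s write per CCD
```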
        
               | asmor wrote:
               | I'd wager the number of workloads that'd actually take
               | advantage of operating on entirely different data on a
               | per-thread basis is much, much lower than ones that
               | benefit from massive L3 cache.
               | 
               | And note that the infinity fabric and memory controller
               | don't run on the same clock. The fabric tops out at ~2200
               | (though testing shows sweet spot at 2033 usually) and the
               | memory controller usually tops out at ~3100 (DDR5-6200).
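The DDR5-6200 to ~3100 MHz relationship is just the transfer-rate-to-clock conversion; sketched out, with the caveat that the ratio naming (1:1 vs 1:2) varies between vendors:

```python
def uclk_mhz(ddr_rating):
    """Memory controller clock (UCLK) in MHz when running 1:1 with the
    memory clock: a DDR rating counts two transfers per clock cycle."""
    return ddr_rating / 2

mclk = uclk_mhz(6200)  # 3100.0 MHz, the controller cap quoted above
```

The fabric clock (FCLK) is decoupled from this on AM5, which is why it can sit near 2000-2200 MHz while the controller runs faster.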
        
               | kvemkon wrote:
                | Those might not necessarily be 2 threads of one program,
                | but 2 independent programs.
        
               | slowmovintarget wrote:
               | The interesting thing was that in testing it turned out
               | that having 3D cache for all cores didn't increase
               | performance. The sweet spot for heat/performance was for
               | half.
               | 
               | Granted, I don't think I'm running anything so CPU
               | intensive that I'll be able to tell. For gaming, I was
               | more interested in the DDR5 memory than in worrying about
               | the L3 cache. For everything else I do (including running
               | LLMs and compilers) it's beyond adequate.
               | 
                | For a mere $20K I could have had an overall slower gaming
                | experience but been able to locally run things like the
                | Goliath 120B model; that seemed like a poor trade-off.
        
         | asmor wrote:
          | No, the "extreme gaming" line in the press release must have
          | been chosen by someone who has no relation to AMD products and
          | was just told "it does gaming without a GPU pretty well".
          | Which is true, it does do that.
          | 
          | These are less important CPUs in this lifecycle though. Usually
          | the G series are pretty good for a small computer where you
          | don't want a dedicated tiny GPU, but all but one of the Ryzen
          | 7000 desktop CPUs already had integrated graphics (though with
          | far fewer CUs).
        
         | dragontamer wrote:
         | AMD is fleshing out the $100 to $250 market with this
         | announcement.
         | 
         | Top end $300+ chips (like 7800x3d) are safe.
         | 
         | ---------
         | 
         | I'm seriously considering the $250 5700x3d for my nieces, or
         | maybe something cheaper. They don't need the latest-and-
         | greatest PC. But having something "pretty good" (and the
         | 5700x3d will be pretty good) is surely going to be appreciated.
        
           | PartiallyTyped wrote:
            | It's a bit funny how the 5800X3D was creeping into 7xxx
            | series territory during its launch.
        
       | piinbinary wrote:
       | The press release has a strange title for new chips targeted at
       | lower-end PCs (those without a dedicated GPU).
        
         | phkahler wrote:
         | >> The press release has a strange title for new chips targeted
         | at lower-end PCs
         | 
         | It's a pretty high-end CPU though, and the graphics support AV1
         | encode/decode. This would be perfect for software development,
         | video production, or pretty much anything but high-end gaming.
        
           | piinbinary wrote:
           | True, I should have said "low-end gaming PCs"
        
           | KennyBlanken wrote:
           | By the looks of how deep they had to dig to get frame rates
           | to show off, this chip struggles with almost any 3D game
           | released in the last few years. It's equivalent to a
           | six-year-old mid-tier discrete GPU.
        
       | trynumber9 wrote:
       | > the fastest integrated desktop processor graphics in the world
       | 
       | I'm pretty sure the Mac Pro is a desktop. I'm pretty sure the M2
       | Ultra configuration includes an integrated graphics processor.
       | And I'm pretty sure it's faster than the 780M in almost every
       | workload.
       | 
       | AMD marketing team needs to be a bit less wrong. At least add a
       | "PC" caveat.
        
         | MacNCheese23 wrote:
         | Right, the title "Extreme PC Gaming" - PC - PC - PC - yes,
         | spelled out again just for you: PC - is extremely misleading.
         | Next time they should add "Not Overpriced" so that people
         | like you know their beloved hardware isn't what's meant.
        
           | trynumber9 wrote:
           | It's in certain spots, but not in the text where they make
           | that claim, nor in the footnote. And in any case it's only
           | the best by the technicality that Apple's desktop personal
           | computers aren't of IBM PC lineage.
        
           | ClassyJacket wrote:
           | You know Mac is a specific brand of PC, right?
        
         | fourteenfour wrote:
         | Yeah, this "article" is terrible. "users can expect immense
         | power and dominant performance for intensive workloads
         | including gaming and content creation."
        
       | shmerl wrote:
       | Kind of a misleading title. It's not next gen processors (you'd
       | expect Zen 5). It's still Zen 4.
        
         | brucethemoose2 wrote:
         | Very misleading.
         | 
         | As I have said before, HN needs a "misleading title" flag.
        
         | flumpcakes wrote:
         | Is it actually misleading? AMD never claims that their
         | consumer branding maps to uArch. AMD still has Zen 2 in
         | their 'modern' lineups. Intel also has previous-generation
         | uArchs in current SKUs, and let's not talk about the 10->11
         | and 12->13 gen "refreshes".
         | 
         | Maybe the technically inclined among us would expect Zen 5
         | because we keep track of CPU uArchs, but the general public
         | won't care. They will just see that it's the biggest-numbered
         | AMD chip and expect it to be AMD's best offering - which, for
         | integrated graphics on the desktop, it is.
        
       | GeekyBear wrote:
       | If you consider "Extreme PC Gaming" to be AAA games at 1080p on
       | low settings, sure.
       | 
       | https://www.anandtech.com/show/21208/amd-unveils-ryzen-8000g...
        
         | declan_roberts wrote:
         | For a desktop without an additional $200-$500 graphics card
         | this is impressive.
        
           | brucethemoose2 wrote:
           | Not really, since the APUs themselves are historically quite
           | expensive.
        
             | Moto7451 wrote:
             | That hasn't been true for the Zen-based models, at least.
             | The 5700G and now the 8700G occupy the $350-370 price slot.
             | My comparable 13th gen Intel CPU with lesser graphics was
             | the same price. That's a great value for a casual gamer.
             | 
             | I'm sure at some point we'll see Intel based APU
             | equivalents on desktop with some teeth. Intel needs to
             | solve their driver issues first.
        
               | brucethemoose2 wrote:
               | > That's a great value for a casual gamer.
               | 
               | It depends. If you're truly performance insensitive and
               | _have_ to buy new, maybe, but these APUs are extremely
               | slow compared to even older, low end discrete GPUs.
               | 
               | AM5 is an expensive platform. The total cost of a DDR4
               | platform is peanuts in comparison.
        
               | toast0 wrote:
               | The CPUs are currently significantly more expensive, yes,
               | but is the AM5 platform that much more expensive than
               | AM4?
               | 
               | On AM4, pcpartpicker says I can get an A520M board for
               | $70, and 2x16GB DDR4-3600 CL18 for $55; total $125
               | 
               | On AM5, an A620M board is $75, and 2x16GB DDR5-5600
               | starts at $76; total $151.
               | 
               | The real question is where the Microcenter bundle pricing
               | will end up, if you live near a Microcenter.
        
               | brucethemoose2 wrote:
               | Mmm, A620 has come down considerably since I last
               | checked. That's good.
        
           | CivBase wrote:
           | That price range might have been accurate 7 years ago, but
           | modern graphics cards for gaming range from $300 to $1000
           | with the highest level pushing $2000 thanks to the AI craze.
           | The higher prices make quality iGPUs all the more relevant -
           | especially since the "gamer" demographic continues growing
           | and many of the most popular games are not particularly
           | demanding.
        
             | brucethemoose2 wrote:
             | Platforms, especially AM5, are very expensive too. Ryzen
             | 8000 is far slower than even the slowest new discrete GPU.
             | 
             | The real value option is buying an older platform (AM4) and
             | a used discrete GPU, from back when prices were more sane.
        
               | snvzz wrote:
               | ... if electricity was free.
               | 
               | Power efficiency wise, a discrete GPU won't compare
               | favorably.
        
           | GeekyBear wrote:
           | A used graphics card good enough to play 1080p games on low
           | settings would not be very expensive at all.
        
             | ozarker wrote:
             | You can find used 1080ti's for like $100 nowadays. I still
             | use one for 1440p gaming and get excellent frames with most
             | games. Kinda mind blowing value
        
               | readyplayernull wrote:
               | Aren't these the remnants of the Bitcoin gold-rush age?
        
               | currymj wrote:
               | This GPU was so good at the time that it started
               | cannibalizing sales of higher end data center cards, and
               | Nvidia had to put terms in the EULA for the drivers to
               | try to legally prevent this.
        
               | thot_experiment wrote:
               | 100% 1080Ti is insane value, I have two of them and zero
               | complaints, the 11 gigs of vram make it great for
               | inference as well.
        
         | blibble wrote:
         | it's average frame rate too
         | 
         | 99%ile (or even minimum) would be far more interesting
        
           | NavinF wrote:
           | I'm pretty sure this product line targets people who can't
           | afford a new GPU and aren't aware of the used hardware
           | market. 60fps average and arbitrarily low 1%-low fps is
           | totally acceptable for this segment.
           | 
           | Recall that a lot of games ran at 30fps in the PS3 era. Back
           | then people would unironically say "the human eye can't see
           | 60fps". Even today a lot of gamers have never experienced
           | low latency.
        
             | brucethemoose2 wrote:
             | Precisely. This is for when used hardware is not an option.
        
             | blibble wrote:
             | I'd agree if AMD's title didn't explicitly say "Extreme PC
             | Gaming"
             | 
             | Extreme PC Gaming in 2024 is not 60fps at 1080p, it wasn't
             | even in 2018
        
         | KennyBlanken wrote:
         | It's six-year-old mid-tier performance.
         | 
         | AMD marketing had to dig pretty deep to find decent
         | numbers...4-5 year old games. Metro Exodus? Shadow of the Tomb
         | Raider? GTA 5? And some random kid's game nobody has ever heard
         | of?
         | 
         | Given that the choices are "buggy as hell" (Intel), "space
         | heater and slightly buggy" (AMD discrete), and "good but
         | stupidly overpriced" (Nvidia), a fourth option for the low
         | end of the market is welcome.
         | 
         | Edit: since I'm being downvoted for claiming the games listed
         | aren't relevant: go look at steamcharts. GTA5 and Dota are the
         | only top 25 games; Cyberpunk is #26. The rest aren't even top
         | 100 games.
        
           | chmod775 wrote:
           | That game selection isn't an accident. They're all extremely
           | popular and/or have a benchmark built-in.
           | 
           | Look at any third-party review and you'll find a very similar
           | selection.
        
             | KennyBlanken wrote:
             | With the exception of GTA5 and DOTA2, none of the games
             | they list are even in the top 50 on Steamcharts.
             | 
             | "and/or have a benchmark built in" doing a whole lot of
             | lifting...
        
           | drzaiusapelord wrote:
           | Kid's game? Tiny Tina/Borderlands is a huge franchise in
           | gaming. I think it's clear these two games were chosen
           | because the cel-shaded style they use is less GPU-demanding,
           | but it's still impressive. And note that games like
           | Cyberpunk 2077 are there too. These are all popular games.
           | I don't think it's the dishonest ploy you're making it out
           | to be.
           | 
           | My 2070 barely handles those games at that fps.
           | 
           | Yes those are older games because this APU is not going to
           | play modern AAA at 4k, but it can handle some pretty hefty
           | games fairly well and might be tempting to budget gamers
           | especially when mid-tier cards start at $500-600 nowadays.
        
             | KennyBlanken wrote:
             | There are 2,000 people playing Tiny Tina on Steam.
             | 
             | There are _a million_ people playing CS2. Fortnite sees
             | about 2.6 million and peaks at _eleven million_.
             | 
             | The Finals has 70,000+ people playing.
             | 
             | Tiny Tina has about 1/5th of the player count it would need
             | to be in the top 100. So yes, AMD marketing was pretty
             | fucking desperate when they listed that game.
        
               | wiseowise wrote:
               | > There are 2,000 people playing Tiny Tina on Steam.
               | 
               | The game is cross-platform and available on PS4/PS5, Xbox
               | X/S, Xbox One, Steam and Epic Store. And on top of that
               | it is a paid game. I'm not even aware what this game is,
               | but I'm aware of Borderlands franchise and they're
               | quality games.
               | 
               | Why you would compare its player numbers to those of top
               | f2p games is beyond me.
        
           | fireflash38 wrote:
           | Crysis was used in benchmarks for how long exactly? And it
           | wasn't like it was a blockbuster either...
           | 
           | Maybe you should make an actual performance argument instead
           | of a popularity contest.
        
         | drzaiusapelord wrote:
         | 63 fps on Cyberpunk 2077 - a game that at launch was
         | "unplayable but on the most powerful PCs" - is incredibly
         | impressive without a GPU.
         | 
         | This is pretty close to what my 2070 GPU does, which cost me
         | $400+ a couple of years ago and uses 215W. My CPU also uses
         | 100W, so about 300W compared to 65W for very roughly similar
         | performance (in some games) is still pretty incredible.
         | 
         | Now GPUs are almost twice that for that xx70's and xx80's
         | cards. I don't know what market this is aimed at, but this is
         | very impressive for an APU. There's a pretty strong budget PC
         | gamer community that could benefit from this. There are a lot
         | of people who can't afford gaming PCs anymore and this could be
         | a big seller to the budget community. Also, at a 65W TDP,
         | power supply, fan, and ventilation costs will be low, so they
         | can be sold with cheap, modest cases and power supplies.
         | 
         | I'm not sure if these chips translate into laptops, but a
         | laptop that games well is always desirable in the gaming
         | market.
        
           | KennyBlanken wrote:
           | > This is pretty close to my 2070 GPU does
           | 
           | Not even remotely close. It's equivalent to an RX570 or 580,
           | which is roughly 1050 territory. Your 2070 is roughly
           | equivalent to a 1080, plus raytracing.
           | 
           | > Cyberpunk when it came out [..] unplayable but on the most
           | powerful PCs is incredibly impressive without a GPU.
           | 
           | The game has seen numerous patches in the last _three years
           | since it was released_ that have _significantly_ increased
           | its performance.
        
           | vient wrote:
           | > 63 fps on Cyberpunk 2077 which when it came out was
           | "unplayable but on the most powerful PCs" is incredibly
           | impressive without a GPU.
           | 
           | Cyberpunk got a ton of (performance) fixes after release, so
           | not exactly relevant.
        
       | LeifCarrotson wrote:
       | Not an expert, so excuse me if this is obvious, but would these
       | integrated graphics be any good for NLP? A GPU with 24GB of
       | video memory costs $2000, but you can put one of these in a
       | system with 128 GB or 256 GB of DDR5 and give your neural
       | network training software over 100GB of "video memory" if you
       | want.
       | 
       | You only have 12 CUs, 768 shading units, 48 texture mapping
       | units, and 32 ROPs, but huge amounts of cheap memory. I'm not
       | sure where the bottleneck is, but at least it won't crash and
       | burn if you ask it to start a neural network training routine
       | that requires 100 GB of RAM, and you don't have to take out a
       | second mortgage for a video card with the requisite amount of
       | graphics memory.
        
         | brucethemoose2 wrote:
         | They're slow, but OK for inference.
         | 
         | In practice no one uses AMD/Intel IGPs because no one knows
         | about the mlc-llm vulkan backend. llama.cpp is in vogue on the
         | desktop, and it does not support IGPs outside of Apple;
         | otherwise people use backends targeted at server GPUs.
        
         | eightysixfour wrote:
         | They're good from the "do it at home" perspective, not from the
         | business or enterprise performance perspective.
         | 
         | One of the ways folks do this now is to use the Mac M* chips,
         | since they have so much unified memory. The raw performance
         | isn't as high as a GPU's, but they can fit substantially
         | larger models in memory.
        
         | bb88 wrote:
         | The bottleneck would most certainly be memory, as you'll
         | quickly overwhelm the on-die cache without careful
         | optimization.
         | 
         | That said, I think AMD's chiplet strategy might come into
         | play. I could see AMD releasing a 4-core/8-thread processor
         | with increased on-die cache and the other chiplets being
         | neural compute units.
        
           | brucethemoose2 wrote:
           | People keep reiterating this, but in practice one needs
           | compute _and_ bandwidth, especially outside of tiny-context
           | test prompts. On my 4900HS, mlc-llm vulkan is far faster
           | than CPU inference on the same memory bus, with less cache,
           | which wouldn't be the case if it were bandwidth/cache bound
           | (since the CPU has far more cache as well).
           | 
           | My 7800X3D has 96MB of L3 and a golden-bin DDR5 overclock,
           | but it's absolutely dreadful for inference.
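The compute-and-bandwidth point above can be sketched with a roofline-style estimate. The hardware figures here (~9 TFLOPS of usable compute over ~90 GB/s of DRAM, 2-byte weights, ~2 FLOPs per weight per token) are assumed for illustration only: single-token decode has very low arithmetic intensity and sits bandwidth-bound, while batched prefill reuses each weight across many tokens and crosses into compute-bound territory.

```python
# Roofline sketch: a kernel is bandwidth-bound when its arithmetic
# intensity (FLOPs per byte of memory traffic) is below the hardware
# ridge point (peak FLOPs / peak bandwidth). Illustrative numbers.

def arithmetic_intensity(tokens: int, bytes_per_weight: float = 2.0) -> float:
    """~2 FLOPs per weight per token; weights are read once per batch."""
    return 2.0 * tokens / bytes_per_weight

RIDGE = 9e12 / 90e9  # assumed ~9 TFLOPS over ~90 GB/s -> 100 FLOP/B

for batch in (1, 32, 256):  # decode, small prefill, large prefill
    ai = arithmetic_intensity(batch)
    bound = "bandwidth-bound" if ai < RIDGE else "compute-bound"
    print(f"batch={batch:3d}: {ai:6.1f} FLOP/B -> {bound}")
```

Under these assumptions, batch-1 decode lands at ~1 FLOP/B, far below the ridge point, which is consistent with the observation that more cache alone doesn't help inference.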
        
       | throwup238 wrote:
       | _> "AMD continues to lead the AI hardware revolution by offering
       | the broadest portfolio of processors with dedicated AI engines in
       | the x86 market," said Jack Huynh, senior vice president and
       | general manager, Computing and Graphics Group at AMD._
       | 
       | I can't believe anyone actually said this with a straight face.
        
         | jasongill wrote:
         | It's not an inaccurate statement... I think that Intel's AI
         | engine (NPU) is only available in one very-recently-released
         | mobile processor, correct? AMD has had their dedicated on-chip
         | AI for a year or more on multiple lines of CPUs.
        
       | flumpcakes wrote:
       | This looks perfect to me. I have a big desktop, but I also want
       | a tiny NUC-like PC with decent integrated graphics for playing
       | with Linux and trying gaming on Linux - without having to have
       | a full-sized PC. I also don't want to bother dual booting, or
       | running hypervisors with passthroughs.
        
       ___________________________________________________________________
       (page generated 2024-01-11 23:00 UTC)