Post AxKgoAcvmdwfERJ39k by danil@mastodon.gamedev.place
Post #AxKgo9E4zYTKt4fgmW by shivoa@mastodon.gamedev.place
2025-08-19T11:42:56Z
0 likes, 0 repeats
AMD Strix Halo being RDNA3.5 is such a shame. 40 CUs? I'm thinking about RX 9060 XT cards in comparison to this SoC design (if only they'd used the latest graphics IP for it).
Post #AxKgoAcvmdwfERJ39k by danil@mastodon.gamedev.place
2025-08-19T14:02:50Z
0 likes, 0 repeats
@shivoa
> if only they'd used the latest graphics IP for it
Why? You won't pay twice then. Maximizing profit is the target of AMD hardware:
- no hardware video encoding till 2020
- no PhysX, no hardware BVH (RDNA2/3)
- no compute till 2020
- no proper Blender support (HIP, 2022+)
- no multiple hardware video encoding threads (2025, still none; the cheapest RTX GPU can do at least 4 video encoding threads)
...
Post #AxKgoC4GQVP3hV6OOm by shivoa@mastodon.gamedev.place
2025-08-19T14:43:43Z
0 likes, 0 repeats
@danil I mean, given that they've finally realised ML is important and sells hardware (because it lets them offer more with less), it seems a shame that FSR4 will never be an option for these systems. Given how huge an iGPU block this SoC has, it would have made sense to use the same arch as their main cards rather than the arch they're using for iGPUs at a small fraction of the size.
Post #AxKgoD02xfckai8XSq by danil@mastodon.gamedev.place
2025-08-19T15:00:55Z
0 likes, 0 repeats
@shivoa
> that ML is important and sells hardware
First time?
- 2008-2020: Nvidia had hardware video encoding
- 2015+: video streaming; every communication app uses video encoding
- even Intel CPUs got video encoding integrated
- 2020: AMD integrated video encoding into consumer GPUs
Took them long enough to realize? And now: "ML is important". ML has been publicly important for at least a decade already, similar to "video encoding", and they realized it "just now".
Post #AxKgoDyfKI75ciUwwy by ignaloidas@not.acu.lt
2025-08-19T16:10:29.665Z
0 likes, 0 repeats
@danil@mastodon.gamedev.place @shivoa@mastodon.gamedev.place That's just how long the HW product cycles take: stuff that's coming out now started design five years ago, with the blocks decided four years ago. The AMD graphics team doesn't seem to be looking into the future.
Post #AxKibn9TRBierHkkcK by danil@mastodon.gamedev.place
2025-08-19T16:26:28Z
0 likes, 0 repeats
@ignaloidas @shivoa I see it as: "they can no longer go without it because 90% of the market uses it; that is the only reason it's integrated for free."
AMD has had video encoding accelerators since 2010, but as a "separate GPU". In the 2010s, when "clouds" became popular, AMD was literally selling a "graphics accelerator" and a "video encoding accelerator" as separate pieces of hardware.
Post #AxKibo7NqRdpr5mazw by ignaloidas@not.acu.lt
2025-08-19T16:30:46.004Z
0 likes, 0 repeats
@danil@mastodon.gamedev.place @shivoa@mastodon.gamedev.place I don't recall video encoding accelerators from AMD; I know Xilinx did them, maybe you're confusing the two? Though again, it's one thing to have the video encoding IP, and another to decide to include it in the design of the GPU: sure, you may have the IP for video-serving accelerators, but if you only then see a need for it in consumer GPUs, it's going to be five years before it gets included. AMD has historically been quite shit at predicting and/or driving the need for GPU technologies, while Nvidia has been very good at it.