[HN Gopher] Nvidia won, we all lost
       ___________________________________________________________________
        
       Nvidia won, we all lost
        
       Author : todsacerdoti
       Score  : 864 points
       Date   : 2025-07-04 21:58 UTC (1 days ago)
        
 (HTM) web link (blog.sebin-nyshkim.net)
 (TXT) w3m dump (blog.sebin-nyshkim.net)
        
       | d00mB0t wrote:
       | Sounds about right :D
        
       | leakycap wrote:
       | This article goes much deeper than I expected, and is a nice
       | recap of the last few years of "green" gpu drama.
       | 
       | Liars or not, the performance has not been there for me in any of
       | my usecases, from personal to professional.
       | 
       | A system from 2017/2018 with an 8700K and an 8GB 2080 performs so
       | closely to the top end, expensive systems today that it makes
       | almost no sense to upgrade at MSRP+markup unless your system is
       | older than this.
       | 
       | Unless you need specific features only on more recent cards,
       | there are very few use cases I can think of needing more than a
       | 30 series card right now.
        
         | pixl97 wrote:
         | I mean, most people probably won't directly upgrade. Their old
         | card will die, or eventually nvidia will stop making drivers
         | for it. Unless you're looking around for used cards, the price
         | difference between something low end like a 3060 isn't that
         | much less in price for the length of support you're going to
         | get.
         | 
         | Unless nvidia's money printing machine breaks soon, expect the
         | same to continue for the next 3+ years. Crappy expensive cards
         | with a premium on memory with almost no actual video rendering
         | performance increase.
        
           | leakycap wrote:
           | > Unless you're looking around for used cards, the price
           | difference between something low end like a 3060 isn't that
           | much less in price for the length of support you're going to
           | get.
           | 
           | This does not somehow give purchasers more budget room now,
           | but they can buy 30-series cards in spades and not have to
           | worry about the same heating and power deliveries as a little
           | bonus.
        
         | theshackleford wrote:
         | > A system from 2017/2018 with an 8700K and an 8GB 2080
         | performs so closely to the top end, expensive systems today
         | 
         | This is in no way true and is quite an absurd claim. Unless you
         | meant for some specific isolated purposed restricted purely to
         | yourself and _your_ performance needs.
         | 
         | > there are very few use cases I can think of needing more than
         | a 30 series card right now.
         | 
         | How about I like high refresh and high resolutions? I'll throw
         | in VR to boot. Which are my real use cases. I use a high
         | refresh 4K display and VR, both have benefited hugely from my
         | 2080Ti > 4090 Shift.
        
           | Der_Einzige wrote:
           | I have this exact CPU sans a 3090 (I started with 2080 but
           | upgraded due to local AI needs). 8700k is perfectly fine for
           | todays workloads. CPUs have stagnated and also the amount of
           | RAM in systems has too (Apple still macbook air defaults of 8
           | GB in 2025??????)
        
             | theshackleford wrote:
             | It wasn't "workloads" being talked about, it was gaming
             | performance, the one area in which there is an absolutely
             | huge difference mainly on the GPU side. We are taking a
             | difference of close too if not 100%.
             | 
             | And despite CPUs stagnating it's absolutely still possible
             | to be held back on a stronger GPU with an older CPU
             | especially in areas such as 1% lows, stuttering etc.
        
       | ryao wrote:
       | > The RTX 50 series are the second generation of NVIDIA cards to
       | use the 12VHPWR connector.
       | 
       | This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30
       | series was the first to use 12VHPR. The 40 series was the second
       | to use 12VHPWR and first to use 12V-2x6. The 50 series was the
       | second to use 12V-2x6. The female connectors are what changed in
       | 12V-2x6. The male connectors are identical between 12V-2x6 and
       | 12VHPWR.
        
         | ohdeargodno wrote:
         | Nitpicking it doesn't change the fact that the 12v2x6 connector
         | _also_ burns down.
        
           | ryao wrote:
           | The guy accuses Nvidia of not doing anything about that
           | problem, but ignored that they did with the 12V-2x6
           | connector, which as far as I can tell, has had far fewer
           | issues.
        
             | Gracana wrote:
             | It still has no fusing, sensing, or load balancing for the
             | individual wires. It is a fire waiting to happen.
        
               | ryao wrote:
               | It is a connector. None of the connectors inside a PC
               | have those. They could add them to the circuitry on the
               | PCB side of the connector, but that is entirely separate
               | from the connector.
               | 
               | That said, the industry seems to be moving to adding
               | detection into the PSU, given seasonic's announcement:
               | 
               | https://www.tomshardware.com/pc-components/power-
               | supplies/se...
               | 
               | Finally, I think there is a simpler solution, which is to
               | change the cable to use two large gauge wires instead of
               | 12 individual ones to carry current. That would eliminate
               | the need for balancing the wires in the first place.
        
               | Gracana wrote:
               | Previous well-designed video cards used the technologies
               | I described. Eliminating the sense circuits and fusing is
               | a recent development.
               | 
               | I do like the idea of just using big wires. It'd be so
               | much cleaner and simpler. Also using 24 or 48V would be
               | nice, but that'd be an even bigger departure from current
               | designs.
        
               | ryao wrote:
               | > Previous well-designed video cards used the
               | technologies I described. Eliminating the sense circuits
               | and fusing is a recent development.
               | 
               | My point is that the PCB is where such features would be
               | present, not the connector. There are connectors that
               | have fusing. The UK's AC power plugs are examples of
               | them. The connectors inside PCs are not.
        
               | Gracana wrote:
               | Oh, sure, I'm not proposing that the connector itself
               | should have those features, rather that it shouldn't be
               | used without them present on the device.
        
             | MindSpunk wrote:
             | The 50 series connectors burned up too. The issue was not
             | fixed.
        
               | ryao wrote:
               | It seems incredibly wrong to assume that there was only 1
               | issue with 12WHPWR. 12V-2x6 was an improvement that
               | eliminated some potential issues, not all of them. If you
               | want to eliminate all of them, replace the 12 current
               | carrying wires with 2 large gauge wires. Then the wires
               | cannot become unbalanced. Of course, the connector would
               | need to split the two into 12 very short wires to be
               | compatible, but those would be recombined on the GPU's
               | PCB into a single wire.
        
           | numpad0 wrote:
           | (context: 12VHPWR and 12V-2x6 are the exact same thing. The
           | latter is supposed to be improved and totally fixed, complete
           | with the underspecced load-bearing "supposed to be" clause.)
        
             | AzN1337c0d3r wrote:
             | They are not the exact same thing.
             | 
             | https://www.corsair.com/us/en/explorer/diy-builder/power-
             | sup...
        
       | bigyabai wrote:
       | > Pretty much all upscalers force TAA for anti-aliasing and it
       | makes the entire image on the screen look blurry as fuck the
       | lower the resolution is.
       | 
       | I feel like this is a misunderstanding, though I admit I'm
       | splitting hairs here. DLSS _is_ a form of TAA, and so is FSR and
       | most other modern upscalers. You generally don 't need an extra
       | antialiasing pipeline if you're getting an artificially
       | supersampled image.
       | 
       | We've seen this technique variably developed across the lifespan
       | of realtime raster graphics; first with checkerboard rendering,
       | then TAA, then now DLSS/frame generation. It has upsides and
       | downsides, and some TAA implementations were actually really good
       | for the time.
        
         | kbolino wrote:
         | Every kind of TAA that I've seen creates artifacts around fast-
         | moving objects. This may sound like a niche problem only found
         | in fast-twitch games but it's cropped up in turn-based RPGs and
         | factory/city builders. I personally turn it off as soon as I
         | notice it. Unfortunately, some games have removed traditional
         | MSAA as an option, and some are even making it difficult to
         | turn off AA when TAA and FXAA are the only options (though you
         | can usually override these restrictions with driver settings).
        
           | ohdeargodno wrote:
           | It's not that it's difficult to turn off TAA: it's that so
           | many modern techniques do not work without temporal
           | accumulation and anti-aliasing.
           | 
           | Ray tracing? Temporal accumulation and denoising. Irradiance
           | cache? Temporal accumulation and denoising. most modern light
           | rendering techniques cannot be done in time in a single
           | frame. Add to that the fact that deferred or hybrid rendering
           | makes implementing MSAA be anywhere between "miserable" and
           | "impossible", and you have the situation we're in today.
        
             | kbolino wrote:
             | A lot of this is going to come down to taste so _de
             | gustibus_ and all that, but this feels like building on a
             | foundation of sand. If the artifacts can be removed (or at
             | least mitigated), then by all means let 's keep going with
             | cool new stuff as long as it doesn't detract from other
             | aspects of a game. But if they can't be fixed, then either
             | these techniques ought to be relegated to special uses
             | (like cutscenes or the background, kinda like the pre-
             | rendered backdrops of FF7) or abandoned/rethought as pretty
             | but impractical.
        
               | ohdeargodno wrote:
               | So, there is a way to make it so that TAA and various
               | temporal techniques look basically flawless. They need a
               | _lot_ of information and pixels.
               | 
               | You need a 4k rendering resolution, at least. Modern
               | effects look stunning at that res.
               | 
               | Unfortunately, nothing runs well at 4k with all the
               | effects on.
        
           | user____name wrote:
           | The sad truth is that with rasterization every renderer needs
           | to be designed around a specific set of antialiasing
           | solutions. Antialiasing is like a big wall in your rendering
           | pipeline, there's the stuff you can do before resolving and
           | the stuff you can do afterwards. The problem with MSAA is
           | that it is pretty much tightly coupled with all your
           | architectural rendering decisions. To that end, TAA is simply
           | the easiest to implement and it kills a lot of proverbial
           | birds with one stone. And it can all be implemented as
           | essentially a post processing effect, it has much less of the
           | tight coupling.
           | 
           | MSAA only helps with geometric edges, shader aliasing can be
           | combatted with prefiltering but even then it's difficult to
           | get rid of it completely. MSAA also needs beefy multisample
           | intermediate buffers, this makes it pretty much a non-starter
           | on heavily deferred rendering pipelines, which throw away
           | coverage information to fit their framebuffer budget. On top
           | of that the industry moved to stochastic effects for
           | rendering all kinds of things that were too expensive before,
           | the latest being actual realtime path tracing. I know people
           | moan about TAA and DLSS but to do realtime path tracing at 4k
           | is sort of nuts really. I still consider it a bit of a
           | miracle we can do it at all.
           | 
           | Personally, I wish there was more research by big players
           | into things like texture space lighting, which makes shading
           | aliasing mostly go away, plays nice with alpha blending and
           | would make MSAA viable again. The issue there is with shading
           | only the stuff you see and not wasting texels.
        
             | kbolino wrote:
             | There's another path, which is to raise the pixel densities
             | so high we don't need AA (as much) anymore, but I'm going
             | to guess it's a) even more expensive and b) not going to
             | fix all the problems anyway.
        
               | MindSpunk wrote:
               | That's just called super sampling. Render at 4k+ and down
               | sample to your target display. It's as expensive as it
               | sounds.
        
               | kbolino wrote:
               | No, I mean high pixel densities all the way to the
               | display.
               | 
               | SSAA is an even older technique than MSAA but the results
               | are not visually the same as just having a really high-
               | DPI screen with no AA.
        
       | cherioo wrote:
       | High end GPU has over the last 5 years slowly turning from an
       | enthusiast product into a luxury product.
       | 
       | 5 or maybe 10 years ago, high-end GPU are needed to run games at
       | reasonably eye candy setting. In 2025, $500 mid-range GPUs are
       | more than enough. Folks all over can barely tell between High and
       | Ultra settings, DLSS vs FSR, or DLSS FG and Lossless Scaling.
       | There's just no point to compete at $500 price point any more,
       | that Nvidia has largely given up and relegating to the AMD-built
       | Consoles, and integrated graphics like AMD APU, that offer good
       | value in low-end, medium-end, and high-end.
       | 
       | Maybe the rumored Nvidia PC, or the Switch 2, can bring some
       | resurgence.
        
         | ohdeargodno wrote:
         | Not quite $500, but at $650, the 9070 is an absolute monster
         | that outperforms Nvidia's equivalent cards in everything but
         | ray tracing (which you can only turn on with full DLSS framegen
         | and get a blobby mess anyways)
         | 
         | AMD is truly making excellent cards, and with a bit of luck
         | UDNA is even better. But they're in the same situation as
         | Nvidia: they could sell 200 GPUs, ship drivers, maintain them,
         | deal with returns and make $100k... Or just sell a single
         | MI300X to a trusted partner that won't make any waves and still
         | make $100k.
         | 
         | Wafer availability unfortunately rules all, and as it stands,
         | we're lucky neither of them have abandoned their gaming
         | segments for massively profitable AI things.
        
           | enraged_camel wrote:
           | I have a 2080 that I'm considering upgrading but not sure
           | which 50 series would be the right choice.
        
             | thway15269037 wrote:
             | Grab a used/refurb 3090 then. Probably as legendary card as
             | a 1080Ti.
        
               | k12sosse wrote:
               | Just pray that it's a 3090 under that lid when you buy it
               | second hand
        
             | magicalhippo wrote:
             | I went from a 2080 Ti to a 5070 Ti. Yes it's faster, but
             | for the games I play, not dramatically so. Certainly not
             | what I'm used to doing such a generational leap. The 5070
             | Ti _is_ noticeably faster at local LLMs, and has a bit more
             | memory which is nice.
             | 
             | I went with the 5070 Ti since the 5080 didn't seem like a
             | real step up, and the 5090 was just too expensive and
             | wasn't in stock for ages.
             | 
             | If I had a bit more patience, I would have waited till the
             | next node refresh, or for the 5090. I don't think any of
             | the other current 50-series cards are worth besides the
             | 5090 it if you're coming from a 2080. And by worth it I
             | mean will give you a big boost in performance.
        
             | Rapzid wrote:
             | I went from a 3070 to 5070 Ti and it's fantastic. Just
             | finished Cyberpunk Max'd out at 4k with DLSS balanced, 2x
             | frame gen, and reflex 2. Amazing experience.
        
           | cosmic_cheese wrote:
           | Some models of 9070 use the well-proven old style PCI-E power
           | connectors too, which is nice. As far as I'm aware none of
           | the current AIB midrange or high end Nvidia cards do this.
        
             | Henchman21 wrote:
             | As I understand it, for the 50-series nvidia _requires_ the
             | 12VHPWR connector
        
         | dukeyukey wrote:
         | I bought a new machine with an RTX 3060 Ti back in 2020 and
         | it's still going strong, no reason to replace it.
        
           | rf15 wrote:
           | same, 2080 Super here, I even do AI with it
        
         | gxs wrote:
         | I think this is the even broader trend here
         | 
         | In their never ending quest to find ways to suck more money out
         | of people, one natural extension is to just turn the thing into
         | a luxury good and that alone seems to justify the markup
         | 
         | This is why new home construction is expensive - the layout of
         | a home doesn't change much but it's trivial to throw on some
         | fancy fixtures and slap the deluxe label on the listing.
         | 
         | Or take a Toyota, slap some leather seats on it, call it a
         | Lexus and mark up the price 40% (I get that these days there
         | are more meaningful differences but the point stands)
         | 
         | This and turning everything into subscriptions alone are
         | responsible for 90% of the issues I have as a consumer
         | 
         | Graphics cards seem to be headed in this direction as well -
         | breaking through that last ceiling for maximum fps is going to
         | be like buying a bentley (if it isn't already) where as before
         | it was just opting for the v8
        
           | bigyabai wrote:
           | Nvidia's been doing this for a while now, since at least the
           | Titan cards and technically the SLI/Crossfire craze too. If
           | you sell it, egregiously-compensated tech nerds will show up
           | with a smile and a wallet large enough to put a down-payment
           | on two of them.
           | 
           | I suppose you could also blame the software side, for
           | adopting compute-intensive ray tracing features or getting
           | lazy with upscaling. But PC gaming has always been a luxury
           | market, at least since "can it run Crysis/DOOM" was a
           | refrain. The homogeneity of a console lineup hasn't ever
           | really existed on PC.
        
         | Tadpole9181 wrote:
         | Just going to focus on this one:
         | 
         | > DLSS vs FSR, or DLSS FG and Lossless Scaling.
         | 
         | I've used all of these (at 4K, 120hz, set to "balanced") since
         | they came out, and I just don't understand how people say this.
         | 
         | FSR is a vaseline-like mess to me, it has its own _distinct_
         | blurriness. Not as bad as naive upscaling, and I 'll use it if
         | no DLSS is available and the game doesn't run well, but it's
         | distracting.
         | 
         | Lossless is borderline unusable. I don't remember the
         | algorithm's name, but it has a blur similar to FSR. It cannot
         | handle text or UI elements without artifacting (because it's
         | not integrated in the engine, those don't get rendered at
         | native resolution). The frame generation causes almost
         | everything to have a ghost or afterimage - UI elements and the
         | reticle included. It can also _reduce_ your framerate because
         | it 's not as optimized. On top of that, the way the program
         | works interferes with HDR pipelines. It is a last resort.
         | 
         | DLSS (3) is, by a large margin, the best offering. It just
         | works and I can't notice any cons. Older versions _did_ have
         | ghosting, but it 's been fixed. And I can retroactively fix
         | older games by just swapping the DLL (there's a tool for this
         | on GitHub, actually). I have not tried DLSS 4.
        
           | paulbgd wrote:
           | I've used fsr 4 and dlss 4, I'd say fsr 4 is a bit ahead of
           | dlss 3 but behind dlss 4. No more vaseline smear
        
           | cherioo wrote:
           | Maybe I over exaggerated, but I was dumbfounded myself
           | reading people's reaction to Lossless Scaling
           | https://www.reddit.com/r/LinusTechTips/s/wlaoHl6GAS
           | 
           | Most people either can't tell the difference, don't care
           | about the difference, or both. Similar discourse can be found
           | about FSR, frame drop, and frame stutter. I have conceded
           | that most people do not care.
        
         | piperswe wrote:
         | 10 years ago, $650 would buy you a top-of-the-line gaming GPU
         | (GeForce GTX 980 Ti). Nowadays, $650 might get you a mid-range
         | RX 9070 XT if you miraculously find one near MSRP.
        
           | ksec wrote:
           | That is $880 dollars in today's term. And 2015 Apple was
           | already shipping a 16nm SoC. The GeForce GTX 980 Ti was still
           | on 28nm. Two generation Node behind.
        
           | conception wrote:
           | Keeping with inflation (650 to 880) it'd get you a 5070TI.
        
             | orphea wrote:
             | 5070TI
             | 
             | Which, performance-wise, is a 60TI class card.
        
           | wasabi991011 wrote:
           | $650 of 2015 USD is around $875 of 2025 USD fwiw
        
         | datagram wrote:
         | The fact that we're calling $500 GPUs "midrange" is proof that
         | Nvidia's strategy is working.
        
           | WithinReason wrote:
           | What strategy? They charge more because manufacturing costs
           | are higher, cost per transistor haven't changed much since
           | 28nm [0] but chips have more and more transistors. What do
           | you think that does to the price?
           | 
           | [0]: https://www.semiconductor-digest.com/moores-law-indeed-
           | stopp...
        
             | NooneAtAll3 wrote:
             | strategy of marketting expensive product as normal one?
             | obviously?
             | 
             | if your product can't be cheap - your product is luxury,
             | not a day-to-day one
        
               | WithinReason wrote:
               | It's mid range. The range shifted.
        
           | blueboo wrote:
           | I think my TNT2 Ultra was $200. But Nvidia had dozens of
           | competitors back then. 89 when it was founded! Now: AMD...
        
         | luisgvv wrote:
         | Absolutely right, only AAA games get to showcase the true power
         | of GPUs.
         | 
         | For cheaper guys like me, I'll just give my son indie and low
         | graphic games which he enjoys
        
       | ionwake wrote:
       | I don't want to jump on nvidia but I found it super weird when
       | they clearly remote controlled a Disney bot onto the stage and
       | claimed it was all using real time AI which was clearly
       | impossible due to no latency and weirdly the bot verifying
       | correct stage position in relation to the presenter. It was
       | obviously the Disney bot just being controlled by someone off
       | stage.
       | 
       | I found it super alarming because why would they fake something
       | on stage to the extent of just lying.i know Steve jobs had backup
       | phones but jsut claiming a robot is autonomous when it isn't I
       | just feel it was scammy.
       | 
       | It reminded me of when Tesla had remote controlled Optimus bots.
       | I mean I think that's awesome like super cool but clearly the
       | users thought the robots were autonomous during that dinner
       | party.
       | 
       | I have no idea why I seem to be the only person bothered by
       | "stage lies" to this level. Tbh even the Tesla bots weren't
       | claimed to be autonomous so actually I should never have
       | mentioned them but it explains the "not real" vibe.
       | 
       | Not meaning to disparage just explaining my perception as a
       | European maybe it's just me though!
       | 
       | EDIT > Im kinda suprised by the weak arguments in the replies, I
       | love both companies, I am just offering POSITIVE feedback, that
       | its important ( in my eyes ) to be careful not to pretend in
       | certain specific ways or it makes the viewer question the
       | foundation ( which we all know is SOLID and good ).
       | 
       | EDIT 2 >There actually is a good rebuttal in the replies,
       | although apparently I have "reading comprehension skill
       | deficiencies" its just my pov that they were insinuating the
       | robot was aware of its surroundings, which is fair enough.
        
         | elil17 wrote:
         | As I understand it the Disney bots do actually use AI in a
         | novel way: https://la.disneyresearch.com/publication/design-
         | and-control...
         | 
         | So there's at least a bit more "there" there than the Tesla
         | bots.
        
           | ionwake wrote:
           | I believe its RL trained only.
           | 
           | See this snipet : "Operator Commands Are Merged: The control
           | system blends expressive animation commands (e.g., wave, look
           | left) with balance-maintaining RL motions"
           | 
           | I will print a full retraction if someone can confirm my gut
           | feeling is correct
        
             | dwattttt wrote:
             | Having worked on control systems a long time ago, that's a
             | 'nothing' statement: the whole job of the control system is
             | to keep the robot stable/ambulating, regardless of whatever
             | disturbances occur. It's meant to reject the forces induced
             | due to waving exactly as much as bumping into something
             | unexpected.
             | 
             | It's easier to stabilise from an operator initiated wave,
             | really; it knows it's happening before it does the wave,
             | and would have a model of the forces it'll induce.
        
               | ionwake wrote:
               | I tried to understand the point of your reply but Im not
               | sure what your point was - I only seemed to glean "its
               | easier to balance if the operator is moving it".
               | 
               | Please elaborate unless Im being thick.
               | 
               | EDIT > I upvoted your comment in any case as Im sure its
               | helping
        
               | rcxdude wrote:
               | 'control system' in this case is not implying remote
               | control, it's referring to the feedback system that
               | adjust the actuators in response to the sensed
               | information. If the motion is controlled automatically,
               | then the control loop can in principle anticipate the
               | motion in a way that it could not if it was remote
               | controlled: i.e. the opposite, it's easier to control the
               | motions (in terms of maintaining balance and avoiding
               | overstressing the actuators) if the operator is not live
               | puppeteering it.
        
               | dwattttt wrote:
               | Apologies, yes, "control system" is somewhat niche
               | jargon. "Balance system" is probably more appropriate.
        
               | dboreham wrote:
               | Well "control system" is a proper term understood by
               | anyone with a decent STEM education since 150 years ago.
        
               | tekla wrote:
               | > "control system" is somewhat niche jargon
               | 
               | Oh my god. What the hell is happening to STEM education?
               | Control systems engineering is standard parlance. This is
               | what Com Sci people are like?
        
               | ionwake wrote:
               | Thank you for the explanation
        
               | dwattttt wrote:
               | It's that there's nothing special about blending
               | "operator initiated animation commands" with the RL
               | balancing system. The balance system has to balance
               | anyway; if there was no connection between an operator's
               | wave command and balance, it would have exactly the same
               | job to do.
               | 
               | At best the advantage of connecting those systems is that
               | the operator command can inform the balance system, but
               | there's nothing novel about that.
        
             | elil17 wrote:
             | Only as opposed to what? VLAM/something else more trendy?
        
             | numpad0 wrote:
             | "RL is not AI" "Disney bots were remote controlled" are
             | major AI hypebro delulu moment lol
             | 
             | Your understanding of AI and robotics are more cucumber
             | than pear shaped. You're making very little technical sense
             | here. Challenges and progress in robotics aren't where you
             | think they are. It's all propagandish contents you're
             | basing your understandings on.
             | 
             | If you're getting information from TikTok or YouTube Shorts
             | style content, especially around Tesla bros - get the hell
             | out of it at Ludicrous Speed. Or consume way more of it so
             | thoroughly that you cannot be deceived anymore despite
             | blatant lies everywhere. Then come back. They're all plain
             | wrong and it's not good for you.
        
         | CoastalCoder wrote:
         | Not just you.
         | 
         | I hate being lied to, especially if it's so the liar can reap
         | some economic advantage from having the lie believed.
        
           | AnimalMuppet wrote:
           | Yeah. I have a general rule that I don't do business with
           | people who lie to me.
        
             | MichaelZuo wrote:
             | I can't even imagine what kind of person would not follow
             | that rule.
             | 
             | Do business with people that are known liars? And just get
             | repeatedly deceived?
             | 
             | ...Though upon reflection that would explain why the
             | depression rate is so high.
        
         | frollogaston wrote:
         | There's also a very thick coat of hype in
         | https://www.nvidia.com/en-us/glossary/ai-factory/ and related
         | material, even though the underlying product (an ML training
         | cluster) is real.
        
         | hn_throwaway_99 wrote:
         | > I don't want to jump on nvidia but I found it super weird
         | when they clearly remote controlled a Disney bot onto the stage
         | and claimed it was all using real time AI which was clearly
         | impossible due to no latency and weirdly the bot verifying
         | correct stage position in relation to the presenter. It was
         | obviously the Disney bot just being controlled by someone off
         | stage.
         | 
         | I don't know what you're referring to, but I'd just say that I
         | don't believe what you are describing could have possibly
         | happened.
         | 
         | Nvidia is a huge corporation, with more than a few lawyers on
         | staff and on retainer, and what you are describing is criminal
         | fraud that any plaintiff's lawyer would have a field day with.
         | So, given that, and since I don't think people who work at
         | Nvidia are complete idiots, I think whatever you are describing
         | didn't happen the way you are describing it. Now, it's
         | certainly possible there was some small print disclaimer, or
         | there was some "weasel wording" that described something with
         | ambiguity, but when you accuse someone of criminal fraud you
         | want to have more than "hey this is just my opinion" to back it
         | up.
        
           | numpad0 wrote:
           | They're soaked eyebrows deep in Tiktok style hype juice,
           | believing that latest breakthrough in robotics is that AGIs
           | just casually started walking and talking on their own and
           | therefore anything code controlled by now is considered proof
           | of ineptitude and fake.
           | 
           | It's complete cult crazy talk. Not even cargocult, it's
           | proper cultism.
        
           | kalleboo wrote:
           | Tefal literally sells a rice cooker that boasts "AI Smart
           | Cooking Technology" while not even containing a
           | microcontroller and just being controlled by the time-honored
           | technology of "a magnet that gets hot". They also have
           | lawyers.
           | 
           | AI doesn't mean anything. You can claim anything uses "AI"
           | and just define what that means yourself. They could have
           | some basic anti-collision technology and claim it's "AI".
        
           | moogly wrote:
           | > what you are describing is criminal fraud that any
           | plaintiff's lawyer would have a field day with
           | 
           | "Corporate puffery"
        
         | ionwake wrote:
         | Not sure why my comment got so upvoted, all my comments are my
         | personal opinion based solely on the publicly streamed video,
         | and as I said, I'll happily correct or retract my impression.
        
       | yunyu wrote:
       | If you are a gamer, you are no longer NVIDIA's most important
       | customer.
        
         | bigyabai wrote:
         | A revelation on-par with Mac users waking up to learn their
         | computer was made by a phone company.
        
           | ravetcofx wrote:
           | Barely even a phone company, more like a app store and
           | microtransactions services company
        
         | dcchambers wrote:
         | Haven't been for a while. Not since crypto bros started buying
         | up GPUs for coin mining.
        
         | theshackleford wrote:
         | Yes but why should I care provided the product they have
         | already sold me continues to work? How does this materially
         | change my life because Nvidia doesnt want to go steady with me
         | anymore?
        
         | Rapzid wrote:
         | Sounds like an opening for AMD then. But as long as NVidia has
         | the best tech I'll keep buying it when it's time to upgrade.
        
       | Nextgrid wrote:
       | I wonder if the 12VHPWR connector is intentionally defective to
       | prevent large-scale use of those consumer cards in
       | server/datacenter contexts?
       | 
       | The failure rate is just _barely_ acceptable in a consumer use-
       | case with a single card, but with multiple cards the probability
       | of failure (which takes down the whole machine, as there 's no
       | way to hot-swap the card) makes it unusable.
       | 
       | I can't otherwise see why they'd persevere on that stupid
       | connector when better alternatives exist.
        
         | mjevans wrote:
         | Sunk cost fallacy and a burning (literal) desire to have small
         | artistic things. That's probably also the reason the connector
         | was densified so much, and clearly, released with so VERY
         | little tolerance for error human and otherwise.
        
         | KerrAvon wrote:
         | IANAL, but knowingly leaving a serious defect in your product
         | at scale for that purpose would be very bad behavior and juries
         | tend not like that sort of thing.
        
           | thimabi wrote:
           | However, as we've learned from the Epic vs Apple case,
           | corporations don't really care about bad behavior -- as long
           | as their ulterior motives don't get caught.
        
         | transcriptase wrote:
         | It boggles my mind that an army of the most talented electrical
         | engineers on earth somehow fumble a power connector and then
         | don't catch it before shipping.
        
         | ls612 wrote:
         | They use the 12VHPWR on some datacenter cards too.
        
       | monster_truck wrote:
       | Remember when nvidia got caught dropping 2 bits of color
       | information to beat ati in benchmarks? I still can't believe
       | anyone has trusted them since! That is an insane thing to do
       | considering the purpose of the product.
       | 
       | For as long as they have competition, I will support those
       | companies instead. If they all fail, I guess I will start one. My
       | spite for them knows no limits
        
         | 827a wrote:
         | People need to start asking more questions about why the RTX 50
         | series (Blackwell) has almost no performance uplift over the
         | RTX 40 series (Ada/Hopper), and also conveniently its
         | impossible to find B200s.
        
       | alganet wrote:
       | Right now, all silicon talk is bullshit. It has been for a while.
       | 
       | It became obvious when old e-waste Xeons were turned into viable,
       | usable machines, years ago.
       | 
       | Something is obviously wrong with this entire industry, and I
       | cannot wait for it to pop. THIS will be the excitement everyone
       | is looking for.
        
         | gizajob wrote:
         | Do you have a timeframe for the pop? I need some excitement.
        
           | alganet wrote:
           | More a sequence of potential events than a timeframe.
           | 
           | High-end GPUs are already useless for gaming (a low-end GPU
           | is enough), their traditional source of demand. They're
           | floating on artificial demand for a while now.
           | 
           | There are two markets that currently could use them: LLMs and
           | Augmented Reality. Both of these are currently useless, and
           | getting more useless by the day.
           | 
           | CPUs are just piggybacking on all of this.
           | 
           | So, lots of things hanging on unrealized promises. It will
           | pop when there is no next use for super high-end GPUs.
           | 
           | War is a potential user of such devices, and I predict it
           | could be the next thing after LLMs and AR. But then if war
           | breaks out in such a scale to drive silicon prices up, lots
           | of things are going to pop, and food and fuel will boom to
           | such a magnitude that will make silicon look silly.
           | 
           | I think it will pop before it comes to the point of war
           | driving it, and it will happen within our lifetimes (so, not
           | a Nostradamus-style prediction that will only be realized
           | long-after I'm dead).
        
             | rightbyte wrote:
             | I don't see how GPU factories could be running in the event
             | of war "in such a scale to drive silicon prices up". Unless
             | you mean that supply will be low and people scavanging TI
             | calculators for processors to make boxes playing Tetris and
             | Space Invaders.
        
               | alganet wrote:
               | Why not?
               | 
               | This is the exact model in which WWII operated. Car and
               | plane supply chains were practically nationalized to
               | support the military industry.
               | 
               | If drones, surveillance, satellites become the main war
               | tech, they'll all use silicon, and things will be fully
               | nationalized.
               | 
               | We already have all sorts of hints of this. Doesn't need
               | a genius to predict that it could be what happens to
               | these industries.
               | 
               | The balance with food and fuel is more delicate though. A
               | war with drones, satellites and surveillance is not like
               | WWII, there's a commercial aspect to it. If you put it on
               | paper, food and fuel project more power and thus, can
               | move more money. Any public crisis can make people forget
               | about GPUs and jeopardize the process of nationalization
               | that is currently being implemented, which still depends
               | on relatively peaceful international trade.
        
               | newsclues wrote:
               | CPU and GPU compute will be needed for military use
               | processing the vast data from all sorts of sensors. Think
               | about data centres crunching satellite imagery for
               | trenches, fortifications and vehicles.
        
               | alganet wrote:
               | > satellite imagery for trenches, fortifications and
               | vehicles
               | 
               | Dude, you're describing the 80s. We're in 2025.
               | 
               | GPUs will be used for automated surveillance, espionage,
               | brainwashing and market manipulation. At least that's
               | what the current batch of technologies implies.
               | 
               | The only thing stopping this from becoming a full
               | dystopia is that delicate balance with food and fuel I
               | mentioned earlier.
               | 
               | It has become pretty obvious that entire wealthy nations
               | can starve if they make the wrong move. Turns out GPUs
               | cannot produce calories, and there's a limit to how much
               | of a market you can manipulate to produce calories for
               | you.
        
               | rightbyte wrote:
               | > Why not?
               | 
               | Bombs that fly between continents or are launched from
               | submarines for any "big scale" war.
        
               | alganet wrote:
               | I don't see how this is connected to what you said
               | before.
        
               | rightbyte wrote:
               | My point is that GPU factories are big static targets
               | with sensitive supply chains and thus have no strategic
               | importance in being so easy to distrupt.
        
               | alganet wrote:
               | So are airplane and car factories. I already explained
               | all of this, what keeps the supply chain together, and
               | what their strategic value is.
        
               | rightbyte wrote:
               | I have no clue if we agree with eachother or not?
        
             | selfhoster11 wrote:
             | Local LLMs are becoming more popular and easier to run, and
             | Chinese corporations are releasing extremely good models of
             | all sizes under MIT or similar terms in many cases. There
             | amount of VRAM is the main limiter, and it would help with
             | gaming too.
        
               | alganet wrote:
               | Gaming needs no additional VRAM.
               | 
               | From a market perspective, LLMs sell GPUs. Doesn't even
               | matter if they work or not.
               | 
               | From the geopolitical tensions perspective, they're the
               | perfect excuse to create infrastructure for a global
               | analogue of the Great Firewall (something that the
               | Chinese are pioneers of, and catching up to the plan).
               | 
               | From the software engineering perspective, LLMs are a
               | nuissance, a distraction. They harm everyone.
        
               | selfhoster11 wrote:
               | > Gaming needs no additional VRAM.
               | 
               | Really? What about textures? Any ML that the new wave of
               | games might use? For instance, while current LLMs
               | powering NPC interactions would be pretty horrible, what
               | about in 2 years time? You could have arbitrary dialogue
               | trees AND dynamically voiced NPCs or PCs. This is
               | categorically impossible without more VRAM.
               | 
               | > the perfect excuse to create infrastructure for a
               | global analogue of the Great Firewall
               | 
               | Yes, let's have more censorship and kill the dream of the
               | Internet even deader than it already is.
               | 
               | > From the software engineering perspective, LLMs are a
               | nuissance, a distraction. They harm everyone.
               | 
               | You should be aware that reasonable minds can differ in
               | this issue. I won't defend companies forcing the use of
               | LLMs (it would be like forcing use of vim or any other
               | tech you dislike), but I disagree about being a nuisance,
               | distraction, or a universal harm. It's all down to
               | choices and fit for use case.
        
               | alganet wrote:
               | How is any of that related to actual silicon sales
               | strategies?
               | 
               | Do not mistake adjacent topics for the main thing I'm
               | discussing. It only proves my point that right now, all
               | silicon talk is bullshit.
        
           | grg0 wrote:
           | Hell, yeah. I'm in for some shared excitement too if y'all
           | want to get some popcorn.
        
         | bigyabai wrote:
         | A lot of those Xeon e-waste machines were downright awful,
         | especially for the "cheap gaming PC" niche they were popular
         | in. Low single-core clock speeds, low memory bandwidth for
         | desktop-style configurations and super expensive motherboards
         | that ran at a higher wattage than the consumer alternatives.
         | 
         | > THIS will be the excitement everyone is looking for.
         | 
         | Or TSMC could become geopolitically jeopardized somehow,
         | drastically increasing the secondhand value of modern GPUs even
         | beyond what they're priced at now. It's all a system of
         | scarcity, things could go either way.
        
           | alganet wrote:
           | They were awful compared to newer models, but for the price
           | of _nothing_ , pretty good deal.
           | 
           | If no good use is found for high-end GPUs, secondhand models
           | will be like AOL CDs.
        
             | bigyabai wrote:
             | Sure, eventually. Then in 2032, you can enjoy the raster
             | performance that slightly-affluent people in 2025 had for
             | years.
             | 
             | By your logic people should be snatching up the 900 and
             | 1000-series cards by the truckload if the demand was so
             | huge. But a GTX 980 is like $60 these days, and honestly
             | not very competitive in many departments. Neither it nor
             | the 1000-series have driver support nowadays, so most users
             | will reach for a more recent card.
        
               | alganet wrote:
               | There's no zero-cost e-waste like that anymore, it was a
               | once-time thing.
               | 
               | Also, it's not "a logic", it's not a cosumer
               | recomendation. It was a fluke in the industry that to me,
               | represents a symptom.
        
       | system2 wrote:
       | Why does the hero image of this website says "Made with GIMP"?
       | I've never seen a web banner saying "Made with Photoshop" or
       | anything similar.
        
         | reddalo wrote:
         | I don't know why it says that, but GIMP is an open-source
         | project so it makes sense for fans to advertise it.
        
         | goalieca wrote:
         | Were you on the internet in the 90s? Lots of banners like that
         | on every site.
        
       | ls-a wrote:
       | Finally someone
        
       | honeybadger1 wrote:
       | A bit hyperbolic
        
       | dofubej wrote:
       | > With over 90% of the PC market running on NVIDIA tech, they're
       | the clear winner of the GPU race. The losers are every single one
       | of us.
       | 
       | Of course the fact that we overwhelmingly chose the better option
       | means that... we are worse off or something?
        
         | atq2119 wrote:
         | That bit does seem a bit whiney. AMD's latest offerings are
         | quite good, certainly better value for money. Why not buy that?
         | The only shame is that they don't sell anything as massive as
         | Nvidia's high end.
        
         | ohdeargodno wrote:
         | Choosing the vendor locked in, standards hating brand does tend
         | to mean that you inevitably get screwed when they decide do
         | massively inflate their prices and there's nothing you can do
         | about it does tend to make you worse off, yes.
         | 
         | Not that AMD was anywhere near being in a good state 10 years
         | ago. Nvidia still fucked you over.
        
         | johnklos wrote:
         | Many of you chose Windows, so, well, yes.
        
       | delduca wrote:
       | Nothing new, it is just Enshittification
        
       | porphyra wrote:
       | The article complains about issues with consumer GPUs but those
       | are nowadays relegated to being merely a side hobby project of
       | Nvidia, whose core business is enterprise AI chips. Anyway Nvidia
       | still has no significant competition from AMD on either front so
       | they are still getting away with this.
       | 
       | Deceptive marketing aside, it's true that it's sad that we can't
       | get 4K 60 Hz with ray tracing with current hardware without some
       | kind of AI denoising and upscaling, but ray tracing is really
       | just _profoundly_ hard so I can't really blame anyone for not
       | having figured out how to put it in a consumer pc yet. There's a
       | reason why pixar movies need huge render farms that take lots of
       | time per frame. We would probably sooner get gaussian splatting
       | and real time diffusion models in games than nice full resolution
       | ray tracing tbh.
        
         | Jabrov wrote:
         | I get ray tracing at 4K 60Hz with my 4090 just fine
        
           | trynumber9 wrote:
           | Really? I can't even play Minecraft (DXR: ON) at 4K 60Hz on a
           | RTX 5090...
           | 
           | Maybe another regression in Blackwell.
        
           | marcellus23 wrote:
           | What game? And with no upscaling or anything?
        
       | neuroelectron wrote:
       | Seems a bit calculated and agreed across the industry. What can
       | really make sense of Microsoft's acquisitions and ruining of
       | billion dollar IPs? It's a manufactured collapse of the gaming
       | industry. They want to centralize control of the market and make
       | it a service based (rent seeking) sector.
       | 
       | I'm not saying they all got together and decided this together
       | but their wonks are probably all saying the same thing. The
       | market is shrinking and whether it's by design or incompetence,
       | this creates a new opportunity to acquire it wholesale for
       | pennies on the dollar and build a wall around it and charge for
       | entry. It's a natural result of games requiring NVidia developers
       | for driver tuning, bitcoin/ai and buying out capacity to prevent
       | competitors.
       | 
       | The wildcard I can't fit into this puzzle is Valve. They have a
       | huge opportunity here but they also might be convinced that they
       | have already saturated the market and will read the writing on
       | the wall.
        
         | layoric wrote:
         | Valve is a private company so doesn't have the same growth at
         | all costs incentives. To Microsoft, the share price is
         | everything.
        
         | keyringlight wrote:
         | As much as they've got large resources, I'm not sure what
         | projects they could reasonably throw a mountain of money at and
         | expect to change things, and presumably benefit from in the
         | future instead of doing it to be a a force of chaos in the
         | industry. Valve's efforts all seem to orbit around the store,
         | that's their main business and everything else seems like a
         | loss-leader to get you buying through it even if it comes
         | across as a pet project of a group of employees.
         | 
         | The striking one for me is their linux efforts, at least as far
         | as I'm aware they don't do a lot that isn't tied to the steam
         | deck (or similar devices) or running games available on steam
         | through linux. Even the deck APU is derived from the semi-
         | custom work AMD did for the consoles, they're benefiting from a
         | second later harvest that MS/Sony have invested (hundreds of
         | millions?) in many years earlier. I suppose a lot of it comes
         | down to what Valve needs to support their customers
         | (developers/publishers), they don't see the point in pioneering
         | and establishing some new branch of tech with developers.
        
         | bob1029 wrote:
         | I think the reason you see things like Blizzard killing off
         | Overwatch 1 is because the Lindy effect applies in gaming as
         | well. Some things are so sticky and preferred that you have to
         | commit atrocities to remove them from use.
         | 
         | From a supply/demand perspective, if all of your customers are
         | still getting high on the 5 (or 20) year old supply, launching
         | a new title in the same space isn't going to work. There are
         | not an infinite # of gamers and the global dopamine budget is
         | limited.
         | 
         | Launching a game like TF2 or Starcraft 2 in 2025 would be
         | viewed as a business catastrophe by the metrics most AAA
         | studios are currently operating under. Monthly ARPU for gamers
         | _years_ after purchasing the Orange Box was approximately
         | $0.00. Giving gamers access to that strong of a drug would ruin
         | the demand for other products.
        
           | a_wild_dandan wrote:
           | I purchased "approximately $0.00" in TF2 loot boxes. How much
           | exactly? Left as an exercise to the reader.
        
             | refulgentis wrote:
             | This is too clever for me, I think - 0?
        
               | simonh wrote:
               | Approximately. +/- 0
        
             | bigyabai wrote:
             | People forget that TF2 was originally 20 dollars before
             | hitting the F2P market.
        
               | ThrowawayTestr wrote:
               | I paid full price for the orange box
        
             | KeplerBoy wrote:
             | When were microtransactions added to TF2? Probably years
             | after the initial launch, and they worked so well the game
             | became f2p.
        
           | aledalgrande wrote:
           | Petition related to companies like Blizzard killing games:
           | https://eci.ec.europa.eu/045/public/#/screen/home
        
           | sidewndr46 wrote:
           | From a business perspective, launching a game like Starcraft
           | 2 at any time is a business catastrophe. There are obscure
           | microtransactions in other Blizzard titles that have
           | generated more revenue than Starcraft 2.
        
             | bob1029 wrote:
             | If SC2 was such a failure at any time, why bother with 3
             | expansions?
             | 
             | I think the biggest factors involve willingness to operate
             | with substantially smaller margins and org charts.
             | 
             | It genuinely seemed like "Is this fun?" was actually a
             | bigger priority than profit prior to the Activision merger.
        
               | fireflash38 wrote:
               | I like games companies that create games for fun and
               | story, rather than just pure profit.
        
             | rollcat wrote:
             | There's plenty of business opportunity in any genre; you
             | can make a shit-ton of money by simply making the game good
             | and building community goodwill.
             | 
             | The strategy is simple: 1. there's always plenty of people
             | who are ready to spend way more money in a game than you
             | and I would consider sane - just let them spend it but 2.
             | make it easy to gift in-game items to other players. You
             | don't even need to keep adding that much content - the
             | "whales" are always happy to keep giving away to new
             | players all the time.
             | 
             | Assuming you've built up that goodwill, this is all you
             | need to keep the cash flowing. But that's non-exploitative,
             | so you'll be missing that extra 1%. /shrug
        
           | rollcat wrote:
           | > Launching a game like [...] Starcraft 2
           | 
           | They can't even _keep the lights on_ for SC2.
           | 
           | We [the community] have been designing our own balance
           | patches for the past five years; and our own ladder maps
           | since +/- day 1 - all Blizzard was to do since 2020 was to
           | press the "deploy" button, and they f-ed it up several times
           | anyway.
           | 
           | The news of the year so far is that someone has been
           | exploiting a remote hole to upload some seriously disturbing
           | stuff to the arcade (custom maps/mods) section. So of course
           | rather than fixing the hole, Blizzard has cut off uploads.
           | 
           | So we can't test the balance changes.
           | 
           | Three weeks left until EWC, a __$700.000__ tournament, by the
           | way.
           | 
           | Theoretically SC2 could become like Brood War, with balance
           | changes happening purely through map design. Except we can't
           | upload maps either.
        
         | kbolino wrote:
         | The video game industry has been through cycles like this
         | before. One of them (the 1983 crash) was so bad it killed most
         | American companies and caused the momentum to shift to Japan
         | for a generation. Another one I can recall is the "death" of
         | the RTS (real-time strategy) genre around 2010. They have all
         | followed a fairly similar pattern and in none of them that I
         | know of have things played out as the companies involved
         | thought or hoped they would.
        
           | georgeecollins wrote:
           | I worked in the video game industry from the 90s through to
           | today. I think you are over generalizing or missing the
           | original point. It's true that there have been boom and
           | busts. But there are also structural changes. Do you remember
           | CD-ROMs? Steam and the iPhone were structural changes.
           | 
           | What Microsoft is trying to do with Gamepass is a structural
           | change. It may not work out the way that they plan but the
           | truth is that sometimes these things do change the nature of
           | the games you play.
        
             | kbolino wrote:
             | But the thing is that Steam didn't _cause_ the death of
             | physical media. I absolutely do remember PC gaming before
             | Steam, and between the era when it was awesome (StarCraft,
             | Age of Empires, Unreal Tournament, Tribes, etc.) and the
             | modern Steam-powered renaissance, there was an absolutely
             | dismal era of disappointment and decline. Store shelves
             | were getting filled with trash like  "40 games on one CD!"
             | and each new console generation gave retailers an excuse to
             | shrink shelf space for PC games. Yet during this time, all
             | of Valve's games were still available on discs!
             | 
             | I think Microsoft's strategy is going to come to the same
             | result as Embracer Group. They've bought up lots of studios
             | and they control a whole platform (by which I mean Xbox,
             | not PC) but this doesn't give them that much power. Gaming
             | does evolve and it often evolves to work around attempts
             | like this, rather than in favor of them.
        
               | georgeecollins wrote:
               | I am not saying that about Steam. In fact Steam pretty
               | much saved triple A PC gaming. Your timeline is quite
               | accurate!
               | 
               | >> Microsoft's strategy is going to come to the same
               | result as Embracer Group.
               | 
               | I hope you are right.
               | 
               | If I were trying to make a larger point, I guess it would
               | be that big tech companies (Apple, MSFT, Amazon) don't
               | want content creators to be too important in the
               | ecosystem and tend to support initiatives that emphasize
               | the platform.
        
               | ethbr1 wrote:
               | > _big tech companies (Apple, MSFT, Amazon) don 't want
               | content creators to be too important in the ecosystem_
               | 
               | 100%. The platforms' ability to monetize in their factor
               | is directly proportional to their relative power vs the
               | most powerful creatives.
               | 
               | Thus, in order to keep more money, they make strategic
               | moves that disempower creatives.
        
             | IgorPartola wrote:
             | Not in the game industry but as a consumer this is very
             | true. One example: ubiquitous access to transactions and
             | payment systems gave a huge rise to loot boxes.
             | 
             | Also mobile games that got priced at $0.99 meant that only
             | the unicorn level games could actually make decent money so
             | In-App Purchases were born.
             | 
             | But also I suspect it is just a problem where as consumers
             | we spend a certain amount of money on certain kinds of
             | entertainment and if as a content producer you can catch
             | enough people's attention you can get a slice of that pie.
             | We saw this with streaming services where an average
             | household spent about $100/month on cable so Netflix, Hulu,
             | et al all decided to price themselves such that they could
             | be a portion of that pie (and would have loved to be the
             | whole pie but ironically studios not willing to license
             | everything to everyone is what prevented that).
        
           | the__alchemist wrote:
           | Thankfully, RTS is healthy again! (To your point about
           | cycles)
        
             | needcaffeine wrote:
             | What RTS games are you playing now, please?
        
               | sgarland wrote:
               | AoE2, baby. Still going strong, decades after launch.
        
               | KeplerBoy wrote:
               | And AoE4, one of the few high profile RTS games of the
               | past years, is dead.
        
               | the__alchemist wrote:
               | That was disappointing to see. I thought it was a great
               | game, with some mechanics improved over 2, and missing
               | some of the glitchy behavior that became cannon (e.g.
               | foot archer kiting) The community (nor my friends) didn't
               | seem to go for it, primarily for the reason that it's not
               | AoE2. Exquisite sound design too.
        
               | evelant wrote:
               | Sins of a solar empire 2. AI War 2. There haven't been
               | any really "big" ones like StarCraft but some very good
               | smaller ones like those two.
        
               | somat wrote:
               | BAR
               | 
               | https://www.beyondallreason.info/
               | 
               | But... While bar is good, very good. It is also very hard
               | to compete with, so I see it sort of killing any funding
               | for good commercial RTS's for the next few years.
        
               | rollcat wrote:
               | It's non-competitive (I'm burnt out with SC2 ladder a
               | bit), but I've been enjoying Cataclismo, Settlers 3 (THAT
               | is a throwback), and I'm eyeing They are Billions.
               | 
               | Some SC2 youtubers are now covering Mechabellum, Tempest
               | Rising, BAR, AoE4, and some in-dev titles: Battle Aces,
               | Immortal: Gates of Pyre, Zerospace, and of course
               | Stormgate.
               | 
               | These are all on my list but I'm busy enough playing
               | Warframe ^^'
        
         | MangoToupe wrote:
         | > It's a manufactured collapse of the gaming industry. They
         | want to centralize control of the market and make it a service
         | based (rent seeking) sector.
         | 
         | It also won't work, and Microsoft has developed no way to
         | compete on actual value. As much as I hate the acquisitions
         | they've made, even if Microsoft as a whole were to croak
         | tomorrow I think the game industry would be fine.
        
           | ehnto wrote:
           | New stars would arise, others suggesting the games industry
           | would collapse and go away is like saying the music industry
           | collapsing would stop people from making music.
           | 
           | Yes games can be expensive to make, but they don't have to
           | be, and millions will still want new games to play. It is
           | actually a pretty low bar for entry to bring an indie game to
           | market (relative to other ventures). A triple A studio
           | collapse would probably be an amazing thing for gamers, lots
           | of new and unique indie titles. Just not great for profit for
           | big companies, a problem I am not concerned with.
        
         | beefnugs wrote:
         | This post is crazy nonsense: Bad games companies have always
         | existed, and the solution is easy: dont buy their trash. I buy
         | mostly smaller indie games these days just fine.
         | 
         | nvidia isn't purposely killing anything, they are just
         | following the pivot into the AI nonsense. They have no choice,
         | if they are in a unique position to make 10x by a pivot they
         | will, even if it might be a dumpsterfire of a house of cards.
         | Its immoral to just abandon the industry that created you, but
         | companies have always been immoral.
         | 
         | Valve has an opportunity to what? Take over video card hardware
         | market? No. AMD and Intel are already competitors in the market
         | and cant get any foothold (until hopefully now consumers will
         | have no choice but to shift to them)
        
         | proc0 wrote:
         | I've always played a few games for many hours as opposed to
         | many games for one playthrough. Subscription just does not make
         | sense for me, and I suspect that's a big part of the market.
         | Add to this the fact that you have no control over it and then
         | top it off with potential ads and I will quit gaming before
         | switching to subs only. Luckily there is still GoG and Steam
         | doesn't seem like it will change but who knows.
        
         | pointlessone wrote:
         | If it's manufactured it implies intent. Someone at Microsoft is
         | doing it on purpose and, presumably, thinks it'll benefit them.
         | I'm not sure how this can be seen as a win for them. They
         | invested a massive amount of money into buying all those game
         | studios. They also admitted Xbox hardware is basically dead. So
         | the only way they can any return on that investment is third
         | party hardware: either PlayStation or PC. If I were to choose
         | it would be pc for MS. They already have game pass and windows
         | is the gaming OS. By giving business to Sony they would
         | undermine those.
         | 
         | I don't think nVidia wants gaming collapse either. They might
         | not prioritize it now but they definitely know that it will
         | persist in some form. They bet on AI (and crypto before it)
         | because those are lucrative opportunities but there's no
         | guarantee they will last. So they squeeze as much as they can
         | out of those while they can. They definitely want gaming as a
         | backup. It might be not as profitable and more finicky as it's
         | a consumer market but it's much more stable in the long run.
        
       | jekwoooooe wrote:
       | This guy makes some good points but he clearly has a bone to
       | pick. Calling dlss snake oil was where I stopped reading
        
         | kevingadd wrote:
         | The article doesn't make the best argument to support the claim
         | but it's true that NVIDIA is now making claims like '4090 level
         | performance' on the basis that if you turn on DLSS multi-frame
         | generation you suddenly have Huge Framerates when most of the
         | pixels are synthesized instead of real.
         | 
         | Personally I'm happy with DLSS on balanced or quality, but the
         | artifacts from framegen are really distracting. So I feel like
         | it's fair to call their modern marketing snake oil since it's
         | so reliant on frame gen to create the illusion of real
         | progress.
        
         | Retr0id wrote:
         | Yeah, computer graphics has always been "software trickery" all
         | the way down. There are valid points to be made about DLSS
         | being marketed in misleading ways, but I don't think it being
         | "software trickery" is a problem at all.
        
           | ThatPlayer wrote:
           | Exactly. Running games at a lower resolution isn't new. I
           | remember changing the size of the viewport in the original
           | DOOM 1993 to get it to run faster. Making a lower resolution
           | look better without having to run at a higher resolution is
           | the exact same problem anti-aliasing has been tackling
           | forever. DLSS is just another form of AA that is now so
           | advanced, you can go from an even lower resolution and still
           | look good.
           | 
           | So even when I'm running a game at native resolution, I still
           | want anti-aliasing, and DLSS is a great choice then.
        
             | imiric wrote:
             | It's one thing to rely on a technique like AA to improve
             | visual quality with negligible drawbacks. DLSS is entirely
             | different though, since upscaling introduces all kinds of
             | graphical issues, and frame generation[1] even more so,
             | while adding considerable input latency. NVIDIA will claim
             | that this is offset by its Reflex feature, but that has its
             | own set of issues.
             | 
             | So, sure, we can say that all of this is ultimately
             | software trickery, but when the trickery is dialed up to 11
             | and the marketing revolves entirely on it, while the raw
             | performance is only slightly improved over previous
             | generations, it's a clear sign that consumers are being
             | duped.
             | 
             | [1]: I'm also opposed to frame generation from a
             | philosophical standpoint. I want my experience to be as
             | close as possible to what the game creator intended. That
             | is, I want every frame to be generated by the game engine;
             | every object to look as it should within the world, and so
             | on. I don't want my graphics card to create an experience
             | that approximates what the creator intended.
             | 
             | This is akin to reading a book on an e-reader that replaces
             | every other word with one chosen by an algorithm. I want
             | none of that.
        
               | ThatPlayer wrote:
               | I don't disagree about frame-gen, but upscaling and its
               | artifacts are not new nor unique to DLSS. Even later PS3
               | games upscaled from 720p to 1080p.
        
             | sixothree wrote:
             | But we're not talking about resolution here. We're talking
             | about interpolation of entire frames, multiple frames.
        
               | ThatPlayer wrote:
               | I don't think we are? Article talks about DLSS on RTX 20
               | series cards, which do not support DLSS frame-gen:
               | 
               | > What always rubbed me the wrong way about how DLSS was
               | marketed is that it wasn't only for the less powerful
               | GPUs in NVIDIA's line-up. No, it was marketed for the top
               | of the line $1,000+ RTX 20 series flagship models to
               | achieve the graphical fidelity with all the bells and
               | whistles.
        
       | __turbobrew__ wrote:
       | > With over 90% of the PC market running on NVIDIA tech, they're
       | the clear winner of the GPU race. The losers are every single one
       | of us.
       | 
       | I have been rocking AMD GPU ever since the drivers were
       | upstreamed into the linux kernel. No regrets.
       | 
       | I have also realized that there is a lot out there in the world
       | besides video games, and getting all in a huff about it isn't
       | worth my time or energy. But consumer gotta consoooooom and then
       | cry and outrage when they are exploited instead of just walking
       | away and doing something else.
       | 
       | Same with magic the gathering, the game went to shit and so many
       | people got outraged and in a big huff but they still spend
       | thousands on the hobby. I just stopped playing mtg.
        
         | frollogaston wrote:
         | Also playing PC video games doesn't even require a Nvidia GPU.
         | It does sorta require Windows. I don't want to use that, so
         | guess I lost the ability to waste tons of time playing boring
         | games, oh no.
        
           | rightbyte wrote:
           | Steam's Wine thing works quite well. And yes you need to
           | fiddle and do work arounds including giving up getting some
           | games to work.
        
             | frollogaston wrote:
             | Yeah, but it's not worth. Apparently the "gold" list on
             | ProtonDB is games that _allegedly_ work _with tweaks_. So
             | like, drop in this random DLL and it might fix the game. I
             | 'm not gonna spend time on that.
             | 
             | Last one I ever tried was
             | https://www.protondb.com/app/813780 with comments like
             | "works perfectly, except multiplayer is completely broken"
             | and the workaround has changed 3 times so far, also it lags
             | no matter what. Gave up after stealing 4 different DLLs
             | from Windows. It doesn't even have anticheat, it's just
             | cause of some obscure math library.
        
               | surgical_fire wrote:
               | > Yeah, but it's not worth. Apparently the "gold" list on
               | ProtonDB is games that allegedly work with tweaks. So
               | like, drop in this random DLL and it might fix the game.
               | I'm not gonna spend time on that.
               | 
               | I literally never had to do that. Most tweaking I needed
               | to do was switching proton versions here and there (which
               | is trivial to do).
        
               | webstrand wrote:
               | I've been running opensuse+steam and I never had to tweak
               | a dll to get a game running. Albeit that I don't exactly
               | chase the latest AAA, the new releases that I have tried
               | have worked well.
               | 
               | Age of empires 2 used to work well, without needing any
               | babying, so I'm not sure why it didn't for you. I will
               | see about spinning it up.
        
               | imtringued wrote:
               | You're not supposed to "steal DLLs".
               | 
               | You're supposed to find a proton fork like "glorious
               | eggroll" that has patches specifically for your game.
        
             | cosmic_cheese wrote:
             | Yeah Proton covers a lot of titles. It's mainly games that
             | use the most draconian forms of anticheat that don't work.
        
             | y-curious wrote:
             | It's Linux, what software _doesn 't_ need fiddling to work?
        
               | msgodel wrote:
               | Other than maybe iOS what OSes in general don't need
               | fiddling these days to be usable?
        
           | snackbroken wrote:
           | Out of the 11 games I've bought through Steam this year, I've
           | had to refund one (1) because it wouldn't run under Proton,
           | two (2) had minor graphical glitches that didn't meaningfully
           | affect my enjoyment of them, and two (2) had native Linux
           | builds. Proton has gotten good enough that I've switched from
           | spending time researching if I can play a game to just
           | assuming that I can. Presumably ymmv depending on your taste
           | in games of course, but I'm not interested in competitive
           | multiplayer games with invasive anticheat which appears to be
           | the biggest remaining pain point.
           | 
           | My experience with running non-game windows-only programs has
           | been similar over the past ~5 years. It really is finally the
           | Year of the Linux Desktop, only few people seem to have
           | noticed.
        
             | mystified5016 wrote:
             | The only games in my library _at all_ that don 't work on
             | linux are indie games from the early 2000s, and I'm
             | comfortable blaming the games themselves in this case.
             | 
             | I also don't play any games that require a rootkit, so..
        
               | globalnode wrote:
               | good move, thats why i treat my windows install as a dumb
               | game box, they can steal whatever data they want from
               | that i dont care. i do my real work on linux, as far away
               | from windows as i can possibly get.
        
               | theshackleford wrote:
               | Same way I treat my windows machine, but also the reason
               | I wont be swapping it to linux any time soon. I use
               | different operating systems for different purposes for a
               | reason. It's great for fompartmentalization.
               | 
               | When I am in front of windows, I know I can permit myself
               | to relax, breath easy and let off some steam. When I am
               | not, I know I am there to learn/earn a living/produce
               | something etc. Most probably do not need this, but my
               | brain does, or I would never switch off.
        
               | duckmysick wrote:
               | What works for me is having different
               | Activities/Workspaces in KDE - they have different
               | wallpapers, pinned programs in the taskbar, the programs
               | themselves launch only in a specific Activity. I hear
               | others also use completely different user accounts.
        
             | proc0 wrote:
             | My hesitation is around high end settings, can Proton run
             | 240hz on 1440p and high settings? I'm switching anyway soon
             | and might just have a separate machine for gaming but I'd
             | rather it be Linux. SteamOS looks promising if they release
             | for PC.
        
               | onli wrote:
               | Proton has often better performance than gaming under
               | Windows - partly because Linux is faster - so sure it can
               | run those settings.
        
               | proc0 wrote:
               | Interesting, thanks.
        
               | onli wrote:
               | :) To give a source, https://www.computerbase.de/artikel/
               | betriebssysteme/welche-l... is one. There was a more
               | recent article the search is not showing me now.
        
             | PoshBreeze wrote:
             | It depends on the games you play and what you are doing. It
             | is a mixed bag IME. If you are installing a game that is
             | several years old it will work wonderfully. Most guides
             | assume you have Arch Linux or are using one of the "gaming"
             | distros like Bazzite. I use Debian (I am running
             | Testing/Trixie RC on my main PC).
             | 
             | I play a lot of HellDivers 2. Despite what a lot of Linux
             | YouTubers say. It doesn't work very well on Linux. The
             | recommendations I got from people was to change distro. I
             | do other stuff on Linux. Game slows down when you need it
             | to be running smoothly doesn't matter what
             | resolution/settings you set.
             | 
             | Anything with anti-cheat probably won't work very well if
             | at all.
             | 
             | I also wanted to play the old Command and Conquer games.
             | Getting the fan made patchers (not the games itself) to run
             | properly that fix a bunch of bugs that EA/Westwood never
             | fixed and mod support is more difficult than I cared to
             | bother with.
        
               | esseph wrote:
               | Fedora 42, Helldivers 2
               | 
               | Make sure to change your Steam launch options to:
               | 
               | PULSE_LATENCY_MSEC=84 gamemoderun %command%
               | 
               | This will use gamemode to run it, give it priority, put
               | the system in performance power mode, and will fix any
               | pulse audio static you may be having. You can do this for
               | any game you launch with steam, any shortcut, etc.
               | 
               | It's missing probably 15fps on this card between windows
               | and Linux, and since it's above 100fps I really don't
               | even notice.
               | 
               | It does seem to run a bit better under gnome with
               | Variable Refresh Rate than KDE.
        
               | PoshBreeze wrote:
               | I will be honest, I just gave up. I couldn't get
               | _consistent_ performance on HellDivers 2. Many of the
               | things you have mentioned I 've tried and found they
               | don't make much of a difference or made things worse.
               | 
               | I did get it running nice for about a day and then an
               | update was pushed and it ran like rubbish again. The game
               | runs smoothly when initially running the map and then
               | massive dip in frames for several seconds. This is
               | usually when one of the bugs is jumping at you.
               | 
               | This game may work better on Fedora/Bazzite or <some
               | other distro> but I find Debian to be super reliable and
               | don't want to switch distro. I also don't like Fedora
               | generally as I've found it unreliable in the past. I had
               | a look at Bazzite and I honestly just wasn't interested.
               | This is due to it having a bunch of technologies that I
               | have no interest in using.
               | 
               | There are other issues that are tangential but related
               | issues.
               | 
               | e.g.
               | 
               | I normally play on Super HellDive with other players in a
               | Discord VC. Discord / Pipewire seems to reset my sound
               | for no particular reason and my Plantronics Headset Mic
               | (good headset, not some gamer nonsense) will be not
               | found. This requires a restart of pipewire/wireplumber
               | and Discord (in that order). This happens often enough I
               | have a shell script alias called "fix_discord".
               | 
               | I have weird audio problems on HDMI (AMD card) thanks to
               | a regression in the kernel (Kernel 6.1 with Debian worked
               | fine).
               | 
               | I could mess about with this for ages and maybe get it
               | working or just reboot into Windows which takes me all of
               | a minute.
               | 
               | It is just easier to use Windows for Gaming. Then use
               | Linux for work stuff.
        
               | esseph wrote:
               | I used Debian for about 15 years.
               | 
               | Honestly? Fedora is really the premier Linux distro these
               | days. It's where the most the development is happening,
               | by far.
               | 
               | All of my hardware, some old, some brand new (AMD card),
               | worked flawlessly out of the box.
               | 
               | There was a point when you couldn't get me to use an rpm-
               | based distro if my life depended on it. That time is long
               | gone.
        
               | PoshBreeze wrote:
               | I don't want to use Fedora. Other than I've found it
               | unreliable I switched to Debian because I was fed up of
               | all the Window-isms/Corporate stuff in the distro that
               | was enabled by default that I was trying to get away
               | from.
               | 
               | It the same reason I don't want to use Bazzite. It misses
               | the point of using a Linux/Unix system altogether.
               | 
               | I also learned a long time ago Distro Hopping doesn't
               | actually fix your issues. You just end up either with the
               | same issues or different ones. If I switched from Debian
               | to Fedora, I suspect I would have many of the same
               | issues.
               | 
               | e.g. If a issue is in the Linux kernel itself such as
               | HDMI Audio on AMD cards having random noise, I fail to
               | see how changing from one distro to another would help.
               | Fedora might have a custom patch to fix this, however I
               | could also take this patch and make my own kernel image
               | (which I've done in the past btw).
               | 
               | The reality is that most people doing development for
               | various project / packages that make the Linux desktop
               | don't have the setup I have and some of the peculiarities
               | I am running into. If I had a more standard setup, I
               | wouldn't have an issue.
               | 
               | Moreover, I would be using FreeBSD/OpenBSD or some other
               | more traditional Unix system and ditch Linux if I didn't
               | require some Linux specific applications. I am
               | considering moving to something like Artix / Devuan in
               | the future if I did decide to switch.
        
           | surgical_fire wrote:
           | > It does sorta require Windows.
           | 
           | The vast majority of my gaming library runs fine on Linux.
           | Older games might run better than on Windows, in fact.
        
             | JeremyNT wrote:
             | True for single player, but if you're into multiplayer
             | games anti-cheat is an issue.
        
               | surgical_fire wrote:
               | If a game requires invasive anticheat, it is probably
               | something I won't enjoy playing. Most likely the game
               | will be full of cheaters anyway.
               | 
               | And yes, I rarely play anything online multiplayer.
        
               | akimbostrawman wrote:
               | multiplayer games with anti cheat are the minority and of
               | those about 40% do work
               | 
               | areweanticheatyet.com
        
           | esseph wrote:
           | Proton/Steam/ Linux works damn nearly flawlessly for /most/
           | games. I've gone through a Nvidia 2060, a 4060, and now an
           | AMD 6700 XT. No issues even for release titles at launch.
        
             | jabwd wrote:
             | What version of Linux do you run for that? I've had issues
             | getting Fedora or Ubuntu or Mint to work with my Xbox
             | controller + Bluetooth card combo, somehow Bazzite doesn't
             | have these issues even though its based on Fedora and I
             | don't know what I did wrong with the other distros.
        
         | bob1029 wrote:
         | > I have also realized that there is a lot out there in the
         | world besides video games
         | 
         | My favorite part about being a reformed gaming addict is the
         | fact that my MacBook now covers ~100% of my computer use cases.
         | The desktop is nice for Visual Studio but that's about it.
         | 
         | I'm still running a 5700XT in my desktop. I have absolutely
         | zero desire to upgrade.
        
           | Finnucane wrote:
           | Same here. I got mine five years ago when I needed to upgrade
           | my workstation to do work-from-home, and it's been entirely
           | adequate since then. I switched the CPU from an AMD 3900 to a
           | 5900, but that's the only upgrade. The differences from one
           | generation to the next are pretty marginal.
        
           | leoapagano wrote:
           | Same here - actually, my PC broke in early 2024 and I still
           | haven't fixed it. I quickly found out that without gaming, I
           | no longer have any use for my PC, so now I just do everything
           | on my MacBook.
        
           | nicce wrote:
           | > I'm still running a 5700XT in my desktop. I have absolutely
           | zero desire to upgrade.
           | 
           | Same boat. I have 5700XT as well and since 2023, used mostly
           | my Mac for gaming.
        
           | pshirshov wrote:
           | PCI reset bug makes it necessary to upgrade to 6xxx series at
           | least.
        
           | __turbobrew__ wrote:
           | Im a reformed gaming addict as well and mostly play games
           | over 10 years old, and am happy to keep doing that.
        
           | Mars008 wrote:
           | Still have 2080 RTX on primary desktop, it's more than enough
           | for GUI.
           | 
           | Just got PRO 6000 96GB for models tuning/training/etc. The
           | cheapest 'good enough' for my needs option.
        
             | jabwd wrote:
             | Is this like a computational + memory need? Otherwise one
             | would think something like the framework desktop or a mac
             | mini would be a better choice right?
        
           | ThatPlayer wrote:
           | Put Linux on it, and you can even run software raytracing on
           | it for games like Indiana Jones! It'll do something like ~70
           | fps medium 1080p IIRC.
           | 
           | No mesh shader supports though. I bet more games will start
           | using that soon
        
             | sunnybeetroot wrote:
             | I don't think a reformed gaming addict wants to be tempted
             | with another game :P
        
           | nozzlegear wrote:
           | The only video game I've played with any consistency is World
           | of Warcraft, which runs natively on my Mac. Combined with
           | Rider for my .NET work, I couldn't be happier with this
           | machine.
        
           | int_19h wrote:
           | Parallels is great for running Windows software on Mac.
           | Ironically, what with the Microsoft push for Windows on ARM,
           | increasingly more Windows software gets native ARM64 builds
           | which are great for Parallels on Apple Silicon. And Visual
           | Studio specifically is one of those.
        
         | surgical_fire wrote:
         | > I have also realized that there is a lot out there in the
         | world besides video games
         | 
         | My main hobby is videogames, but since I can consistently play
         | most games on Linux (that has good AMD support), it doesn't
         | really matter.
        
           | kassner wrote:
           | Steam+Proton has been so incredibly good in the last year
           | that I'm yet to install Windows on my gaming PC. I really do
           | recommend anyone to try out that option first.
        
         | mathiaspoint wrote:
         | AMD isn't even bad at video games, it's just pytorch that
         | doesn't work so well.
        
           | kyrra wrote:
           | Frame per watt they aren't as good. But they are still
           | decent.
        
             | msgodel wrote:
             | TCO per FPS is almost certainly cheaper.
        
             | trynumber9 wrote:
             | They seem to be close? The RX 9070 is the 2nd most
             | efficient graphics card this generation according to
             | TechPowerUp and they also do well when limited to 60Hz,
             | implying their joules per frame isn't bad either.
             | 
             | Efficiency: https://tpucdn.com/review/gigabyte-geforce-
             | rtx-5050-gaming-o...
             | 
             | Vsync power draw: https://tpucdn.com/review/gigabyte-
             | geforce-rtx-5050-gaming-o...
             | 
             | The variance within Nvidia's line-up is much larger than
             | the variance between brands, anyway.
        
               | docmars wrote:
               | The RX 9070XT goes toe-to-toe with the RTX 4080 in many
               | benchmarks, and costs around 2/3 MSRP. I'd say that's a
               | pretty big win!
        
               | tankenmate wrote:
               | I run 9070s (non XT) and in combination with under-
               | volting it is very efficient in both joules per frame and
               | joules per token. And in terms of purchase price it was a
               | steal compared to similar class of NVidia cards.
        
         | darkoob12 wrote:
         | I am not a gamer and don't why AMD GPUs aren't good enough.
         | It's weird since both Xbox and PlayStation are using AMD GPUs.
         | 
         | I guess there games that you can only play on PC with Nvidia
         | graphics. That begs the question why someone create a game and
         | ignore large console market.
        
           | PoshBreeze wrote:
           | Nvidia is the high end, AMD is the mid segment and Intel is
           | the low end. In reality I am playing 4K on HellDivers with
           | 50-60FPS on a 6800XT.
           | 
           | Traditionally the NVIDIA drivers have been more stable on
           | Windows than the AMD drivers. I choose an AMD card because I
           | wanted a hassle free experience on Linux (well as much as you
           | can).
        
           | ErrorNoBrain wrote:
           | ive used an amd card for a couple years
           | 
           | its been great. flawless in fact.
        
             | sfn42 wrote:
             | Same. Bought a 6950xt for like $800ish or something like
             | that a few years ago and it's been perfect. Runs any game I
             | want to play on ultra 1440p with good fps. No issues.
             | 
             | Maybe there's a difference for the people who buy the
             | absolute top end cards but I don't. I look for best value
             | and when I looked into it amd looked better to me. Also got
             | an amd CPU which has aso been great.
        
           | senko wrote:
           | > AMD GPUs aren't good enough.
           | 
           | Software. AMD has traditionally been really bad at their
           | drivers. (They also missed the AI train and are trying to
           | catch up).
           | 
           | I use Linux and have learned not to touch AMD GPUs (and to a
           | lesser extent CPUs due to chipset quality/support) a long
           | time ago. Even if they are better now, (I feel) Intel
           | integrated (if no special GPU perf needed) or NVidia are less
           | risky choices.
        
             | rewgs wrote:
             | > I use Linux and have learned not to touch AMD GPUs (and
             | to a lesser extent CPUs due to chipset quality/support) a
             | long time ago. Even if they are better now, (I feel) Intel
             | integrated (if no special GPU perf needed) or NVidia are
             | less risky choices.
             | 
             | Err, what? While you're right about Intel integrated GPUs
             | being a safe choice, AMD has long since been the GPU of
             | choice for Linux -- it just works. Whereas Nvidia on Linux
             | has been flaky for as long as I can remember.
        
               | michaelmrose wrote:
               | They have never been flaky on the x11 desktop
        
               | senko wrote:
               | Had major problems with xinerama, suspend/resume, vsync,
               | probably a bunch of other stuff.
               | 
               | That said, I've been avoiding AMD in general for so long
               | the ecosystem might have really improved in the meantime,
               | as there was no incentive for me to try and switch.
               | 
               | Recently I've been dabbling in AI where AMD GPUs (well,
               | sw ecosystem, really) are lagging behind. Just wasn't
               | worth the hassle.
               | 
               | NVidia hw, once I set it up (which may be a bit
               | involved), has been pretty stable for me.
        
               | tankenmate wrote:
               | I run llama.cpp using Vulkan and AMD CPUs, no need to
               | install any drivers (or management software for that
               | matter, nor any need to taint the kernel meaning if I
               | have an issue it's easy to get support). For example the
               | other day when a Mesa update had an issue I had a fix in
               | less than 36 hours (without any support contract or fees)
               | and `apt-mark hold` did a perfect job until there was a
               | fix. Performance for me is within a couple of % points,
               | and with under-volting I get better joules per token.
        
               | homebrewer wrote:
               | > I've been avoiding AMD in general
               | 
               | I have no opinion on GPUs (I don't play anything released
               | later than about 2008), but Intel CPUs have had more
               | problems over the last five years than AMD, including
               | disabling the already limited support for AVX-512 after
               | release and simply burning themselves to the ground to
               | get an easy win in initial benchmarks.
               | 
               | I fear your perception of their products is seriously out
               | of date.
        
               | senko wrote:
               | > I fear your perception of their products is seriously
               | out of date.
               | 
               | How's the chipset+linux story these days? That was the
               | main reason for not choosing AMD CPU for me the last few
               | times I was in the market.
        
               | simion314 wrote:
               | >Err, what? While you're right about Intel integrated
               | GPUs being a safe choice, AMD has long since been the GPU
               | of choice for Linux -- it just works. Whereas Nvidia on
               | Linux has been flaky for as long as I can remember.
               | 
               | Not OP, I had same experience in the past with AMD,I
               | bought a new laptop and in 6 months the AMD decided that
               | my card is obsolete and no longer provided drivers
               | forcing me to be stuck with older kernel/X11 , so I
               | switched to NVIDIA and after 2 PC changes I still use
               | NVIDIA since the official drivers work great, I really
               | hope AMD this time is putting the effort to keep older
               | generations of cards working on latest kernels/X11 maybe
               | next card will be AMD.
               | 
               | But this is an explanations why us some older Linux users
               | have bad memories with AMD and we had good reason to
               | switch over to NVIDIA and no good reason to switch back
               | to AMD
        
             | ho_schi wrote:
             | This is wrong. For 14 years the recommendation on Linux is:
             | * Purchase always AMD.               * Purchase never
             | Nvidia.         * Intel is also okay.
             | 
             | Because the AMD drivers are good and open-source. And AMD
             | cares about bug reports. The one from Nvidia can and will
             | create issues because they're closed-source and avoided for
             | years to support Wayland. Now Nvidia published source-code
             | and refuses to merge it into Linux and Mesa _facepalm_
             | 
             | While Nvidia comes up with proprietary stuff AMD brought us
             | Vulkan, FreeSync, supported Wayland well already with
             | Implicit-Sync (like Intel) and used the regular Video-
             | Acceleration APIs for long time.
             | 
             | Meanwhile Nvidia:
             | 
             | https://registry.khronos.org/OpenGL/extensions/NV/NV_robust
             | n...                   It's not a bug, it's a feature!
             | 
             | Their bad drivers still don't handle simple actions like a
             | VT-Switch or Suspend/Resume. If a developer doesn't know
             | about that extension the users suffer for years.
             | 
             | Okay. But that is probably only a short term solution? It
             | is Nvidias short term solution since 2016!
             | 
             | https://www.phoronix.com/news/NVIDIA-Ubuntu-2025-SnR
        
               | josephg wrote:
               | I've been using a 4090 on my linux workstation for a few
               | years now. Its mostly fine - with the occasional bad
               | driver version randomly messing things up. I'm using
               | linux mint. Mint uses X11, which, while silly, means
               | suspend / resume works fine.
               | 
               | NVIDIA's drivers also recently completely changed how
               | they worked. Hopefully that'll result in a lot of these
               | long term issues getting fixed. As I understand it, the
               | change is this: The nvidia drivers contain a huge amount
               | of proprietary, closed source code. This code used to be
               | shipped as a closed source binary blob which needed to
               | run on your CPU. And that caused all sorts of problems -
               | because its linux and you can't recompile their binary
               | blob. Earlier this year, they moved all the secret,
               | proprietary parts into a firmware image instead which
               | runs on a coprocessor within the GPU itself. This then
               | allowed them to - at last - opensource (most? all?) of
               | their remaining linux driver code. And that means we can
               | patch and change and recompile that part of the driver.
               | And that should mean the wayland & kernel teams can start
               | fixing these issues.
               | 
               | In theory, users shouldn't notice any changes at all. But
               | I suspect all the nvidia driver problems people have been
               | running into lately have been fallout from this change.
        
               | nirv wrote:
               | No browser on Linux supports any other backend for video
               | acceleration except VAAPI, as far as I know. AMD and
               | Intel use VAAPI, while Nvidia uses VDPAU, which is not
               | supported anywhere. This single fact means that with
               | Nvidia graphics cards on Linux, there isn't even such a
               | simple and important feature for users as video decoding
               | acceleration in the browser. Every silly YouTube video
               | will use CPU (not iGPU, but CPU) to decode video,
               | consuming resources and power.
               | 
               | Yes, there are translation layers[1] which you have to
               | know about and understand how to install correctly, which
               | partially solve the problem by translating from VAAPI to
               | NVDEC, but this is certainly not for the average user.
               | 
               | Hopefully, in the future browsers will add support for
               | the new Vulkan Video standard, but for now,
               | unfortunately, one has to hardcode the browser launch
               | parameters in order to use the integrated graphics chip's
               | driver (custom XDG-application file for AMD APU in my
               | case: ~/.local/share/applications/Firefox-
               | amdgpu.desktop): `Exec=env LIBVA_DRIVER_NAME=radeonsi
               | DRI_PRIME=0 MOZ_ENABLE_WAYLAND=1
               | __NV_PRIME_RENDER_OFFLOAD=0
               | __GLX_VENDOR_LIBRARY_NAME=radeons i /usr/bin/firefox-beta
               | %u`.
               | 
               | [1] https://github.com/elFarto/nvidia-vaapi-driver/
        
               | whatevaa wrote:
               | VAAPI support in browsers is also bad and oftenly
               | requires some forcing.
               | 
               | On my Steam deck, I have to use vulkan. AV1 decoder is
               | straight up buggy, have to disable it with config or
               | extensions.
        
               | pjmlp wrote:
               | I never managed to get it working on my Netbook APU.
        
               | homebrewer wrote:
               | They opened a tiny kernel level sliver of their driver,
               | everything else (including OpenGL stack et al) is and
               | will still be closed.
               | 
               | Sadly, a couple of years ago someone seriously
               | misunderstood the news about "open sourcing" their
               | drivers and spread that misunderstanding widely; many
               | people now think their whole driver stack is open, when
               | in reality it's like 1% of the code -- the barest minimum
               | they could get away with (I'm excluding GSP code here).
               | 
               | The real FOSS driver is Nova, and it's driven by the
               | community with zero help from Nvidia, as always.
        
               | quicksilver03 wrote:
               | The AMD drivers are open source, but they definitely are
               | not good. Have a look at the Fedora discussion forums
               | (for example
               | https://discussion.fedoraproject.org/t/fedora-does-not-
               | boot-... ) to see what happens about each month.
               | 
               | I have no NVIDIA hardware, but I understand that the
               | drivers are even worse than AMD's.
               | 
               | Intel seems to be, at the moment, the least worse
               | compromise between performance and stability,
        
               | roenxi wrote:
               | Although you get to set your own standards "A bug was
               | discovered after upgrading software" isn't very
               | illuminating vis a vis quality. That does happen from
               | time to time in most software.
               | 
               | In my experience an AMD card on linux is a great
               | experience unless you want to do something AI related, in
               | which case there will be random kernel panics (which, in
               | all fairness, may one day go away - then I'll be back on
               | AMD cards because their software support on Linux was
               | otherwise much better than Nvidia's). There might be some
               | kernel upgrades that should be skipped, but using an
               | older kernel is no problem.
        
               | homebrewer wrote:
               | I have zero sympathy for Nvidia and haven't used their
               | hardware for about two decades, but amdgpu is the sole
               | reason I stick to linux-lts kernels. They introduce
               | massive regressions into every mainline release, even if
               | I delay kernel updates by several minor versions (to
               | something like x.y.5), it's still often buggy and crashy.
               | 
               | They do care about but reports, and their drivers -- when
               | given time to stabilize -- provide the best experience
               | across all operating systems (easy updates, etc), but IME
               | mainline kernels should be treated as alpha-to-beta
               | material.
        
             | jorams wrote:
             | > I use Linux and have learned not to touch AMD GPUs
             | 
             | The situation completely changed with the introduction of
             | the AMDGPU drivers integrated into the kernel. This was
             | like 10 years ago.
             | 
             | Before then the AMD driver situation on Linux was
             | atrocious. The open source drivers performed so bad you'd
             | get better performance out of Intel integrated graphics
             | than an expensive AMD GPU, and their closed source drivers
             | were so poorly updated you'd have to downgrade the entire
             | world for the rest of your software to be compatible. At
             | that time Nvidia was clearly ahead, even though the driver
             | needs to be updated separately and they invented their own
             | versions of some stuff.
             | 
             | With the introduction of AMDGPU and the years after that
             | everything changed. AMD GPUs now worked great without any
             | effort, while Nvidia's tendency to invent their own things
             | really started grating. Much of the world started moving to
             | Wayland, but Nvidia refused to support some important
             | common standards. Those that really wanted their stuff to
             | work on Nvidia had to introduce entirely separate code
             | paths for it, while other parts of the landscape refused to
             | do so. This started improving again a few years ago, but
             | I'm not aware of the current state because I now only use
             | Intel and AMD hardware.
        
               | MegaDeKay wrote:
               | I use the amdgpu driver and my luck has not been as good
               | as yours. Can't sleep my PC without having it wake up to
               | fill my logs with spam [0] and eventually crash.
               | 
               | Then there is the (in)famous AMD reset bug that makes AMD
               | a real headache to use with GPU passthrough. The card
               | can't be properly reset when the VM shuts down so you
               | have to reboot the PC to start the VM a second time.
               | There are workarounds but they only work on some cards &
               | scenarios [1] [2]. This problem goes back to around the
               | 390 series cards so they've had forever to properly
               | implement reset according to the pci spec but haven't.
               | nvidia handles this flawlessly
               | 
               | [0] https://gitlab.freedesktop.org/drm/amd/-/issues/3911
               | 
               | [1] https://github.com/gnif/vendor-reset
               | 
               | [2] https://github.com/inga-lovinde/RadeonResetBugFix
        
               | eptcyka wrote:
               | I was under the impression that nvidia just didn't let
               | consumer cards do GPU passthrough.
        
               | pjmlp wrote:
               | The open source driver for the Netboooks APU was never as
               | good as either the Windows version, or the closed source
               | that predated it.
               | 
               | Lesser OpenGL version, and I never managed to have
               | hardware accelerated video until it died last year.
        
           | datagram wrote:
           | AMD cards are fine from a raw performance perspective, but
           | Nvidia has built themselves a moat of software/hardware
           | features like ray-tracing, video encoding, CUDA, DLSS, etc
           | where AMD's equivalents have simply not been as good.
           | 
           | With their current generation of cards AMD has caught up on
           | all of those things except CUDA, and Intel is in a similar
           | spot now that they've had time to improve their drivers, so
           | it's pretty easy now to buy a non-Nvidia card without feeling
           | like you're giving anything up.
        
             | SSLy wrote:
             | AMD RT is still slower than Nvidia's.
        
             | jezze wrote:
             | I have no experience of using it so I might be wrong but
             | AMD has ROCm which has something called HIP that should be
             | comparable to CUDA. I think it also has a way to
             | automatically translate CUDA calls into HIP as well so it
             | should work without the need to modify your code.
        
               | StochasticLi wrote:
               | it's mostly about AI training at this point. the software
               | for this only supports CUDA well.
        
               | tankenmate wrote:
               | `I think it also has a way to automatically translate
               | CUDA calls`
               | 
               | I suspect the thing you're referring to is ZLUDA[0], it
               | allows you to run CUDA code on a range of non NVidia
               | hardware (for some value of "run").
               | 
               | [0] https://github.com/vosen/ZLUDA
        
               | smallmancontrov wrote:
               | For an extremely flexible value of "run" that you would
               | be extremely unwise to allow anywhere near a project
               | whose success you have a stake in.
        
               | tankenmate wrote:
               | To quote "The Dude"; "Well ... ummm ... that's ... ahh
               | ... just your opinion man". There are people who are
               | successfully running it in production, but of course
               | depending on your code, YMMV.
        
               | Almondsetat wrote:
               | AMD "has" ROCm just like Intel "has" AVX-512
        
               | whatevaa wrote:
               | Consumer card ROCm support is straight up garbage. CUDA
               | support project was also killed.
               | 
               | AMD doesn't care about consumers anymore either. All the
               | money in AI.
        
               | MangoToupe wrote:
               | > AMD doesn't care about consumers anymore either. All
               | the money in AI.
               | 
               | I mean, this also describes the quality of NVIDIA cards.
               | And their drivers have been broken for the last two
               | decades if you're not using windows.
        
           | npteljes wrote:
           | What I experienced is that AI is a nightmare on AMD in Linux.
           | There is a myriad of custom things that one needs to do, and
           | even that just breaks after a while. Happened so much on my
           | current setup (6600 XT) that I don't bother with local AI
           | anymore, because the time investment is just not worth it.
           | 
           | It's not that I can't live like this, I still have the same
           | card, but if I were looking to do anything AI locally with a
           | new card, for sure it wouldn't be an AMD one.
        
             | eden-u4 wrote:
             | I don't have much experience with ROCm for large trainings,
             | but NVIDIA is still shit with driver+cuda version+other
             | things. The only simplification is due to ubuntu and other
             | distros that already do the heavy lift by installing all
             | required components, without much configuration.
        
               | npteljes wrote:
               | Oh I'm sure. The thing is that with AMD I have the same
               | luxury, and the wretched thing still doesn't work, or has
               | regressions.
        
               | int_19h wrote:
               | On Ubuntu, in my experience, installing the .deb version
               | of the CUDA toolkit pretty much "just works".
        
             | phronimos wrote:
             | Are you referring to AI training, prediction/inference, or
             | both? Could you give some examples for what had to be done
             | and why? Thanks in advance.
        
               | npteljes wrote:
               | Sure! I'm referring to setting up a1111's stable
               | diffusion webui, and setting up Open WebUI.
               | 
               | Wrt/ a1, it worked at one point (a year ago) after 2-3
               | hours of tinkering, then regressed to not working at all,
               | not even from fresh installs on new, different Linuxes. I
               | tried the main branch and the AMD specific fork as well.
               | 
               | Wrt/ Open WebUI, it works, but the thing uses my CPU.
        
             | FredPret wrote:
             | I set up a deep learning station probably 5-10 years ago
             | and ran into the exact same issue. After a week of pulling
             | out my hair, I just bought an Nvidia card.
        
           | Cthulhu_ wrote:
           | AMD GPU's are fine, but nvidia's marketing (overt and covert
           | / word-of-mouth) is better. "RTX On" is a meme where people
           | get convinced the graphics are over 9000x "better"; it's a
           | meaningless marketing expression but a naive generation of
           | fairly new PC gamers are eating it up.
           | 
           | And... they don't need to. Most of the most played video
           | games on PC are all years old [0]. They're online multiplayer
           | games that are optimized for average spec computers (and
           | mobile) to capture as big a chunk of the potential market as
           | possible.
           | 
           | It's flexing for clout, nothing else to it. And yet, I can't
           | say it's anything new, people have been bragging, boasting
           | and comparing their graphics cards for decades.
           | 
           | [0] https://activeplayer.io/top-15-most-popular-pc-games-
           | of-2022...
        
             | keyringlight wrote:
             | One thing I wonder about is whether PC gaming is splitting
             | into two distinct tiers, high end for those with thousands
             | to spend on their rig and studios who are pathfinders (id,
             | Remedy, 4A, etc) in graphics, then the wider market for
             | cheaper/older systems and studios going for broad appeal. I
             | know the market isn't going to be neatly divided and more
             | of a blurry ugly continuum.
             | 
             | The past few years (2018 with the introduction of RT and
             | upscaling reconstruction seems as good a milestone as any)
             | feel like a transition period we're not out of yet, similar
             | to the tail end of the DX9/Playstation3/Xbox360 era when
             | some studios were moving to 64bit and DX11 as optional
             | modes, almost like PC was their prototyping platform for
             | when they made completed the jump with PS4/Xbox one and
             | more mature PC implementations. It wouldn't surprise me if
             | it takes more years and titles built targeting the next
             | generation consoles before it's all settled.
        
               | phatfish wrote:
               | Once the "path tracing" that the current top end Nvidia
               | cards can pull off reaches mainstream it will settle
               | down. The PS6 isn't going to be doing path tracing
               | because the hardware for that is being decided now. I'd
               | guess PS7 time frame. It will take console level hardware
               | pricing to bring the gaming GPU prices down.
               | 
               | I understand the reason for moving to real time ray-
               | tracing. It is much easier for development, and
               | apparently the data for baked/pre-rendered lighting in
               | these big open worlds was getting out of hand. Especially
               | with multiple time-of-day passes.
               | 
               | But, it is only the "path tracing" that top end Nvidia
               | GPUs can do that matches baked lighting detail.
               | 
               | The standard ray-tracing in the latest Doom for instance
               | has a very limited number of entities that actually emit
               | light in a scene. I guess there is the main global
               | illumination source, but many of the extra lighting
               | details in the scene don't emit light. This is a step
               | backward compared to baked lighting.
               | 
               | Even shots from the plasma weapon don't cast any light
               | into the scene with the standard ray-tracing, which Quake
               | 3 was doing.
        
           | wredcoll wrote:
           | A significant part of the _vocal_ "gamers" is about being
           | "the best" which translates into gpu benchmarking.
           | 
           | You don't get headlines and hype by being an affordable way
           | to play games at a decent frame rate, you achieve it by
           | setting New Fps Records.
        
         | stodor89 wrote:
         | > I have also realized that there is a lot out there in the
         | world besides video games, and getting all in a huff about it
         | isn't worth my time or energy.
         | 
         | I think more and more people will realize games are a waste of
         | time for them and go on to find other hobbies. As a game
         | developer, it kinda worries me. As a gamer, I can't wait for
         | gaming to be a niche thing again, haha.
        
           | esseph wrote:
           | "it's just a fad"
           | 
           | Nah. Games will always be around.
        
             | stodor89 wrote:
             | Of course they will. People play since before they were
             | people.
        
           | whatevertrevor wrote:
           | The games industry is now bigger than the movies industry. I
           | think you're very wrong about this, as games are engaging in
           | a way other consumption based media simply cannot replicate.
        
             | padjo wrote:
             | I played video games since I was a teenager. Loved them,
             | was obsessed with them. Then sometime around 40 I just gave
             | up. Not because of life pressure or lack of time but
             | because I just started to find them really boring and
             | unfulfilling. Now I'd much rather watch movies or read. I
             | don't know if the games changed or I changed.
        
               | whatevertrevor wrote:
               | I get that, I go through periods of falling in and out of
               | them too after having grown up with them. But there is a
               | huge fraction of my age group (and a little older) that
               | have consistently had games as their main "consumption"
               | hobby throughout.
               | 
               | And then there's the age group younger than me, for whom
               | games are not only a hobby but also a "social place to
               | be", I doubt they'll be dropping gaming entirely easily.
        
               | FredPret wrote:
               | I'm an ex-gamer, but I remember games in the 90's and
               | earlier 00's being much more respecting of one's time.
               | 
               | You could still sink a ton of time into it if you wanted
               | do, but you could also crank out a decent amount of fun
               | in 5-15 minutes.
               | 
               | Recently games seem to have been optimized to maximize
               | play time rather than for fun density.
        
               | int_19h wrote:
               | I would strongly disagree. If anything, it's the other
               | way around - a typical 90s game had a fairly steep
               | learning curve. Often no tutorials whatsoever, difficulty
               | could be pretty high from the get go, players were
               | expected to essentially learn through trial and error and
               | failing a lot. Getting familiar enough with the game
               | mechanics to stop losing all the time would often take a
               | while, and could be frustrating while it lasted.
               | 
               | These days, AAA games are optimized for "reduced
               | friction", which in practice usually means dumbing down
               | the mechanics and the overall gameplay to remove
               | everything that might annoy or frustrate the player. I
               | was playing Avowed recently and the sheer amount of
               | convenience features (e.g. the entire rest / fast travel
               | system) was boggling.
        
           | immibis wrote:
           | Fortunately for your business model, there's a constant
           | stream of new people to replace the ones who are aging out.
           | But you have to make sure your product is appealing to them,
           | not just to the same people who bought it last decade.
        
         | klipklop wrote:
         | You are certainly right that this group has little spending
         | self-control. There is no limit just about to how abusive
         | companies like Hasbro, Nvidia and Nintendo can be and still
         | rake in record sales.
         | 
         | They will complain endlessly about the price of a RTX 5090 and
         | still rush out to buy it. I know people that own these high end
         | cards as a flex, but their lives are too busy to actually play
         | games.
        
           | kevincox wrote:
           | I'm not saying that these companies aren't charging "fair"
           | prices (whatever that means) but for many hardcore gamers
           | their spending per hour is tiny compared to other forms of
           | entertainment. They may buyba $100 game and play to for over
           | 100 hours. Maybe add another $1/hour for the console.
           | Compared to someone who frequents the cinema goes to the pub
           | or does many other common hobbies and it can be hard to say
           | that games are getting screwed.
           | 
           | Now it is hard to draw a straight comparison. Gamers may
           | spend a lot more time playing so $/h isn't a perfect metric.
           | And some will frequently buy new games or worse things like
           | microtransactions which quickly skyrocket the cost. But
           | overall it doesn't seem like the most expensive hobby,
           | especially if you are trying to spend less.
        
             | dgellow wrote:
             | Off-topic: micro transactions are just digital
             | transactions. There is nothing micro about them. I really
             | wish that term would just die
        
           | fireflash38 wrote:
           | It's because it's part and parcel of their identity. Being
           | able to play the latest games, often with their friends, is
           | critical to their social networks.
        
         | duckmysick wrote:
         | > I have also realized that there is a lot out there in the
         | world besides video games, and getting all in a huff about it
         | isn't worth my time or energy.
         | 
         | I'd really love to try AMD as a daily driver. For me CUDA is
         | the showstopper. There's really nothing comparable in the AMD
         | camp.
        
           | delusional wrote:
           | ROCM is, to some degree and in some areas, a pretty decent
           | alternative. Developing with it is often times a horrible
           | experience, but once something works, it works fine.
        
             | pixelesque wrote:
             | > but once something works, it works fine.
             | 
             | Is there "forwards compatibility" to the same code working
             | on the next cards yet like PTX provided Nvidia?
             | 
             | Last time (4 years ago?) I looked into ROCM, you seemed to
             | have to compile for each revision of each architecture.
        
               | delusional wrote:
               | I'm decently sure you have to compile separately for each
               | architecture, and if you elect to compile for multiple
               | architectures up front, you'll have excruciating compile
               | times. You'd think that would be annoying, but it ends up
               | not really mattering since AMD completely switches out
               | the toolchain about every graphics generation anyway.
               | That's not a good reason to not have forwards
               | compatibility, but it is a reason.
               | 
               | The reason I'm not completely sure is because I'm just
               | doing this as a hobby, and I only have a single card, and
               | that single card has never seen a revision. I think
               | that's generally the best way to be happy with ROCM.
               | Accept that it's at the abstraction level of embedded
               | programming, any change in the hardware will have to
               | result in a change in the software.
        
         | flohofwoe wrote:
         | > I have also realized that there is a lot out there in the
         | world besides video games
         | 
         | ...and even if you're all in on video games, there's a massive
         | amount of really brilliant indie games on Steam that run just
         | fine on a 1070 or 2070 (I still have my 2070 and haven't found
         | a compelling reason to upgrade yet).
        
         | xg15 wrote:
         | > _I have also realized that there is a lot out there in the
         | world besides video games, and getting all in a huff about it
         | isn't worth my time or energy._
         | 
         | I'm with you - in principle. Capital-G "Gamers" who turned
         | gaming into an identity and see themselves as the _real_
         | discriminated group have fully earned the ridicule.
         | 
         | But I think where the criticism is valid is how NVIDIA's
         | behavior is part of the wider enshittification trend in tech.
         | Lock-in and overpricing in entertainment software might be
         | annoying but acceptable, but it gets problematic when we have
         | the exact same trends in actually critical tech like phones and
         | cars.
        
         | reissbaker wrote:
         | I want to love AMD, but they're just... mediocre. Worse for
         | gaming, and _much_ worse for ML. They 're better-integrated
         | into Linux, but given that the entire AI industry runs on:
         | 
         | 1. Nvidia cards
         | 
         | 2. Hooked up to Linux boxes
         | 
         | It turns out that Nvidia tends to work pretty well on Linux
         | too, despite the binary blob drivers.
         | 
         | Other than gaming and ML, I'm not sure what the value of
         | spending much on a GPU is... AMD is just in a tough spot.
        
           | const_cast wrote:
           | Price-per-price AMD typically has better rasterization
           | performance in comparison to nvidia. The only price point
           | where this doesn't hold true is the very tippy top, which, I
           | think, most people aren't at. Nvidia does have DLSS which I
           | hear is quite good these days. But I know for me personally,
           | I just try to buy the GPU with the best rasterization
           | performance at my price point, which is always AMD.
        
         | dgellow wrote:
         | > when they are exploited instead of just walking away and
         | doing something else.
         | 
         | You don't even have to walk away. You pretty much never need
         | the latest GPUs to have a great gaming experience
        
         | artursapek wrote:
         | I just learned MTG this year because my 11 year old son got
         | into it. I like it. How did it "go to shit"?
        
           | zaneyard wrote:
           | If you don't care about competitive balance or the "identity"
           | of magic it probably didn't.
           | 
           | Long answer: the introduction of non-magic sets like
           | SpongeBob SquarePants, Deadpool, or Assassin's Creed are seen
           | as tasteless money grabs that dilute the quality and theme of
           | magic even further, but fans of those things will scoop them
           | up.
           | 
           | The competitive scene has been pretty rough, but I haven't
           | played constructed formats in a while so I'm not as keyed
           | into this. I just know that there have been lots of cards
           | released recently that have had to be banned for how powerful
           | they were.
           | 
           | Personally, I love the game, but I hate the business model.
           | It's ripe for abuse and people treat cards like stocks to
           | invest in.
        
             | artursapek wrote:
             | yeah I hate that Lego has been doing this too. most new
             | sets are co-branded garbage.
        
           | __turbobrew__ wrote:
           | Don't let my opinion affect you, MTG is still a fun game and
           | you should do that if you find it enjoyable -- especially if
           | your son likes it. But here is why I had a falling out:
           | 
           | 1. The number of sets per year increased too much, there are
           | too many cards being printed to keep up
           | 
           | 2. Cards from new sets are pushed to be very strong (FIRE
           | design) which means that the new cards are frequently the
           | best cards. Combine this with the high number of new sets
           | means the pool of best cards is always churning and you have
           | to constantly be buying new cards to keep up.
           | 
           | 3. Artificial scarcity in print runs means that the best
           | cards in the new sets are very expensive. We are talking
           | about cardboard here, it isn't hard to simply print more
           | sheets of a set.
           | 
           | 4. The erosion of the brand identity and universe. MTG used
           | to have a really nicely curated fantasy universe and things
           | meshed together well. Now we have spongebob, deadpool, and a
           | bunch of others in the game. It like if you put spongebob in
           | the star wars universe, it just ruins the texture of the
           | game.
           | 
           | 5. Print quality of cards went way down. Older cards actually
           | have better card stock than the new stuff.
           | 
           | 6. Canadians MTG players get shafted. When a new set is
           | printed stores get allocations of boxes (due to the
           | artificial scarcity) and due to the lower player count in
           | Canada, usually Canadian stores get much lower allocations
           | than their USA counterparts. Additionally, MTG cards get
           | double tariffs as they get printed outside of the USA,
           | imported into the USA and tariffed, and then imported into
           | Canada and tariffed again. I think the cost of MTG cards when
           | up like 30-40% since global trade war.
           | 
           | Overall it boils down to hasbro turning the screws on players
           | to squeeze more money, and I am just not having it. I already
           | spent obscene amounts of money on the game before this all
           | happened.
        
             | hadlock wrote:
             | > 1. The number of sets per year increased too much, there
             | are too many cards being printed to keep up
             | 
             | My local shop has an entire wall of the last ~70 sets,
             | everything from cyberpunk ninjas to gentlemen academic
             | fighting professors to steampunk and everything in between.
             | I think they are releasing ~10 sets per year on average? 4
             | main ones and then a bunch of effectively novelty ones. I
             | hadn't been in a store in years (most of my stuff is 4th
             | edition from the late 1990s) I did pull the trigger on the
             | Final Fantasy novelty set recently though, for nostalgia's
             | sake.
             | 
             | But yeah it's overwhelming, as a kid I was used to a new
             | major set every year and a half or so with a handful of new
             | cards. 10 sets a year makes it feel futile to participate.
        
               | artursapek wrote:
               | "There's an infinite amount of cash at the Federal
               | Reserve"
        
         | scarface_74 wrote:
         | And even if you ignore AMD, most PCs being sold are cheap
         | computers using whatever integrated hardware Intel is selling
         | for graphics.
        
         | notnullorvoid wrote:
         | If I hadn't bought a 3090 when they were 1k new, I likely
         | would've switched back onto the AMD train by now.
         | 
         | So far there hasn't been enough of a performance increase for
         | me to upgrade either for gaming or ML. Maybe AMDs rumored 9090
         | will be enough to get me to open my wallet.
        
         | witnessme wrote:
         | Couldn't agree more
        
       | snitty wrote:
       | NVIDIA is, and will be for at least the next year or two, supply
       | constrained. They only have so much capacity at TSMC for all the
       | chips, and the lion's share of that is going to be going
       | enterprise chips, which sell for an order of magnitude more than
       | the consumer chips.
       | 
       | It's hard to get too offended by them shirking the consumer
       | marker right now when they're printing money with their
       | enterprise business.
        
         | wmf wrote:
         | They could be more honest about it though.
        
         | davidee wrote:
         | Not personally offended, but when a company makes a big stink
         | around several gross exaggerations (performance, price,
         | availability) it's not hard to understand why folks are kicking
         | up their own stink.
         | 
         | Nvidia could have said "we're prioritizing enterprise" but
         | instead they put on a big horse and pony show about their
         | consumer GPUs.
         | 
         | I really like the Gamer's Nexus paper launch shirt. ;)
        
           | nicce wrote:
           | They could rapidly build new own factories but they don't.
        
             | axoltl wrote:
             | Are you saying Nvidia could spin up their own chip fabs in
             | short order?
        
               | benreesman wrote:
               | If they believed they were going to continue selling AI
               | chips at those margins they would:
               | 
               | - outbid Apple on new nodes
               | 
               | - sign commitments with TSMC to get the capacity in the
               | pipeline
               | 
               | - absolutely own the process nodes they made cards on
               | that are still selling way above retail
               | 
               | NVIDIA has been posting net earnings in the 60-90 range
               | over the last few years. If you think that's going to
               | continue? You book the fab capacity hell or high water.
               | Apple doesn't make those margins (which is what on paper
               | would determine who is in front for the next node).
        
               | ksec wrote:
               | And what if Nvidia booked but the order didn't come. What
               | if Nvidia's customer isn't going to commit? How expensive
               | and how much prepayment is needed for TSMC to break a new
               | Fab?
               | 
               | These are the same question Apple Fans asking Apple to
               | buy TSMC. The fact is isn't so simple. And even if Nvidia
               | were willing to pay for it TSMC wouldn't do it just for
               | Nvidia alone.
        
               | benreesman wrote:
               | Yeah, I agree my "if" is doing a lot of lifting there. As
               | in, "if Jensen were being candid and honest when he goes
               | on stage and said things".
               | 
               | Big if, I I get that.
        
               | nicce wrote:
               | Yes, if they wanted. They have had years to make that
               | decision. They have enough knowledge. Their profits are
               | measured in billions. But in order to maximize profits,
               | that is not good because it is better to throttle supply.
        
             | selectodude wrote:
             | Somebody should let Intel know.
        
         | scrubs wrote:
         | "It's hard to get too offended by them shirking the consumer"
         | 
         | BS! Nvidia isn't entitled. I'm not obligated. Customer always
         | has final say.
         | 
         | The problem is a lot of customers can't or don't stand their
         | ground. And the other side knows that.
         | 
         | Maybe you're a well trained "customer" by Nvidia just like
         | Basil Fawlty was well trained by his wife ...
         | 
         | Stop excusing bs.
        
         | msgodel wrote:
         | I was under the impression that a ton of their sales growth
         | last quarter was actually from consumers. DC sales growth was
         | way lower than I expected.
        
       | frollogaston wrote:
       | Because they won't sell you an in-demand high-end GPU for cheap?
       | Well TS
        
         | tiahura wrote:
         | Not to mention that they are currently in stock at my local
         | microcenter.
        
       | another_kel wrote:
       | I'm sorry but this framing is insane
       | 
       | > So 7 years into ray traced real-time computer graphics and
       | we're still nowhere near 4K gaming at 60 FPS, even at $1,999.
       | 
       | The guy is complaining that a product can't live up to his
       | standard, while dismissing barely noticeable proposed trade off
       | that can make it possible because it's <<fake>>.
        
       | jdprgm wrote:
       | The 4090 was released coming up on 3 years and is currently going
       | for about 25% over launch msrp USED. Buying gpu's is literally an
       | appreciating asset. It is complete insanity and an infuriating
       | situation for an average consumer.
       | 
       | I honestly don't know why nvidia didn't just suspend their
       | consumer line entirely. It's clearly no longer a significant
       | revenue source and they have thoroughly destroyed consumer
       | goodwill over the past 5 years.
        
         | trynumber9 wrote:
         | >I honestly don't know why nvidia didn't just suspend their
         | consumer line entirely.
         | 
         | It's ~$12 billion a year with a high gross margin by the
         | standards of every other hardware company. They want to make
         | sure neither AMD nor Intel get that revenue they can invest
         | into funding their own AI/ML efforts.
        
       | oilkillsbirds wrote:
       | Nobody's going to read this, but this article and sentiment is
       | utter anti-corporate bullshit, and the vastly congruent responses
       | show that none of you have watched the historical development of
       | GPGPU, or do any serious work on GPUs, or keep up with the open
       | work of nvidia researchers.
       | 
       | The spoiled gamer mentality is getting old for those of us that
       | actually work daily in GPGPU across industries, develop with RTX
       | kit, do AI research, etc.
       | 
       | Yes they've had some marketing and technical flubs as any giant
       | publically traded company will have, but their balance of
       | research-driven development alongside corporate profit
       | necessities is unmatched.
        
         | oilkillsbirds wrote:
         | And no I don't work for nvidia. I've just been in the industry
         | long enough to watch the immense contribution nvidia has made
         | to every. single. field. The work of their researchers is
         | astounding, it's clear to anyone that's honestly worked in this
         | field long enough. It's insane to hate on them.
        
           | grg0 wrote:
           | Their contribution to various fields and the fact that they
           | treat the average consumer like shit nowadays are not
           | mutually exclusive.
           | 
           | Also, nobody ever said they hate their researchers.
        
             | Rapzid wrote:
             | Maybe the average consumer doesn't agree they are being
             | treated like shit? Steam top 10 GPU list is almost all
             | NVidia. Happy customers or duped suckers? I've seen the
             | later sentiment a lot over the years and discounting
             | consumer's preferences never seems to lead to correct
             | prediction of outcomes..
        
               | detaro wrote:
               | Or maybe the average consumer bought them while still
               | being unhappy about the overall situation?
        
         | gdbsjjdn wrote:
         | It pains me to be on the side of "gamers" but I would rather
         | support spoiled gamers than modern LLM bros.
        
       | FeepingCreature wrote:
       | Oh man, you haven't gotten into their AI benchmark bullshittery.
       | There's factors of 4x on their numbers that are basically
       | invented whole cloth by switching units.
        
       | benreesman wrote:
       | The thing is, company culture is a real thing. And some cultures
       | are invasive/contagious like kudzu both internally to the company
       | and into adjacent companies that they get comped against. The
       | people get to thinking a certain way, they move around between
       | adjacent companies at far higher rates than to more distant parts
       | of their field, the executives start sitting on one another's
       | boards, before you know it a whole segment is enshittified, and
       | customers feel like captives in an exploitation machine instead
       | of parties to a mutually beneficial transaction in which trade
       | increases the wealth of all.
       | 
       | And you can build mythologies around falsehoods to further
       | reinforce it: "I have a legal obligation to maximize shareholder
       | value." No buddy, you have some very specific restrictions on
       | your ability to sell the company to your cousin (ha!) for a
       | handful of glass beads. You have a legal obligation to bin your
       | wafers the way it says on your own box, but that doesn't seem to
       | bother you.
       | 
       | These days I get a machine like the excellent ASUS Proart P16
       | (grab one of those before they're all gone if you can) with a
       | little 4060 or 4070 in it that can boot up Pytorch and make sure
       | the model will run forwards and backwards at a contrived size,
       | and then go rent a GB200 or whatever from Latitude or someone
       | (seriously check out Latitude, they're great), or maybe one of
       | those wildly competitive L40 series fly machines (fly whips the
       | llama's ass like nothing since Winamp, check them out too). The
       | GMTek EVO-X1 is a pretty capable little ROCm inference machine
       | for under 1000, its big brother is nipping at the heels of a DGX
       | Spark under 2k. There is good stuff out there but its all from
       | non-incumbent angles.
       | 
       | I don't game anymore but if I did I would be paying a lot of
       | attention to ARC, I've heard great things.
       | 
       | Fuck the cloud and their ancient Xeon SKUs for more than Latitude
       | charges for 5Ghz EPYC. Fuck NVIDIA gaming retail rat race, its an
       | electrical as well as moral hazard in 2025.
       | 
       | It's a shame we all have to be tricky to get what used to be a
       | halfway fair deal 5-10 years ago (and 20 years ago they passed a
       | HUGE part of the scaling bonanza down to the consumer), but its
       | possible to compute well in 2025.
        
         | glitchc wrote:
         | Nice advertorial. I hope you got paid for all of those plugs.
        
           | benreesman wrote:
           | I wish! People don't care what I think enough to monetize it.
           | 
           | But I do spend a lot of effort finding good deals on modern
           | ass compute. This is the shit I use to get a lot of
           | performance on a budget.
           | 
           | Will people pay you to post on HN? How do I sign up?
        
         | 827a wrote:
         | > Fuck the cloud and their ancient Xeon SKUs
         | 
         | Dude, no one talks about this and it drives me up the wall. The
         | only way to guarantee modern CPUs from any cloud provider is to
         | explicitly provision really new instance types. If you use any
         | higher-level abstracted services (Fargate, Cloud Run, Lambda,
         | whatever) you get salvation army second-hand CPUs from 15 years
         | ago, you're billed by the second so the slower, older CPUs
         | screw you over there, and you pay a 30%+ premium over the
         | lower-level instances because its a "managed service". Its
         | insane and extremely sad that so many customers put up with it.
        
           | benreesman wrote:
           | Bare metal is priced like it always was but is mad convenient
           | now. latitude.sh is my favorite, but there are a bunch of
           | providers that are maybe a little less polished.
           | 
           | It's also way faster to deploy and easier to operate now. And
           | mad global, I've needed to do it all over the world (a lot of
           | places the shit works flawlessly and you can get Ryzen SKUs
           | for nothing).
           | 
           | Protip: burn a partition of Ubuntu 24.04 LTS which is the
           | default on everything and use that as "premium IPMI", even if
           | you run Ubuntu. you can always boot into a known perfect
           | thing with all the tools to tweak whatever. If I have to even
           | restart on I just image it, faster than launching a VM on
           | EC2.
        
       | sonicvrooom wrote:
       | it would be "just" capitalist to call these fuckers out for real,
       | on the smallest level.
       | 
       | you are safe.
        
       | shmerl wrote:
       | _> ... NVENC are pretty much indispensable_
       | 
       | What's so special about NVENC that Vulkan video or VAAPI can't
       | provide?
       | 
       |  _> AMD also has accelerated video transcoding tech but for some
       | reason nobody seems to be willing to implement it into their
       | products_
       | 
       | OBS works with VAAPI fine. Looking forward to them adding Vulkan
       | video as an option.
       | 
       | Either way, as a Linux gamer I haven't touched Nvidia in years.
       | AMD is a way better experience.
        
       | DarkmSparks wrote:
       | I sometimes wonder if people getting this salty over "fake"
       | frames actually realise every frame is fake even in native mode.
       | Neither is more "real" than the other, it's just different.
        
       | strictnein wrote:
       | This really makes no sense:
       | 
       | > This in turn sparked rumors about NVIDIA purposefully keeping
       | stock low to make it look like the cards are in high demand to
       | drive prices. And sure enough, on secondary markets, the cards go
       | way above MSRP
       | 
       | Nvidia doesn't earn more money when cards are sold above MSRP,
       | but they get almost all the hate for it. Why would they set
       | themselves up for that?
       | 
       | Scalpers are a retail wide problem. Acting like Nvidia has the
       | insight or ability to prevent them is just silly. People may not
       | believe this, but retailers hate it as well and spend millions of
       | dollars trying to combat it. They would have sold the product
       | either way, but scalping results in the retailer's customers
       | being mad and becoming some other company's customers, which are
       | both major negatives.
        
         | kbolino wrote:
         | Scalping and MSRP-baiting have been around for far too many
         | years for nVidia to claim innocence. The death of EVGA's GPU
         | line also revealed that nVidia holds most of the cards in the
         | relationship with its "partners". Sure, Micro Center and Amazon
         | can only do so much, and nVidia isn't a retailer, but they know
         | what's going on and their behavior shows that they actually
         | like this situation.
        
           | amatecha wrote:
           | Yeah wait, what happened with EVGA? (guess I can search it
           | up, of course) I was browsing gaming PC hardware recently and
           | noticed none of the GPUs were from EVGA .. I used to buy
           | their cards because they had such a good warranty policy (in
           | my experience)... :\
        
             | theshackleford wrote:
             | In 2022 claiming a lack of respect from Nvidia, low
             | margins, and Nvidia's control over partners as just a few
             | of the reasons, EVGA ended its partnership with Nvidia and
             | ceased manufacturing Nvidia GPUs.
             | 
             | > I used to buy their cards because they had such a good
             | warranty policy (in my experience)... :\
             | 
             | It's so wild to hear this as in my country, they were not
             | considered anything special over any other third party
             | retailer as we have strong consumer protection laws which
             | means its all much of a muchness.
        
               | kbolino wrote:
               | The big bombshell IMO is that, according to EVGA at
               | least, nVidia just comes up with the MSRP for each card
               | all on its own, and doesn't even tell its partners what
               | that number will be before announcing it to the public. I
               | elaborate on this a bit more in a response to a sibling
               | comment.
        
             | izacus wrote:
             | EVGA was angry because nVidia wouldn't pay them for
             | attempts at scalping which failed.
        
               | kbolino wrote:
               | I've never seen this accusation before. I want to give
               | the benefit of the doubt but I suspect it's confusing
               | scalping with MSRP-baiting.
               | 
               | It's important to note that nVidia mostly doesn't sell or
               | even make finished consumer-grade GPUs. They own and
               | develop the IP cores, and they contract with TSMC and
               | others to make the chips, and they do make _limited runs_
               | of  "Founders Edition" cards, but most cards that are
               | available to consumers undergo final assembly and retail
               | boxing according to the specs of the partner -- ASUS,
               | GIGABYTE, MSI, formerly EVGA, etc.
               | 
               | MSRP-baiting is what happens when nVidia sets the MSRP
               | without consulting any of its partners and then those
               | partners go and assemble the graphics cards and have to
               | charge more than that to make a reasonable profit. This
               | has been going on for many GPU generations now, but it's
               | not scalping. We can question why this "partnership"
               | model even exists in the first place, since these
               | middlemen offer very little unique value vs any of their
               | competitors anymore, but again nVidia has the upper hand
               | here and thus the lion's share of the blame.
               | 
               | Scalping is when somebody who's ostensibly outside of the
               | industry buys up a bunch of GPUs at retail prices,
               | causing a supply shortage, so that they can resell the
               | cards at higher prices. While nVidia doesn't have direct
               | control over this (though I wouldn't be too surprised if
               | it came out that there was some insider involvement),
               | they also never do very much to address it either.
               | Getting all the hate for this without directly reaping
               | the monetary benefit sounds irrational at first, but
               | artificial scarcity and luxury goods mentality are real
               | business tactics.
        
               | izacus wrote:
               | Then you didn't follow the situation, since majority of
               | EVGA anger was because nVidia wouldn't buy back their
               | chips after EVGA failed to sell cards at hugely inflated
               | price point.
               | 
               | Then they tried to weaponize PR to beat nVidia into
               | buying back their unsold cores they thought they'll
               | massively profit off with inflated crypto hype prices.
        
               | kbolino wrote:
               | Ok, this seems to be based entirely on speculation. It
               | could very well be accurate but there's no statements I
               | can find from either nVidia or EVGA corroborating it.
               | Since it's done by the manufacturer themselves, it's more
               | like gouging rather than scalping.
               | 
               | But more to the point, there's still a trail of blame
               | going back to nVidia here. If EVGA could buy the cores at
               | an inflated price, then nVidia should have raised its
               | advertised MSRP to match. The reason I call it MSRP-
               | baiting is not because I care about EVGA or any of these
               | other rent-seekers, it's because it's a calculated lie
               | weaponized against the consumer.
               | 
               | As I kind of implied already, it's probably for the best
               | if this "partner" arrangement ends. There's no good
               | reason nVidia can't sell all of its desktop GPUs directly
               | to the consumer. EVGA may have bet big and lost from
               | their own folly, but everybody else was in on it too
               | (except video gamers, who got shafted).
        
               | Tijdreiziger wrote:
               | NVIDIA doesn't make a lot of finished cards for the same
               | reason Intel doesn't make a lot of motherboards,
               | presumably.
        
         | rubyn00bie wrote:
         | > Scalpers are a retail wide problem. Acting like Nvidia has
         | the insight or ability to prevent them is just silly.
         | 
         | Oh trust me, they can combat it. The easiest way, which is what
         | Nintendo often does for the launch of its consoles, is produce
         | an enormous amount of units before launch. The steady supply to
         | retailers, absolutely destroys folks ability to scalp. Yes a
         | few units will be scalped, but most scalpers will be underwater
         | if there is a constant resupply. I know this because I used to
         | scalp consoles during my teens and early twenties, and
         | Nintendo's consoles were the least profitable and most
         | problematic because they really try to supply the market. The
         | same with iPhones, yeah you might have to wait a month after
         | launch to find one if you don't pre-order but you can get one.
         | 
         | It's widely reported that most retailers had maybe tens of
         | cards per store, or a few hundred nationally, for the 5090s
         | launch. This immediately creates a giant spike in demand, and
         | drove prices up along with the incentive for scalpers. The
         | manufacturing partners immediately saw what (some) people were
         | willing to pay (to the scalpers) and jacked up prices so they
         | could get their cut. It is still so bad in the case of the 5090
         | that MSRP prices from AIBs skyrocketed 30%-50%. PNY had cards
         | at the original $1999.99 MSRP and now those same cards can't be
         | found for less than $2,999.99.
         | 
         | By contrast look at how AMD launched it's 9000 series of GPUS--
         | each MicroCenter reportedly had hundreds on hand (and it sure
         | looked like by pictures floating around). Folks were just
         | walking in until noon and still able to get a GPU on launch
         | day. Multiple restocks happened across many retailers
         | immediately after launch. Are there still some inflated prices
         | in the 9000 series GPUs? Yes, but we're not talking a 50%
         | increase. Having some high priced AIBs has always occurred but
         | what Nvidia has done by intentionally under supplying the
         | market is awful.
         | 
         | I personally have been trying to buy a 5090 FE since launch. I
         | have been awake attempting to add to cart for every drop on BB
         | but haven't been successful. I refuse to pay the inflated MSRP
         | for cards that haven't been been that well reviewed. My 3090 is
         | fine... At this point, I'm so frustrated by NVidia I'll likely
         | just piss off for this generation and hope AMD comes out with
         | something that has 32GB+ of VRAM at a somewhat reasonable
         | price.
        
           | pshirshov wrote:
           | W7900 has 48 Gb and is reasonably priced.
        
             | kouteiheika wrote:
             | It' $4.2k on Newegg; I wouldn't necessarily call it
             | reasonably priced, even compared to NVidia.
             | 
             | If we're looking at the ultra high end, you can pay double
             | that and get an RTX 6000 Pro with double the VRAM (96GB vs
             | 48GB), double the memory bandwidth (1792 GB/s vs 864 GB/s)
             | and much much better software support. Or you could get an
             | RTX 5000 Pro with the same VRAM, better memory bandwidth
             | (1344 GB/s vs 864 GB/s) at similar ~$4.5k USD from what I
             | can see (only a little more expensive than AMD).
             | 
             | Why the hell would I ever buy AMD in this situation? They
             | don't really give you anything extra over NVidia, while
             | having similar prices (usually only marginally cheaper) and
             | much, much worse software support. Their strategy was
             | always "slightly worse experience than NVidia, but $50
             | cheaper and with much worse software support"; it's no
             | wonder they only have less than 10% GPU market share.
        
           | ksec wrote:
           | >Oh trust me, they can combat it.
           | 
           | As has been explained by others. They cant. Look at the tech
           | which is used by Switch 2 and then look at the tech by Nvidia
           | 50 series.
           | 
           | And Nintendo didn't destroy scalpers, they are still in many
           | market not meeting demand despite "is produce an enormous
           | amount of units before launch".
        
             | rubyn00bie wrote:
             | If you put even a modicum of effort into trying to acquire
             | a Switch 2 you can. I've had multiple instances to do so,
             | and I don't even have interest in it yet. Nintendo even
             | sent me an email giving me a 3 day window to buy one. Yes,
             | it will require a bit of effort and patience but it's
             | absolutely possible. If you decide you want one
             | "immediately" yeah you probably are going to be S.O.L. but
             | it has literally been out a month as of today. I'd bet by
             | mid August it's pretty darn easy.
             | 
             | Nintendo has already shipped over 5 million of them. That's
             | an insane amount of supply for its first month.
             | 
             | Also, Nvidia could have released the 50-series after
             | building up inventory. Instead, they did the opposite
             | trickling supply into the market to create scarcity and
             | drive up prices. They have no real competition right now
             | especially in the high end. There was no reason to have a
             | "paper launch" except to drive up prices for consumers and
             | margins for their board partners. Process node had zero to
             | do with what has transpired.
        
           | cherioo wrote:
           | Switch 2 inventory was amazing, but how did RX 9070 inventory
           | remotely sufficient? News at the time were all about how
           | limited its availability
           | https://www.tweaktown.com/news/103716/amd-rx-9070-xt-
           | stock-a...
           | 
           | Not to mention it's nowhere to be found on Steam Hardware
           | Survey https://store.steampowered.com/hwsurvey/videocard/
        
             | Rapzid wrote:
             | The 9070 XT stock situation went about like this; I bought
             | a 5070 Ti instead.
        
         | lmm wrote:
         | > Nvidia doesn't earn more money when cards are sold above MSRP
         | 
         | How would we know if they were?
        
           | sidewndr46 wrote:
           | Theoretically they'd need to make a public filing about their
           | revenue and disclose this income stream. More to your point,
           | I think it's pretty easy to obscure this under something
           | else. My understanding is Microsoft has somehow always
           | avoided disclosing the actual revenue from the Xbox for
           | example.
        
         | adithyassekhar wrote:
         | Think of it this way, the only reason 40 series and above are
         | priced like they are is because they saw how willing people
         | were to pay dueing 30 series scalper days. This over
         | representation by the rich is training other customers that
         | nvidia gpus are worth that much so when they increase it again
         | people won't feel offended.
        
           | Mars008 wrote:
           | Is AMD doing the same? From another post in this thread:
           | 
           | > Nowadays, $650 might get you a mid-range RX 9070 XT if you
           | miraculously find one near MSRP.
           | 
           | If yes then it's industry wide phenomena.
        
           | KeplerBoy wrote:
           | Did you just casually forget about the AI craze we are in the
           | midst of? Nvidia still selling GPUs for gamers at all is a
           | surprise to be honest.
        
         | thaumasiotes wrote:
         | > Nvidia doesn't earn more money when cards are sold above
         | MSRP, but they get almost all the hate for it. Why would they
         | set themselves up for that?
         | 
         | If you believe their public statements, because they didn't
         | want to build out additional capacity and then have a huge
         | excess supply of cards when demand suddenly dried up.
         | 
         | In other words, the charge of "purposefully keeping stock low"
         | is something NVidia admitted to; there was just no theory of
         | how they'd benefit from it in the present.
        
           | rf15 wrote:
           | which card's demand suddenly dried up? Can we buy their
           | excess stock already? please?
        
             | thaumasiotes wrote:
             | I didn't say that happened. I said that was why NVidia said
             | they didn't want to ramp up production. They didn't want to
             | end up overextended.
        
               | bigyabai wrote:
               | I don't even think Nvidia _could_ overextend if they
               | wanted to. They 're buying low-margin, high demand TSMC
               | wafers to chop into _enormous_ GPU tiles or _even larger_
               | datacenter products. These aren 't smartphone chipsets,
               | they're enormous, high-power desktop GPUs.
        
         | whamlastxmas wrote:
         | Nvidia shareholders make money when share price rises.
         | Perceived extreme demand raises share prices
        
         | solatic wrote:
         | Scalpers are only a retail-wide problem if (a) factories could
         | produce more, but they calculated demand wrong, or (b)
         | factories can't produce more, they calculated demand wrong, and
         | under-priced MSRP relative to what the market is actually
         | willing to pay, thus letting scalpers capture more of the
         | profits.
         | 
         | Either way, scalping is not a problem that persists for
         | multiple years unless it's intentional corporate strategy.
         | Either factories ramp up production capacity to ensure there is
         | enough supply for launch, or MSRP rises much faster than
         | inflation. Getting demand planning wrong year after year after
         | year smells like incompetence leaving money on the table.
         | 
         | The argument that scalping is better for NVDA is coming from
         | the fact that consumer GPUs no longer make a meaningful
         | difference to the bottom line. Factory capacity is better
         | reserved for even more profitable data center GPUs. The
         | consumer GPU market exists not to increase NVDA profits
         | directly, but as a marketing / "halo" effect that promotes
         | decision makers sticking with NVDA data center chips. That
         | results in a completely different strategy where out-of-stock
         | is a feature, not a bug, and where product reputation is more
         | important than actual product performance, hence the coercion
         | on review media.
        
       | Ancapistani wrote:
       | I disagree with some of the article's points - primarily, that
       | nVidia's drivers were _ever_ "good" - but the gist I agree with.
       | 
       | I have a 4070 Ti right now. I use it for inference and VR gaming
       | on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS.
       | I'd love to be able to upgrade to a card with at least 16GB of
       | VRAM and better graphics performance... but as far as I can tell,
       | such a card does not exist at any price.
        
       | scrubs wrote:
       | Another perspective: Nvidia customer support on their mellanox
       | purchase ...is total crap. It's the worst of corporate America
       | ... paper pushing beurceatric guys who slow roll stuff ...
       | getting to a smart person behind the customer reps requires one
       | to be an ape in a bad mood 5x ... I think they're so used to that
       | now that unless you go crazy mode their take is ... well I guess
       | he wasn't serious about his ask and he dropped it.
       | 
       | Here's another nvdia/mellanox bs problem: many mlx nic cards are
       | finalized or post assembled say by hp. So if you have a hp
       | "mellanox" nic nvidia washes their hands of anything detailed.
       | It's not ours; hp could have done anything to it what do we know?
       | So one phones hp ... and they have no clue either because it's
       | really not their IP or their drivers.
       | 
       | It's a total cluster bleep and more and more why corporate
       | america sucks
        
         | grg0 wrote:
         | Corporate America actually resembles the state of government a
         | lot too. Deceptive marketing, inflated prices that leave the
         | average Joe behind, and low quality products on top of all
         | that.
        
           | scrubs wrote:
           | In the 1980s maybe a course correction was needed to help
           | capitalism. But it's over corrected by 30%. I'm not knocking
           | corporate america or capitalism in absolute terms. I am
           | saying customers have lost power... whether it's phone trees,
           | right to fix, a lack of accountability (2008 housing crisis),
           | the ability to play endless accounting games to pay lower
           | taxes plus all the more mundane things ... it's gotten out of
           | whack.
        
         | ksec wrote:
         | I have guessing you have HP "mellanox"? Because Connect-X
         | support are great.
        
           | scrubs wrote:
           | >I have guessing you have HP "mellanox"? Because Connect-X
           | support are great.
           | 
           | I'll have to take your word on that.
           | 
           | And if I take your word: ergo not Connect-X support sucks
           | 
           | So that's sucks yet again on the table ... for what the 3rd
           | time? Nvidia sucks.
        
       | spoaceman7777 wrote:
       | The real issue here is actually harebrained youtubers stirring up
       | drama for views. That's 80% of the problem. And their viewers
       | (and readers, for that which makes it into print) eat it up.
       | 
       | Idiots doing hardware installation, with zero experience, using
       | 3rd party cables incorrectly, posting to social media, and
       | youtubers jumping on the trend for likes.
       | 
       | These are 99% user error issues drummed up by non-professionals
       | (and, in some cases, people paid by 3rd party vendors to protect
       | _those_ vendors ' reputation).
       | 
       | And the complaints about transient performances issues with
       | drivers, drummed up into apocalyptics scenarios, again, by
       | youtubers, who are putting this stuff under a microscope for
       | views, are universal across every single hardware and software
       | product. Everything.
       | 
       | Claiming "DLSS is snakeoil", and similar things are just an
       | expression of the complete lack of understanding of the people
       | involved in these pot-stirring contests. Like... the technique
       | obviously couldn't magically multiply the ability of hardware to
       | generate frames using the primary method. It is exactly as
       | advertised. It uses machine learning to approximate it. And it's
       | some fantastic technology, that is now ubiquitous across the
       | industry. Support and quality will increase over time, _just like
       | every _quality_ hardware product does_ during its early lifespan.
       | 
       | It's all so stupid and rooted in greed by those seeking ad-money,
       | and those lacking in basic sense or experience in what they're
       | talking about and doing. Embarrassing for the author to so
       | publicly admit to eating up social media whinging.
        
         | grg0 wrote:
         | If you've ever watched a GN or LTT video, they never claimed
         | that DLSS is snakeoil. They specifically call out the pros of
         | the technology, but also point out that Nvidia lies, very
         | literally, about its performance claims in marketing material.
         | Both statements are true and not mutually exclusive. I think
         | people like in this post get worked up about the false
         | marketing and develop (understandably) a negative view of the
         | technology as a whole.
         | 
         | > Idiots doing hardware installation, with zero experience,
         | using 3rd party cables incorrectly
         | 
         | This is not true. Even GN reproduced the melting of the _first-
         | party_ cable.
         | 
         | Also, why shouldn't you be able to use third-party cables? Fuck
         | DRM too.
        
           | spoaceman7777 wrote:
           | I'm referring to the section header in this article.
           | Youtubers are not a truly hegemonic group, but there's a set
           | of ideas and narratives that pervade the group as a whole
           | that different subsets buy into, and push, and that's one
           | that exists in the overall sphere of people who discuss the
           | use of hardware for gaming.
        
             | grg0 wrote:
             | Well, I can't speak for all youtubers, but I do watch most
             | GN and LTT videos and the complaints are legitimate, nor
             | are they random jabronis yolo'ing hardware installations.
        
               | spoaceman7777 wrote:
               | As far as I know, neither of them have had a card
               | unintentionally light on fire.
               | 
               | The whole thing started with Derbauer going to bat for a
               | cable from some 3rd party vendor that he'd admitted he'd
               | already plugged in and out of various cards something
               | like 50 times.
               | 
               | The actual instances that youtubers report on are all
               | reddit posters and other random social media users who
               | would clearly be better off getting a professional
               | installation. The huge popularity for enthusiast consumer
               | hardware, due to the social media hype cycle, has brought
               | a huge number of naive enthusiasts into the arena. And
               | they're getting burned by doing hardware projects on
               | their own. It's entirely unsurprising, given what happens
               | in all other realms of amateur hardware projects.
               | 
               | Most of those who are whinging about their issues are
               | false positive user errors. The actual failure rates (and
               | there are device failures) are far lower, and that's what
               | warrantys are for.
        
               | grg0 wrote:
               | I'm sure the failure rates are blown out of proportion, I
               | agree with that.
               | 
               | But the fact of the matter is that Nvidia has shifted
               | from a consumer business to b2b, and they don't even give
               | a shit about pretending they care anymore. People take
               | beef with that, understandably, and when you couple that
               | with the false marketing, the lack of inventory, the
               | occasional hardware failure, missing ROPs, insane prices
               | that nobody can afford and all the other shit that's
               | wrong with these GPUs, then this is the end result.
        
         | Rapzid wrote:
         | GN were the OG "fake framers" going back to their constant
         | casting shade on DLSS, ignoring it on their reviews, and also
         | crapping on RT.
         | 
         | AI upscaling, AI denoising, and RT were clearly the future even
         | 6 years ago. CDPR and the rest of the industry knew it, but
         | outlets like GN pushed a narrative(borderline conspiracy) the
         | developers were somehow out of touch and didn't know what they
         | were talking about?
         | 
         | There is a contingent of gamers who play competitive FPS. Most
         | of which are, like in all casual competitive hobbies, not very
         | good. But they ate up the 240hz rasterization be-all meat GN
         | was feeding them. Then they think they are the majority and
         | speak for all gamers(as every loud minority on the internet
         | does).
         | 
         | Fast forward 6 years and NVidia is crushing the Steam top 10
         | GPU list, AI rendering techniques are becoming ubiquitous, and
         | RT is slowly edging out rasterization.
         | 
         | Now that the data is clear the narrative is most consumers are
         | "suckers" for purchasing NVidia, Nintendo, and etc. And the
         | content creator economy will be there to tell them they are
         | right.
         | 
         | Edit: I believe too some of these outlets had chips on their
         | shoulder regarding NVidia going way back. So AMDs poor RT
         | performance and lack of any competitive answer the the DLSS
         | suite for YEARS had them lying to themselves about where the
         | industry was headed. Essentially they were running interference
         | for AMD. Now that FSR4 is finally here it's like AI upscaling
         | is finally ok.
        
       | andrewstuart wrote:
       | All symptoms of being number one.
       | 
       | Customers don't matter, the company matters.
       | 
       | Competition sorts out such attitude quick smart but AMD never
       | misses a chance to copy Nvidias strategy in any way and intel is
       | well behind.
       | 
       | So for now, you'll eat what Jensen feeds you.
        
       | voxleone wrote:
       | It's reasonable to argue that NVIDIA has a de facto monopoly in
       | the field of GPU-accelerated compute, especially due to CUDA
       | (Compute Unified Device Architecture). While not a legal monopoly
       | in the strict antitrust sense (yet), in practice, NVIDIA's
       | control over the GPU compute ecosystem -- particularly in AI,
       | HPC, and increasingly in professional content creation -- is
       | extraordinarily dominant.
        
         | arcanus wrote:
         | > NVIDIA's control over the GPU compute ecosystem --
         | particularly in AI, HPC
         | 
         | The two largest supercomputers in the world are powered by AMD.
         | I don't think it's accurate to say Nvidia has monopoly on HPC
         | 
         | Source: https://top500.org/lists/top500/2025/06/
        
           | infocollector wrote:
           | It's misleading to cite two government-funded supercomputers
           | as evidence that NVIDIA lacks monopoly power in HPC and AI:
           | 
           | - Government-funded outliers don't disprove monopoly
           | behavior. The two AMD-powered systems on the TOP500 list--
           | both U.S. government funded--are exceptions driven by
           | procurement constraints, not market dynamics. NVIDIA's
           | pricing is often prohibitive, and its dominance gives it the
           | power to walk away from bids that don't meet its margins.
           | That's not competition--it's monopoly leverage.
           | 
           | - Market power isn't disproven by isolated wins. Monopoly
           | status isn't defined by having every win, but by the lack of
           | viable alternatives in most of the market. In commercial AI,
           | research, and enterprise HPC workloads, NVIDIA owns an
           | overwhelming share--often >90%. That kind of dominance is
           | monopoly-level control.
           | 
           | - AMD's affordability is a symptom, not a sign of strength.
           | AMD's lower pricing reflects its underdog status in a market
           | it struggles to compete in--largely because NVIDIA has
           | cornered not just the hardware but the entire CUDA software
           | stack, developer ecosystem, and AI model compatibility. You
           | don't need 100% market share to be a monopoly--you need
           | control. NVIDIA has it.
           | 
           | In short: pointing to a couple of symbolic exceptions doesn't
           | change the fact that NVIDIA's grip on the GPU compute stack--
           | from software to hardware to developer mindshare--is
           | monopolistic in practice.
        
         | yxhuvud wrote:
         | Strict antitrust sense don't look at actual monopoly to
         | trigger, but just if you use your standing in the market to
         | gain unjust advantages. Which does not require a monopoly
         | situation but just a strong standing used wrong (like abusing
         | vertical integration). So Standard Oil, to take a famous
         | example, never had more than a 30% market share.
         | 
         | Breaking a monopoly can be a solution to that, however. But
         | having a large part of a market by itself doesn't trigger anti
         | trust legislation.
        
         | hank808 wrote:
         | Thanks ChatGPT!
        
       | rkagerer wrote:
       | I am a volunteer firefighter and hold a degree in electrical
       | engineering. The shenanigans with their shunt resistors, and
       | ensuing melting cables, is in my view criminal. Any engineer
       | worth their salt would recognize pushing 600W through a bunch of
       | small cables with no contingency if some of them have failed is
       | just asking for trouble. These assholes are going to set
       | someone's house on fire.
       | 
       | I hope they get hit with a class action lawsuit and are forced to
       | recall and properly fix these products before anyone dies as a
       | result of their shoddy engineering.
        
         | rkagerer wrote:
         | Apparently somebody did sue a couple years back. Anyone know
         | what happened with the Lucas Genova vs. nVidia lawsuit?
         | 
         | EDIT: Plantiff dismissed it. Guessing they settled. Here are
         | the court documents (alternately, shakna's links below include
         | unredacted copies):
         | 
         | https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
         | 
         | https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
         | 
         | A GamersNexus article investigating the matter:
         | https://gamersnexus.net/gpus/12vhpwr-dumpster-fire-investiga...
         | 
         | And a video referenced in the original post, describing how the
         | design changed from one that proactively managed current
         | balancing, to simply bundling all the connections together and
         | hoping for the best: https://youtu.be/kb5YzMoVQyw
        
           | shakna wrote:
           | > NOTICE of Voluntary Dismissal With Prejudice by Lucas
           | Genova (Deckant, Neal) (Filed on 3/10/2023) (Entered:
           | 03/10/2023)
           | 
           | Sounds like it was settled out of court.
           | 
           | [0] https://www.docketalarm.com/cases/California_Northern_Dis
           | tri...
        
           | middle-aged-man wrote:
           | Do those mention failing to follow Underwriters Laboratory
           | requirements?
           | 
           | I'm curious whether the 5090 package was not following UL
           | requirements.
           | 
           | Would that make them even more liable?
           | 
           | Part of me believes that the blame here is probably on the
           | manufacturers and that this isn't a problem with Nvidia
           | corporate.
        
           | autobodie wrote:
           | GamersNexus ftw as always
        
         | ryao wrote:
         | Has anyone made 12VHPWR cables that replace the 12 little wires
         | with 2 large gauge wires yet? That would prevent the wires from
         | becoming unbalanced, which should preempt the melting connector
         | problem.
         | 
         | As a bonus, if the gauge is large enough, the cable would
         | actually cool the connectors, although that should not be
         | necessary since the failure appears to be caused by overloaded
         | wires dumping heat into the connector as they overheat.
        
           | bobmcnamara wrote:
           | Or 12 strands in a single sheath so it's not overly rigid.
        
           | alright2565 wrote:
           | Might help a little bit, by heatsinking the contacts better,
           | but the problem is the contact resistance, not the wire
           | resistance. The connector itself dangerously heats up.
           | 
           | Or at least I think so? Was that a different 12VHPWR scandal?
        
             | bobmcnamara wrote:
             | Contact resistance is a problem.
             | 
             | Another problem is when the connector is angled, several of
             | the pins may not make contact, shoving all the power
             | through as few as one wire. A common bus would help this
             | but the contact resistance in this case is still bad.
        
               | ryao wrote:
               | A common bus that is not also overheating would cool the
               | overheating contact(s).
        
               | alright2565 wrote:
               | It would help, but my intuition is that the thin steel of
               | the contact would not move the heat fast enough to make a
               | significant difference. Only way to really know is to
               | test it.
        
             | ryao wrote:
             | I thought that the contact resistance caused the unbalanced
             | wires, which then overheat alongside the connector, giving
             | the connector's heat nowhere to go.
        
             | chris11 wrote:
             | I think it's both contact and wire resistance.
             | 
             | It is technically possible to solder a new connector on.
             | LTT did that in a video.
             | https://www.youtube.com/watch?v=WzwrLLg1RR4
        
               | ryao wrote:
               | Uneven abnormal contact resistance is what causes the
               | wires to become unbalanced, and then the remaining ones
               | whose contacts have low resistance have huge currents
               | pushed through them, causing them to overheat due to wire
               | resistance. I am not sure if it is possible to have
               | perfect contact resistance in all systems.
        
           | AzN1337c0d3r wrote:
           | They don't just specify 12 smaller cables for nothing if 2
           | larger ones will do. There are concerns here with mechanical
           | compatibility (12 wires have smaller allowable bend radius
           | than 2 larger ones with the same ampacity).
        
             | kuschku wrote:
             | One option is to use two very wide, thin insulated copper
             | sheets as cable. Still has a good bend radius in one
             | dimension, but is able to sink a _lot_ of power.
        
         | lukeschlather wrote:
         | Also, like, I kind of want to play with these things, but also
         | I'm not sure I want a computer that uses 500W+ in my house, let
         | alone just a GPU.
         | 
         | I might actually be happy to buy one of these things, at the
         | inflated price, and run it at half voltage or something... but
         | I can't tell if that is going to fix these concerns or they're
         | just bad cards.
        
           | wasabinator wrote:
           | It's not the voltage, it's the current you'd want to halve.
           | The wire gauge required to carry power is dependent on the
           | current load. It's why when i first saw these new connectors
           | and the loads they were being tasked with it was a wtf moment
           | for me. Better to just avoid them in the first place though.
        
             | dietr1ch wrote:
             | It's crazy, you don't even need to know about electricity
             | after you see a thermal camera on them operating at full
             | load. I'm surprised they can be sold to the general public,
             | the reports of cables melting plus the high temps should be
             | enough to force a recall.
        
           | izacus wrote:
           | With 5080 using 300W, talking about 500W is a bit of an
           | exaggeration, isn't it?
        
             | lukeschlather wrote:
             | I'm talking about the 5090 which is 575W.
        
               | izacus wrote:
               | But why are you talking about it? It's a hugely niche
               | hardware which is a tiny % of nVidia cards out there.
               | It's deliberately outsized and you wouldn't put it in 99%
               | of gaming PCs.
               | 
               | And yet you speak of it like it's a representative model.
               | Do you also use a Hummer EV to measure all EVs?
        
               | lukeschlather wrote:
               | I am interested in buying hardware that can run the full
               | DeepSeek R1 locally. I don't think it's a particularly
               | good idea, but I've contemplated an array of 5090s.
               | 
               | If I were interested in using an EV to haul particularly
               | heavy loads, I might be interested in the Hummer EV and
               | have similar questions that might sound ridiculous.
        
         | dreamcompiler wrote:
         | To emphasize this point, go outside at noon in the summer and
         | mark off a square meter on the sidewalk. That square of
         | concrete is receiving about 1000w from the sun.
         | 
         | Now imagine a magnifying glass that big (or more practically a
         | fresnel lens) concentrating all that light into one square
         | inch. That's a lot of power. When copper connections don't work
         | perfectly they have nonzero resistance, and the current running
         | through them turns into heat by I^2R.
        
       | johnklos wrote:
       | I'm so happy to see someone calling NVIDIA out for their
       | bullshit. The current state of GPU programming sucks, and that's
       | just an example of the problems with the GPU market today.
       | 
       | The lack of open source anything for GPU programming makes me
       | want to throw my hands up and just do Apple. It feels much more
       | open than pretending that there's anything open about CUDA on
       | Linux.
        
       | Dylan16807 wrote:
       | > The competing open standard is FreeSync, spearheaded by AMD.
       | Since 2019, NVIDIA also supports FreeSync, but under their
       | "G-Sync Compatible" branding. Personally, I wouldn't bother with
       | G-Sync when a competing, open standard exists and differences are
       | negligible[4].
       | 
       | Open is good, but the open standard itself is not enough. You
       | need some kind of testing/certification, which is built in to the
       | G-Sync process. AMD does have a FreeSync certification program
       | now which is good.
       | 
       | If you rely on just the standard, some manufacturers get really
       | lazy. One of my screens technically supports FreeSync but I
       | turned it off day one because it has a narrow range and flickers
       | very badly.
        
       | jes5199 wrote:
       | with Intel also shitting the bed, it seems like AMD is poised to
       | pick up "traditional computing" while everybody else runs off to
       | chase the new gold rush. Presumably there's still _some_ money in
       | desktops and gaming rigs?
        
       | fracus wrote:
       | This was an efficient, well written, TKO.
        
         | anonymars wrote:
         | Agreed. An excellent summary of a lot of missteps that have
         | been building for a while. I had watched that article on the
         | power connector/ shunt resistors and was dumbfounded at the
         | seemingly rank-amateurish design. And although I don't have a
         | 5000 series GPU I have been astonished at how awful the drivers
         | have been for the better part of a year.
         | 
         | As someone who filed the AMD/ATi ecosystems due to their quirky
         | unreliability, Nvidia and Intel have really shit the bed these
         | days (I also had the misfortune of "upgrading" to a 13th gen
         | Intel processor just before we learned that they cook
         | themselves)
         | 
         | I do think DLSS supersampling is incredible but Lord almighty
         | is it annoying that the frame generation is under the same
         | umbrella because that is nowhere near the same, and the water
         | is awful muddy since "DLSS" is often used without distinction
        
       | ksec wrote:
       | > _How is it that one can supply customers with enough stock on
       | launch consistently for decades, and the other can't?_
       | 
       | I guess the author is too young and didn't go through iPhone 2G
       | to iPhone 6 era. Also worth remembering it wasn't too long ago
       | Nvidia was sitting on nearly ONE full year of GPU stock unsold.
       | That has completely changed the course of how Nvidia does supply
       | chain management and forecast. Which unfortunately have a
       | negative impact all the way to Series 50. I believe they have
       | since changed and next Gen should be better prepared. But you can
       | only do so much when AI demand is seemingly unlimited.
       | 
       | > _The PC, as gaming platform, has long been held in high regards
       | for its backwards compatibility. With the RTX 50 series, NVIDIA
       | broke that going forward. PhysX....._
       | 
       | Glide? What about all the Audio Drivers API before. As much as I
       | wish everything is backward compatible. That is just not how the
       | world works. Just like any old games you need some fiddling to
       | get it work. And they even make the code available so people
       | could actually do something rather then emulation or reverse
       | engineering.
       | 
       | > _That, to me, was a warning sign that maybe, just maybe, ray
       | tracing was introduced prematurely and half-baked._
       | 
       | Unfortunately that is not how it works. Do we want to go back to
       | Pre-3DFx to today to see how many what we thought was great idea
       | for 3D accelerator only to be replaced by better ideas or
       | implementation? These idea were good on paper but didn't work
       | well. We than learn from it and reiterate.
       | 
       | > _Now they're doing an even more computationally expensive
       | version of ray tracing: path tracing. So all the generational
       | improvements we could've had are nullified again......_
       | 
       | How about Path Tracing is simply a better technology? Game
       | developers also dont have to use any of these tech. The article
       | act as if Nvidia forces all game to use it. Gamers want better
       | graphics quality, Artist and Graphics asset is already by far the
       | most expensive item in gaming and it is still increasing. What
       | hardware improvement is allowing those to be achieved at lower
       | cost. ( To Game Developers )
       | 
       | > _Never mind that frame generation introduces input lag that
       | NVIDIA needs to counter-balance with their "Reflex" technology,_
       | 
       | No. That is not _why_ "Reflex" tech was invented. Nvidia spend
       | R&D on 1000 fps monitor as well and potentially sub 1ms frame
       | monitor. They have always been latency sensitive.
       | 
       | ------------------------------
       | 
       | I have no idea how modern Gamers become what they are today. And
       | this isn't the first time I have read it even on HN. You dont
       | have to buy Nvidia. You have AMD and now Intel ( again ).
       | Basically I can summarise one thing about it, Gamers want Nvidia
       | 's best GPU for the lowest price possible. Or a price they think
       | is acceptable without understanding the market dynamics and
       | anything supply chain or manufacturing. They also want higher
       | "generational" performance. Like 2x every 2 year. And if they
       | dont get it, it is Nvidia's fault. Not TSMC, not Cadence, not
       | Tokyo Electron, not Issac Newton or Law of Physic. But Nvidia.
       | 
       | Nvidia's PR tactic isn't exactly new in the industry. Every
       | single brand do something similar. Do I like it? No. But
       | unfortunately that is how the game is played. And Apple is by far
       | the worst offender.
       | 
       | I do sympathise with the Cable issue though. And not the first
       | time Nvidia has with thermal issues. But then again they are also
       | the one who are constantly pushing the boundary forward. And
       | AFAIK the issues isn't as bad as the series 40 but some YouTube
       | seems to be making a bigger issue than most. Supply issues will
       | be better but TSMC 3nm is fully booked . The only possible
       | solution would be to have consumer GPU less capable of AI
       | workload. Or to have AI GPU working with leading edge node and
       | consumer always be a node lower to split the capacity problem. I
       | would imagine that is part of the reason why TSMC is accelerating
       | 3nm capacity increase on US soil. Nvidia is now also large enough
       | and has enough cash to take on more risk.
        
       | DeepYogurt wrote:
       | > And I hate that they're getting away with it, time and time
       | again, for over seven years.
       | 
       | Nvidia's been at this way longer than 7 years. They were cheating
       | at benchmarks to control a narrative back in 2003.
       | https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...
        
       | 827a wrote:
       | Here's something I don't understand: Why is it that when I go
       | look at DigitalOcean's GPU Droplet options, they don't offer any
       | Blackwell chips? [1] I thought Blackwell was supposed to be the
       | game changing hyperchip that carried AI into the next generation,
       | but the best many providers still offer are Hopper H100s? Where
       | are all the Blackwell chips? Its been oodles of months.
       | 
       | Apparently AWS has them available in the P6 instance type, but
       | the only configuration they offer has 2TB of memory and costs...
       | $113/hr [2]? Like, what is going on at Nvidia?
       | 
       | Where the heck is Project Digits? Like, I'm developing this
       | shadow opinion that Nvidia actually hasn't built anything new in
       | three years, but they fill the void by talking about hypothetical
       | newtech that no one can actually buy + things their customers
       | have built with the actually good stuff they built three years
       | ago. Like, consumers can never buy Blackwell because "oh
       | Enterprises have bought them all up" then when Microsoft tries to
       | buy any they say "Amazon bought them all up" and vice-versa.
       | Something really fishy is going on over there. Time to short.
       | 
       | [1] https://www.digitalocean.com/products/gpu-droplets
       | 
       | [2] https://aws.amazon.com/ec2/pricing/on-demand/
        
         | hank808 wrote:
         | Digits: July 22 it seems is the release date for the version
         | with the Asus badge. https://videocardz.com/newz/asus-ascent-
         | gx10-with-gb10-black...
        
       | mcdeltat wrote:
       | Anyone else getting a bit disillusioned with the whole tech
       | hardware improvements thing? Seems like every year we get less
       | improvement for higher cost and the use cases become less useful.
       | Like the whole industry is becoming a rent seeking exercise with
       | diminishing returns. I used to follow hardware improvements and
       | now largely don't because I realised I (and probably most of us)
       | don't need it.
       | 
       | It's staggering that we are throwing so many resources at
       | marginal improvements for things like gaming, and I say that as
       | someone whose main hobby used to be gaming. Ray tracing, path
       | tracing, DLSS, etc at a price point of $3000 just for the GPU -
       | who cares when a 2010 cell shaded game running on an upmarket
       | toaster gave me the utmost joy? And the AI use cases don't
       | impress me either - seems like all we do each generation is burn
       | more power to shove more data through and pray for an improvement
       | (collecting sweet $$$ in the meantime).
       | 
       | Another commenter here said it well, there's just so much more
       | you can do with your life than follow along with this drama.
        
         | bamboozled wrote:
         | I remember when it was a serious difference, like PS1-PS3 was
         | absolutely miraculous and exciting to watch.
         | 
         | It's also fun that no matter how fast the hardware seems to
         | get, we seem to fill it up with shitty bloated software.
        
           | mcdeltat wrote:
           | IMO at some point in the history of software we lost track of
           | hardware capabilities versus software end outcomes. Hardware
           | improved many orders of magnitude but overall software
           | quality/usefulness/efficiency did not (yes this is a hill I
           | will die on). We've ended up with mostly garbage and an
           | occasional legitimately brilliant use of transistors.
        
         | philistine wrote:
         | Your disillusionment is warranted, but I'll say that on the Mac
         | side the grass has never been greener. The M chips are
         | screamers year after year, the GPUs are getting ok, the ML
         | cores are incredible and actually useful.
        
           | mcdeltat wrote:
           | Good point, we should commend genuinely novel efforts towards
           | making baseline computation more efficient, like Apple has
           | done as you say. Particularly in light of recent x86
           | development which seems to be "shove as many cores as
           | possible on a die and heat your apartment while your power
           | supply combusts" (meanwhile the software gets less efficient
           | by the day, but that's another thing altogether...). ANY DAY
           | of the week I will take a compute platform that's no-bs no-
           | bells-and-whistles simply more efficient without the
           | manufacturer trying to blow smoke up our asses.
        
         | seydor wrote:
         | Our stock investments are going up so ...... What can we do
         | other than shrug
        
         | keyringlight wrote:
         | What stands out to me is that it's not just the hardware side,
         | software production to make use of it to realize the benefits
         | offered doesn't seem to be running smoothly either, at least
         | for gaming. I'm not sure nvidia really cares too much though as
         | there's no market pressure on them where it's a weakness for
         | them, if consumer GPUs disappeared tomorrow they'd be fine.
         | 
         | A few months ago Jensen Huang said he sees quantum computing as
         | the next big thing he wants nvidia to be a part of over the
         | next 10-15 years (which seems like a similar timeline as GPU
         | compute), so I don't think consumer GPUs are a priority for
         | anyone. Gaming used to be the main objective with byproducts
         | for professional usage, for the past few years that's reversed
         | where gaming piggybacks on common aspects to compute.
        
       | WhereIsTheTruth wrote:
       | Call it delusions or conspiracy theories, what ever, I don't
       | care, but it seems to me that NVIDIA wants to vendor lock the
       | whole industry
       | 
       | If all game developers begin to rely on NVIDIA technology, the
       | industry as a whole puts customers in a position where they are
       | forced to give in
       | 
       | The public's perception of RTX's softwarization (DLSS) and them
       | coining the technical terms says it all
       | 
       | They have a long term plan, and that plan is:
       | 
       | - make all the money possible
       | 
       | - destroy all competition
       | 
       | - vendor lock the whole world
       | 
       | When I see that, I can't help myself but to think something is
       | fishy:
       | 
       | https://i.imgur.com/WBwg6qQ.png
        
       | yalok wrote:
       | a friend of mine is a SW developer in Nvidia, working on their
       | drivers. He was complaining lately that he is required to fix a
       | few bugs in the drivers code for the new card (RTX?), while not
       | provided with the actual hardware. His pleas to send him this HW
       | were ignored, but the demand to fix by a deadline kept being
       | pushed.
       | 
       | He actually ended up buying older but somewhat similar used
       | hardware with his personal money, to be able to do his work.
       | 
       | Not even sure if he was eventually able to expense it, but
       | wouldn't be surprised if not, knowing how big companies
       | bureaucracy works...
        
       | PoshBreeze wrote:
       | > The RTX 4090 was massive, a real heccin chonker. It was so huge
       | in fact, that it kicked off the trend of needing support brackets
       | to keep the GPU from sagging and straining the PCIe slot.
       | 
       | This isn't true. People were buying brackets with 10 series
       | cards.
        
       | tonyhart7 wrote:
       | Consumer GPU feels like an "paper launch" for the past years
       | 
       | that's like they purposely not selling because they allocated 80%
       | of their production to enterprise only
       | 
       | I just hope that new fabs operate early as possible because these
       | price is insane
        
       | amatecha wrote:
       | Uhh, these 12VHPWR connectors seem like a serious fire risk. How
       | are they not being recalled? I just got a 5060ti , now I'm
       | wishing I went AMD instead.. what the hell :(
       | 
       | Whoa, the stuff covered in the rest of the post is just as
       | egregious. Wow! Maybe time to figure out which AMD models
       | compares performance-wise and sell this thing, jeez.
        
       | musebox35 wrote:
       | With the rise of LLM training, Nvidia's main revenue stream
       | switched to datacenter gpus (>10x gaming revenue). I wonder
       | whether this have affected the quality of these consumer cards,
       | including both their design and product processes:
       | 
       | https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...
        
       | reichstein wrote:
       | Aks. "Every beef anyone has ever had with Nvidia in one outrage
       | friendly article."
       | 
       | If you want to hate on Nvidia, there'll be something for you in
       | there.
       | 
       | An entire section on 12vhpwr connectors, with no mention of
       | 12V-2x6.
       | 
       | A lot of "OMG Monopoly" and "why won't people buy AMD" without
       | considering that maybe ... AMD cards are not considered by the
       | general public to be as good _where it counts_. (Like benefit per
       | Watt, aka heat.) Maybe it's all perception, but then AMD should
       | work on that perception. If you want the cooler CPU/GPU,
       | perception is that that's Intel/Nvidia. That's reason enough for
       | me, and many others.
       | 
       | Availability isn't great, I'll admit that, if you don't want to
       | settle for a 5060.
        
       | Sweepi wrote:
       | Nvidia is full of shit, but this article is full of shit, too. A
       | lot of human slop, some examples:
       | 
       | - 12VHPWR is not at fault / the issue. As the article itself
       | points out, the missing power balancing circuit is to blame. The
       | 3090 Ti had bot 12VHPWR and the balancing power circuit and ran
       | flawless.
       | 
       | - Nvidia G-Sync: Total non-issue. G-Sync native is dead. Since
       | 2023, ~1000 Freesync Monitors have been released, and 3(!!)
       | G-Sync native Monitors.
       | 
       | - The RTX 4000 series is not still expensive, it is again
       | expensive. It was much cheaper a year before RTX 5000 release
       | 
       | - Anti-Sag Brackets were a thing way before RTX 4000
        
       | Kon5ole wrote:
       | TSMC can only make about as many Nvidia chips as OpenAI and the
       | other AI guys wants to buy. Nvidia releases gpus made from
       | basically the shaving leftovers from the OpenAI products, which
       | makes them limited in supply and expensive.
       | 
       | So gamers have to pay much more and wait much longer than before,
       | which they resent.
       | 
       | Some youtubers make content that profit from the resentment so
       | they play fast and loose with the fundamental reasons in order to
       | make gamers even more resentful. Nvidia has "crazy prices" they
       | say.
       | 
       | But they're clearly not crazy. 2000 dollar gpus appear in
       | quantities of 50+ from time to time at stores here but they sell
       | out in minutes. Lowering the prices would be crazy.
        
         | xiphias2 wrote:
         | This is one reason, and another is that both Dennard scaling
         | has stopped and GPUs hit a memory wall for DRAM. The only
         | reason AI hardware gets the significant improvements is that
         | they are using big matmuls and a lot of research has been in
         | getting lower precision (now 4bit) training working (numerical
         | precision stability was always a huge problem with backprop).
        
         | Ologn wrote:
         | Yes. In 2021, Nvidia was actually making more revenue from its
         | home/consumer/gaming chips than from its data center chips. Now
         | 90% of its revenue is from its data center hardware, and less
         | than 10% of its revenue is from home gpus. The home gpus are an
         | afterthought to them. They take up resources that can be
         | devoted to data center.
         | 
         | Also, in some sense there can be some fear 5090s could
         | cannibalize the data center hardware in some aspects - my
         | desktop has a 3060 and I have trained locally, run LLMs locally
         | etc. It doesn't make business sense at this time for Nvidia to
         | meet consumer demand.
        
       | jdthedisciple wrote:
       | Read this in good faith but I don't see how it's supposed to be
       | Nvidia's fault?
       | 
       | How could Nvidia realistically stop scalper bots?
        
       | liendolucas wrote:
       | I haven't read the whole article but a few things to remark:
       | 
       | * The prices for Nvidia GPUs are insane. For that money you can
       | have an extremely good PC with a good non Nvidia GPU.
       | 
       | * The physical GPU sizes are massive, even letting the card rest
       | on a horizontal motherboard looks like scary.
       | 
       | * Nvidia has still issues with melting cables? I've heard about
       | those some years ago and thought it was a solved problem.
       | 
       | * Proprietary frameworks like CUDA and others are going to fall
       | at some point, is just a matter of time.
       | 
       | Looks as if Nvidia at the moment is only looking at the AI market
       | (which as a personal belief has to burst at some point) and
       | simply does not care the non GPU AI market at all.
       | 
       | I remember many many years ago when I was a teenager and 3dfx was
       | the dominant graphics card manufacturer that John Carmack
       | profethically in a gaming computer magazine (the article was
       | about Quake I) predicted that the future wasn't going to be 3dfx
       | and Glide. Some years passed by and effectively 3dfx was gone.
       | 
       | Perhaps is just the beginning of the same story that happened
       | with 3dfx. I think AMD and Intel have a huge opportunity to
       | balance the market and bring Nvidia down, both in the AI and
       | gaming space.
       | 
       | I have only heard excellent things about Intel's ARC GPUs in
       | other HNs threads and if I need to build a new desktop PC from
       | scratch there's no way to pay for the prices that Nvidia is
       | pushing to the market, I'll definitely look at Intel or AMD.
        
       | Havoc wrote:
       | They're not full of shit - they're just doing what a for profit
       | co in a dominant position does.
       | 
       | In other news I hope intel pulls their thumb out of their ass
       | cause AMD is crushing it and that's gonna end the same way
        
       | snarfy wrote:
       | I'm a gamer and love my AMD gpu. I do not give a shit about ray
       | tracing, frame generation, or 4k gaming. I can play all modern
       | fps at 500fps+. I really wish the market wasn't so trendy and
       | people bought what worked for them.
        
         | alt227 wrote:
         | Yeah I was exactly the same as you for years, holding out
         | against what I considered to be unecessary exrtravagence. That
         | was until I got a 4k monitor at work and experienced 4k HDR
         | gaming. I immediately went out and bought an RTX 4070 and a 4k
         | monitor and I will never be going back. The experience is
         | glorious and I was a fool for not jumping sooner.
         | 
         | 4K HDR gaming is not the future, is has been the standard for
         | many years now for good reason.
        
       | Arainach wrote:
       | Why was the title of this post changed long after posting to
       | something that doesn't match the article title? This
       | editorializing goes directly against HN Guidelines (but was
       | presumably done by the HN team?)
        
         | cbarrick wrote:
         | +1. "Nvidia won, we all lost" sets a very different tone than
         | "NVIDIA is full of shit". It's clearly not the tone the author
         | intended to set.
         | 
         | Even more concerning is that, by editorializing the title of an
         | article that is (in part) about how Nvidia uses their market
         | dominance to pressure reviewers and control the narrative, we
         | must question whether or not the mod team is complicit in this
         | effort.
         | 
         | Is team green afraid that a title like "NVIDIA is full of shit"
         | on the front page of HN is bad for their image or stock price?
         | Was HN pressured to change the name?
         | 
         | Sometimes, editorialization is just a dumb and lazy mistake.
         | But editorializing something like this is a lot more
         | concerning. And it's made worse by the fact that the title was
         | changed by the mods.
        
           | tyre wrote:
           | Okay let's take off the tin foil hat for a second. HN has a
           | very strong moderation team with years and years of history
           | letting awkward (e.g. criticism of YC, YC companies) things
           | stand.
        
             | cbarrick wrote:
             | I said what I said above not as a genuinely held belief (I
             | doubt Nvidia had any involvement in this editorialization),
             | but as a rhetorical effect.
             | 
             | There are many reasons why the editorialized-title rule
             | exists. One of the most important reasons is so that we can
             | trust HN as an unbiased news aggregator. Given the content
             | of the article, this particular instance of
             | editorialization is pretty egregious and trust breaking.
             | 
             | And to be clear, those questions I asked are not outlandish
             | to ask, even if we do trust HN enough to dismiss them.
             | 
             | The title should not have been changed.
        
             | blibble wrote:
             | > HN has a very strong moderation team with years and years
             | of history letting awkward (e.g. criticism of YC, YC
             | companies) things stand.
             | 
             | the attempt to steer direction is well hidden, but it is
             | very much there
             | 
             | with https://hnrankings.info/ you can see the correction
             | applied, in real time
             | 
             | the hidden bits applied to dissenting accounts? far less
             | visible
        
               | throwawayqqq11 wrote:
               | Oh wow, i always had that gut feeling, but now i know.
               | Stop killing games went from consistent rank 2 to 102 in
               | an instant. And it all happend outside my timezone so i
               | didnt even know it existed here.
        
               | Ygg2 wrote:
               | Jesus Christ. That is a massive correction. I fear most
               | of those EU petition numbers are probably bots, designed
               | to sabotage it.
        
               | p_j_w wrote:
               | HN's moderation system (posts with lots of flagged
               | comments get derated) seems to really easy to game. Don't
               | like a story? Have bots post a bunch of inflammatory
               | comments likely to get flagged and it will go away.
               | There's no way the people who run the site don't know
               | this, so I don't know how to possibly make the case that
               | they are actually okay with it.
        
               | const_cast wrote:
               | I believe usually when this happens the admins like dang
               | and tomhow manually unflag the post if they think it's
               | relevant. Which... is not a perfect system, but it works.
               | I've seen plenty of posts be flagged, dead, then get
               | unflagged and revived. They'll go in and manually flag
               | comments, too, to get the conversation back on track. So,
               | I think site admins are aware that this is happening.
               | 
               | Also, it's somewhat easy to tell who is a bot. Really new
               | accounts are colored green. I'm sure there's also long-
               | running bots, and I'm not sure how you would find those.
        
             | cipher_accompt wrote:
             | I'm curious whether you're playing devil's advocate or if
             | you genuinely believe that characterizing OP's comment as
             | "tin foil hat" thinking is fair.
             | 
             | The concentration of wealth and influence gives entities
             | like Nvidia the structural power to pressure smaller
             | players in the economic system. That's not speculative --
             | it's common sense, and it's supported by antitrust cases.
             | Firms like Nvidia are incentivized to abuse their market
             | power to protect their reputation and, ultimately, their
             | dominance. Moreover, such entities can minimize legal and
             | economic consequences in the rare instances that there are
             | any.
             | 
             | So what exactly is the risk created by the moderation team
             | allowing criticism of YC or YC companies? There aren't many
             | alternatives -- please fill me in if I'm missing something.
             | In contrast, allowing sustained or high-profile criticism
             | of giants like Nvidia could, even if unlikely, carry
             | unpredictable risks.
             | 
             | So were you playing devil's advocate, or do you genuinely
             | think OP's concern is more conspiratorial than it is a
             | plausible worry about the chilling effect created by
             | concentration of immense wealth?
        
               | sillyfluke wrote:
               | >the concentration of wealth
               | 
               | On this topic, I'm curious what others think of the
               | renaming of this post:
               | 
               | https://news.ycombinator.com/item?id=44435732
               | 
               | The original title I gave was: "Paul Graham: without
               | billionaires, there will be no startups."
               | 
               | As it was a tweet, I was trying to summarize his
               | conclusive point in the first part of the sentence:
               | 
               |  _Few of them realize it, but people who say "I don't
               | think that we should have billionaires" are also saying
               | "I don't think there should be startups,"_
               | 
               | Now, this part of the sentence to me was the far more
               | interesting part because it was a much bolder claim than
               | the second part of the sentence:
               | 
               |  _because successful startups inevitably produce
               | billionaires._
               | 
               | This second part seems like a pretty obvious observation
               | and is a completely uninteresting observation by itself.
               | 
               | The claim that successful startups have produced
               | billonaires _therefore successful startups require
               | billionaires_ is a far more contentious and interesting
               | claim.
               | 
               | The mods removed "paul graham" from the title and
               | switched the title to the uninteresting second part of
               | the sentence, turning it into a completely banal and
               | pointless title: Successful startups produce
               | billionaires. Thereby removing any hint of the bold claim
               | being made by the founder of one of the most succesful
               | VCs of the 21st century. And incidentally, also the
               | creator of this website.
               | 
               | I can only conclude someone is loathe to moderate a
               | thread about whether billionaires are neccessary for
               | sucessful startups to exist.
               | 
               | ps. There is no explicit guideline for tweets as far as I
               | can tell. You are forced to use an incomplete quote or
               | are forced to summarize the tweet im some fashion.
        
             | hshdhdhj4444 wrote:
             | I thought HN was a dingle moderator, dang, and now I think
             | there may be 2 people?
        
               | card_zero wrote:
               | That's correct, dang has offloaded some of the work to
               | tomhow, another dingle.
        
               | kevindamm wrote:
               | and together they are trouble?
        
             | ldjkfkdsjnv wrote:
             | theres alot of shadow banning, up ranking and down ranking
        
           | rubatuga wrote:
           | Probably malicious astroturfing is going on from Nvidia and
           | the mods. @dang who was the moderator who edited the title?
        
         | throwaway290 wrote:
         | I think it's pretty obvious. People were investing like crazy
         | into Nvidia on the "AI" gamble. Now everybody needs to keep
         | hyping up Nvidia and AI no matter reality. (Until it starts to
         | become obvious and then the selloff starts)
        
           | j_timberlake wrote:
           | Literally every single anti-AI comment I see on this site
           | uses a form of the word "hype". You cannot make an actual
           | objective argument against the AI-wave predictions, so you
           | use the word hype and pretend that's a real argument and not
           | just ranting.
        
             | elzbardico wrote:
             | I work with AI, I consider generative AI an incredible tool
             | in our arsenal of computing things.
             | 
             | But, in my opinion, the public expectations in my opinion
             | are clearly exaggerated and sometimes even dangerous as we
             | ran the risk of throwing the baby with the bathwater when
             | some ideas/marketing/vc people ideas become not realizable
             | in the concrete world.
             | 
             | Why, having this outlook, I should be banned of using the
             | very useful word/concept of "hype"?
        
               | j_timberlake wrote:
               | Your post doesn't contain a single prediction of a
               | problem that will occur, dangerous or otherwise, just
               | some vague reference to "the baby might get thrown out
               | with the bathwater". This is exactly what I'm talking
               | about, you just talk around the issue without naming
               | anything specific, because you don't have anything. If
               | you did, you'd state it.
               | 
               | Meanwhile the AI companies continue to produce new SotA
               | models yearly, sometimes quarterly, meaning the evidence
               | that you're just completely wrong never stops increasing.
        
         | dandanua wrote:
         | Haven't you figured out the new global agenda yet? Guidelines
         | (and rules) exist only to serve the masters.
        
           | Zambyte wrote:
           | New as of which millennium?
        
         | rectang wrote:
         | When titles are changed, the intent as I understand it is to
         | nudge discussion towards thoughtful exchange. Discussion is
         | forever threatening to spin out of control towards flame wars
         | and the moderators work hard to prevent that.
         | 
         | I think that if you want to understand why it might be helpful
         | to change the title, consider how well "NVIDIA is full of shit"
         | follows the HN _comment_ guidelines.
         | 
         | I don't imagine you will agree with the title change no matter
         | what, but I believe that's essentially the rationale. Note that
         | the topic wasn't flagged, which if suppression of the author's
         | ideas or protection of Nvidia were goals would have been more
         | effective.
         | 
         | (FWIW I have plenty of issues with HN but how titles are
         | handled isn't one of them.)
        
           | mindslight wrote:
           | I agree with your explanation, but I think it's a hollow
           | rationale. "Full of shit" is a bit aggressive and divisive,
           | but the thesis is in the open and there is plenty of room to
           | expand on it in the actual post. Whereas "Nvidia won" is
           | actually just as divisive and in a way has _more_ implied
           | aggression (of a fait accompli), it 's just cloaked in using
           | less vulgar language.
        
             | rectang wrote:
             | The new title, "Nvidia won, we all lost", is taken from a
             | subheading in the actual article, which is something I've
             | often seen dang recommend people do when faced with baity
             | or otherwise problematic titles.
             | 
             | https://blog.sebin-nyshkim.net/posts/nvidia-is-full-of-
             | shit/...
        
           | iwontberude wrote:
           | I don't see how changing the title has encouraged thoughtful
           | exchange when the top comments are talking about the change
           | to the title. Seems better to let moderators do their job
           | when there is an actual problem with thoughtful exchange
           | instead of creating one.
        
         | shutupnerd0000 wrote:
         | Barbara Streisand requested it.
        
       | dagaci wrote:
       | Jenson has managed to kneel into every market boom in a
       | reasonable amount of time with his GPUs and tech (hardware and
       | software). No doubt he will be there when the next boom kicks off
       | too.
       | 
       | Microsoft fails consistently ... even when offered a lead on the
       | plate... it fails, but these failures are eventually corrected
       | for by the momentum of its massive business units.
       | 
       | Apple is just very very late... but this failure can be
       | eventually corrected for by its unbeatable astroturfing units.
       | 
       | Perhaps AMD are too small keep up everywhere it should. But
       | compared to the rest, AMD is a fast follower. Why Intel is where
       | it is is a mystery to me but i'm quite happy about its demise and
       | failures :D
       | 
       | Being angry about NVIDIA is not giving enough credit to NVIDIA
       | for being on-time and even leading the charge in the first place.
       | 
       | Everyone should remember that NVIDIA also leads into the markets
       | that it dominates.
        
         | Mistletoe wrote:
         | What is the next boom? I honestly can't think of one. Feels
         | like we are just at the Age of the Plateau, which will be quite
         | painful for markets and the world.
        
           | alanbernstein wrote:
           | Humanoid robotics
        
             | chriskanan wrote:
             | This will be huge in the next decade and powered by AI.
             | There are so many competitors, currently, that it is hard
             | to know who the winners will be. Nvidia is already angling
             | for humanoid robotics with its investments.
        
             | mtmail wrote:
             | and skynet
        
               | alanbernstein wrote:
               | Not THAT kind of boom
        
             | mdaniel wrote:
             | relevant: _Launch HN: K-Scale Labs (YC W24) - Open-Source
             | Humanoid Robots_ -
             | https://news.ycombinator.com/item?id=44456904 - July, 2025
             | (97 comments)
        
           | debesyla wrote:
           | As all the previous booms - hard to predict before it
           | happens. And if we do predict, high chances are that we will
           | miss.
           | 
           | My personal guess is something in the medical field, because
           | surely all the AI search tools could help to detect common
           | items in all the medical data. Maybe more of ozempyc, maybe
           | for some other health issue. (Of course, who knows. Maybe it
           | turns out that the next boom is going to be in figuring out
           | ways to make things go boom. I hope not.)
        
           | xeromal wrote:
           | It's just because we can't know what the next boom is until
           | it hits us in the face except for a tiny population of humans
           | that effect those changes
        
           | thebruce87m wrote:
           | VLM / VLA.
        
           | tmtvl wrote:
           | I'm gonna predict biotech. Implanted chips that let you
           | interact with LLMs directly with your brain. Chips that allow
           | you to pay for stuff by waving your hand at a sensor. Fully
           | hands-free videoconferencing on the go. As with blockchain
           | and current LLMs, not something I fancy spending any time
           | with, but people will call it the next step towards some kind
           | of tech utopia.
        
           | bgnn wrote:
           | Jensen id betting on two technologies: integrated silicon
           | photonucs, aka optical compute + communication (realistic
           | bet), and Quantum computing (moonshot bet).
        
         | thfuran wrote:
         | Why be happy about the demise of Intel? I'd rather have more
         | chip designers than fewer.
        
         | int_19h wrote:
         | With respect to GPUs and AI I think it might actually be the
         | case of engineering the boom more so than anticipating it. Not
         | the AI angle itself, but the GPU compute part of it
         | specifically - Jensen had NVIDIA invest heavily into that when
         | it was still very niche (Ian Buck was hired in 2004) and then
         | actively promoted it to people doing number crunching.
        
         | parineum wrote:
         | Nvidia won and we all did too. There's a reason they own so
         | much if the market, they are the best. There's no allegations
         | of anything anticompetitive behavior alleged and the market is
         | fairly open.
        
       | mrkramer wrote:
       | Probably the next big thing will be Chinese GPUs that are the
       | same quality as NVIDIA GPUs but at least 10-20% cheaper aaand we
       | will have to wait for that maybe 5-10 years.
        
       | nickdothutton wrote:
       | It has been decades since I did any electronics, and even then
       | only as a hobby doing self-build projects, but the power feed
       | management (obviously a key part of such a high current and
       | expensive component in a system) is shameful.
        
       | zoobab wrote:
       | Not enough VRAM to load big LLMs, in order not to compete with
       | their expensive high end. Market segmentation it's called.
        
       | fithisux wrote:
       | NVidia won?
       | 
       | Not for me. I prefer Intel offerings. Open and Linux friendly.
       | 
       | I even hope they would release the next gen Risc-V boards with
       | Intel Graphics.
        
         | camel-cdr wrote:
         | A RISC-V board with NVIDIA graphics is more likely:
         | https://mp.weixin.qq.com/s/KiV13GqXGMZfZjopY0Xxpg
         | 
         | NVIDIA Keynote from the upcoming RISC-V Summit China: "Enabling
         | RISC-V application processors in NVIDIA compute platforms"
        
       | hiAndrewQuinn wrote:
       | To anyone who remembers econ 101 it's hard to read something like
       | "scalper bots scoop up all of the new units as soon as they're
       | launched" and not conclude that Nvidia itself is simply pricing
       | the units they sell too low.
        
       | Nifty3929 wrote:
       | I just don't think NVidia cares all that much about it's gaming
       | cards, except to the extent that they don't want to cede too much
       | ground to AMD and basically preserve their image in that market
       | for now. Basically they don't want to lose their legions of
       | gaming fans that got them started, and who still carry the torch.
       | But they'll produce the minimum number of gaming cards needed to
       | accomplish that.
       | 
       | Otherwise the money is in the datacenter (AI/HPC) cards.
        
       | avipars wrote:
       | If only, NVIDIA could use their enterprise solution on consumer
       | hardware.
        
       | parketi wrote:
       | Here's my take on video cards in general. I love NVIDIA cards for
       | all out performance. You simply can't beat them. And until
       | someone does, they will not change. I have owned AMD and Intel
       | cards as well and played mainly FPS games like Doim, Quake,
       | Crysis, Medal of Honor, COD, etc. all of them perform better on
       | NVIDIA. But I have noticed a change.
       | 
       | Each year those performance margins seem to narrow. I paid $1000+
       | dollars for my RTX 4080 Super. That's ridiculous. No video card
       | should cost over $1000. So the next time I "upgrade," it won't be
       | NVIDIA. I'll probably go back to AMD or Intel.
       | 
       | I would love to see Intel continue to develop video cards that
       | are high performance and affordable. There is a huge market for
       | those unicorns. AMDs model seems to be slightly less performance
       | for slightly less money. Intel on the other hand is offering
       | performance on par with AMD and sometimes NVIDIA for far less
       | money - a winning formula.
       | 
       | NVIDIA got too greedy. They overplayed their hand. Time for Intel
       | to focus on development and fill the gaping void of price for
       | performance metrics.
        
       | TimParker1727 wrote:
       | Here's my take on video cards in general. I love NVIDIA cards for
       | all out performance. You simply can't beat them. And until
       | someone does, they will not change. I have owned AMD and Intel
       | cards as well and played mainly FPS games like Doim, Quake,
       | Crysis, Medal of Honor, COD, etc. all of them perform better on
       | NVIDIA. But I have noticed a change.
       | 
       | Each year those performance margins seem to narrow. I paid $1000+
       | dollars for my RTX 4080 Super. That's ridiculous. No video card
       | should cost over $1000. So the next time I "upgrade," it won't be
       | NVIDIA. I'll probably go back to AMD or Intel.
       | 
       | I would love to see Intel continue to develop video cards that
       | are high performance and affordable. There is a huge market for
       | those unicorns. AMDs model seems to be slightly less performance
       | for slightly less money. Intel on the other hand is offering
       | performance on par with AMD and sometimes NVIDIA for far less
       | money - a winning formula.
       | 
       | NVIDIA got too greedy. They overplayed their hand. Time for Intel
       | to focus on development and fill the gaping void of price for
       | performance metrics.
        
       | tricheco wrote:
       | > The RTX 4090 was massive, a real heccin chonker
       | 
       | Every line of the article convinces me I'm reading bad rage bait,
       | every comment in the thread confirms it's working.
       | 
       | The article provides a nice list of grievances from the
       | "optimized youtube channel tech expert" sphere ("doink" face and
       | arrow in the thumbnail or GTFO), and none of them really stick.
       | Except for the part where nVidia is clearly leaving money on the
       | table... From 5080 up no one can compete, with or without "fake
       | frames", at no price, I'd love to take the dividends on the sale
       | of the top 3 cards, but that money is going to scalpers.
       | 
       | If nvidia is winning, it's because competitors and regulators are
       | letting them.
        
       | xgkickt wrote:
       | AMD's openness has been a positive in the games industry. I only
       | wish they too made ARM based APUs.
        
       ___________________________________________________________________
       (page generated 2025-07-05 23:01 UTC)