[HN Gopher] PC CPU Shipments See Steepest Decline in 30 Years
       ___________________________________________________________________
        
       PC CPU Shipments See Steepest Decline in 30 Years
        
       Author : giuliomagnifico
       Score  : 99 points
       Date   : 2023-02-12 17:17 UTC (5 hours ago)
        
 (HTM) web link (www.tomshardware.com)
 (TXT) w3m dump (www.tomshardware.com)
        
       | maigret wrote:
       | Are Raspberry Pi available again?
        
         | gjsman-1000 wrote:
          | Strongly recommend looking at used Dell OptiPlex Micros. I
          | have two 5060 Micros: 35W 8th-gen Core i3, 2 RAM slots (8GB
          | commonly included), an NVMe slot (256GB commonly included),
          | a SATA slot, and an RTC, for about $150. Considering
          | Raspberry Pi prices and the absurdly greater performance, it
          | was a no-brainer for me.
        
           | superkuh wrote:
           | Most people are buying rpis because they can throw some boot
           | image on it made for the rpi and have things just work. A
           | regular computer, no matter how much better in performance,
           | doesn't _just work_.
           | 
           | For example: you can't just throw a Birdnet-Pi
           | (https://birdnetpi.com/) image on a normal PC and run it.
            | There is plain Birdnet, but it doesn't have any of the
            | automation or the web interface that Birdnet-Pi does.
            | Instead it's pages worth of complex multi-argument commands
            | you'd have to customize, and you'd have to port the web
            | interface yourself.
            | 
            | Since most people want rpis for little projects like this,
            | a normal PC massively increases the complexity.
        
           | dfghjjbhvg wrote:
            | interesting. might be a good IoT+HTPC+WiFi-router
            | replacement combo!
        
         | graphe wrote:
         | They always are. Why do you need them? Their high prices are
         | probably because of Broadcom though, which is also why they
         | made their own chip. It may never be cheap again.
        
           | operatingthetan wrote:
           | Not really, no: https://www.adafruit.com/product/4295
           | 
           | High prices are because they've been getting scalped on the
           | secondary market, so I wouldn't say that means they are
           | "available."
        
         | locustous wrote:
         | For a price...
        
       | unpopularopp wrote:
       | Did my part buying an i5-13600k. Going from an i7-7700 it was a
       | pretty good jump
        
         | bombcar wrote:
         | I've been so far out of it for so long I have no real way to
         | interpret those numbers. 13600 is almost double 7700 so that's
         | obviously good but is i5 noticeably inferior to i7?
        
           | wongarsu wrote:
            | 7700 is 7th generation, 13600 is 13th generation, so they
            | are about 6 years apart. The 600 and 700 tell you something
            | about how they are positioned within their generation
            | (bigger equals better). i5 vs i7 is a difficult topic. i7
            | generally has more hardware features enabled and includes
            | the higher-end models, but unless you want specific
            | features it's not that big of a deal.
            | 
            | Looking up the numbers instead of reading them like tarot
            | cards, I can tell you that the i7 7700 is a 4-core,
            | 8-thread processor with a max turbo frequency of 4.20 GHz,
            | and the i5 13600 is a 14-core part with 6 performance cores
            | that can do 5 GHz and 8 efficiency cores capped at 3.7 GHz.
            | And both support VT-d, which is about the only feature I
            | would care about :D
        
           | Tsiklon wrote:
           | Intel's naming convention on consumer parts is relatively
           | easy to grok, as it's been much more consistent for the last
           | 12 years than their server/workstation parts.
           | 
           | Breaking both down we have 3 things to take note of:
           | 
           | i5/i7 - indicates relative performance or feature set within
           | a given generation, bigger is generally better
           | 
           | 13/7 - the generation of processor
           | 
           | 600/700 - where Intel rates a given processor within a
           | generation, this is consistent and doesn't (to my knowledge)
           | involve overlap between i3/5/7/9 - generally bigger is
           | better.
           | 
           | So i5-13600 is a thirteenth gen i5, type 600. i7-7700 is a
           | seventh gen i7, type 700
           | 
           | Then you get the legion of letter suffixes determining other
           | features, mobile SKUs etc.
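            | 
            | As a rough illustration, the desktop-style names can be
            | split apart mechanically. A minimal sketch (my own toy
            | parser, not anything Intel publishes; it only handles
            | desktop-style strings, not mobile SKUs like i7-1165G7):
            | 
            |   import re
            | 
            |   def parse_intel_model(name: str) -> dict:
            |       # e.g. "i5-13600K" -> tier i5, generation 13, SKU 600, suffix K
            |       m = re.fullmatch(r"i([3579])-(\d{4,5})([A-Z]*)", name.strip())
            |       if not m:
            |           raise ValueError(f"unrecognized model string: {name}")
            |       tier, digits, suffix = m.groups()
            |       # the last three digits are the SKU; the rest is the generation
            |       return {"tier": f"i{tier}", "generation": int(digits[:-3]),
            |               "sku": digits[-3:], "suffix": suffix or None}
            | 
            |   print(parse_intel_model("i7-7700"))    # generation 7, SKU 700
            |   print(parse_intel_model("i5-13600K"))  # generation 13, SKU 600, suffix K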
        
           | Tuna-Fish wrote:
           | i7 and i5 are just "market segment" identifiers. Even in the
           | first few generations, there were "i7" cpus that were
           | inferior to "i5" ones.
           | 
           | In general, you can only infer very little from the marketing
           | names. To have any actual idea, you have to look at
           | benchmarks. In practice, 13600 is ~70% faster per thread, and
           | has 2.5 times the threads.
        
           | unpopularopp wrote:
            | Recently, and especially with this gen, the line between i5
            | and i7 comes down to the number of cores rather than
            | single-core performance. Basically 14 cores / 20 threads
            | @3.5GHz vs 16 cores / 24 threads @3.4GHz.
            | 
            | What used to be the i7 (like the i7-7700) is now occupied
            | by the i9 line.
            | 
            | The 12600k is just much better value for the price.
        
           | mhh__ wrote:
           | Which i7?
        
           | tinfever wrote:
            | I like to use Passmark as a very rough comparison for CPUs.
            | Emphasis on rough, but it's probably grounded in reality.
            | Whether the user can actually utilize that performance, or
            | has a specific workload that isn't ideally multithreaded,
            | is the critical question.
           | 
           | i5-13600k is around 38000 points (https://www.cpubenchmark.ne
           | t/cpu.php?cpu=Intel+Core+i5-13600...)
           | 
           | i7-7700k is around 9600 points (https://www.cpubenchmark.net/
           | cpu.php?cpu=Intel+Core+i7-7700K...)
           | 
           | So the new i5 is nearly 4x faster than the old i7. New CPUs
           | have come a long way over the last few years.
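            | 
            | Sanity-checking that against the per-thread estimate above
            | (plain arithmetic on the rounded figures, nothing more):
            | 
            |   new, old = 38000, 9600
            |   print(f"{new / old:.2f}x")   # ~3.96x, i.e. "nearly 4x"
            |   # consistent with ~70% faster per thread at 2.5x the threads:
            |   print(f"{1.7 * 2.5:.2f}x")   # 4.25x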
        
       | LarryMullins wrote:
       | A CPU a few years old is still good enough. If you have a limited
       | budget and want the most bang for your buck, then stick with your
       | old CPU and get a new GPU instead.
        
         | coffeebeqn wrote:
          | Depending on what you're doing - unless you need to game at
          | slightly better FPS, there's nothing a gaming PC/laptop from
          | a few years ago can't still completely crush
        
           | LarryMullins wrote:
           | I was thinking more about playing with new AI toys. TBH a 10
           | year old CPU and GPU are still fine for most games, latest
           | AAA releases perhaps notwithstanding.
        
             | coffeebeqn wrote:
             | Hmm yeah I figure it's better to just rent a monster
             | machine in the cloud for that. But I haven't dipped into
             | that much yet
        
               | Our_Benefactors wrote:
                | It's not, because then you're trading graphical
                | fidelity for input lag.
        
         | btgeekboy wrote:
         | Yeah, my Ryzen 3700x is a 3.5 year old CPU now. I might try to
         | move to an AM5 build later this year, but it's still a very
         | decent PC.
        
           | noir_lord wrote:
            | 2700X/RTX2080 - at the time I thought the 2080 was
            | overpriced, but then things went mental and it turned out
            | to be a solid buy.
            | 
            | I wanted to upgrade (held off while buying a house and
            | moving/renovating), and now that I finally can... I
            | realised the only game I play a lot isn't even capped by my
            | current hardware, so there's just no incentive to upgrade.
        
           | stateofinquiry wrote:
           | I had the same CPU from 2019 until a few weeks ago. Upgraded
           | to a 5700X (electricity is expensive around here, so the
           | lower wattage parts are appealing); with more aggressive
           | memory timing got about 20% improvement in video encoding
            | (main cpu-bound task I do regularly). After selling the
            | 3700X, the net cost of the upgrade was around $100 - not a
            | bad deal, even though I stuck with the stock cooler! This
            | is after I also doubled the RAM from the original build to
            | 32G for ~$60 last October. I expect to get another 3-4
            | years out of this AM4/DDR4 rig before a big overhaul.
           | 
           | Decades ago I did similar things, but the cadence was much
           | faster (annual sometimes); my conclusion, like that of many
           | others, is that PCs are usable for much longer these days. A
           | net positive I believe.
        
         | bluedino wrote:
          | My work PC is an i7-6700 and it's... 7 years old?
        
       | croes wrote:
        | Percentage change compared to the previous year is a bad
        | measure.
        | 
        | A positive outlier last year shows up as a negative one in the
        | current year.
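        | 
        | A worked example of that base effect (made-up numbers, purely
        | to illustrate):
        | 
        |   baseline, boom = 100, 130      # pandemic-era demand spike
        |   back_to_normal = 100
        |   print((boom - baseline) / baseline)    # +0.30: a 30% "surge"
        |   print((back_to_normal - boom) / boom)  # -0.23: a "steep decline"
        | 
        | Demand merely returned to the old baseline, but the year-over-
        | year figure reads as a 23% collapse.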
        
       | V__ wrote:
       | > Most of the downturn in shipments is blamed on excess inventory
       | shipping in prior quarters impacting current sales.
       | 
       | This is in line with comments from Drew Prairie (AMD's VP of
       | communications): [1]
       | 
       | > We are shipping below consumption because there is too much
       | inventory in the channel and that partners want to carry lower
       | levels of inventory based on the demand they are seeing
       | 
       | [1] https://www.pcworld.com/article/1499957/amd-is-
       | undershipping...
        
       | finphil wrote:
       | Shipments are still high... but the "PC pandemic boom" might be
       | over.
        
       | necessary wrote:
       | I don't know about anyone else but I've been patiently waiting
       | for the Ryzen 9 7950X3D since the 5800X3D came out. The gaming
       | performance on that chip was so good that it was competitive with
       | more expensive chips at the time, despite being slower for
       | productivity workloads. My 4790k is starting to show it's age
       | when playing games like Rimworld and Elden Ring.
        
         | shmerl wrote:
          | I'm waiting for benchmarks. So far I'm not convinced the
          | 7950X3D will be better than the 7950X, especially since there
          | is no way the scheduler will be able to tell whether some
          | thread benefits from more cache or from higher clocks, unless
          | someone develops a very sophisticated one with AI-like
          | training capabilities. I haven't seen any efforts of that
          | sort (I'm gaming on Linux).
        
           | chucky_z wrote:
           | Look at 5800X vs 5800X3D.
        
             | shmerl wrote:
             | Not comparable, because 5800X3D provides cache for all
             | cores. 7950X3D provides more cache only for half the cores.
             | That creates a weird hybrid CPU. Half the cores have higher
             | clocks and less cache, and half the cores have lower clocks
             | and more cache.
             | 
             | So I'm not yet convinced it's actually going to be better
             | than stock 7950X with higher clocks across the board.
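              | 
              | FWIW, the usual workaround when the scheduler can't tell
              | is manual pinning. A Linux sketch (assuming the V-Cache
              | CCD exposes cores 0-7, which you'd have to verify against
              | your own topology; the game path is a placeholder):
              | 
              |   import os, subprocess
              | 
              |   CACHE_CCD = set(range(8))  # assumption: V-Cache cores are 0-7
              |   game = subprocess.Popen(["/path/to/game"])  # placeholder
              |   os.sched_setaffinity(game.pid, CACHE_CCD)   # pin to the big cache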
        
         | schemescape wrote:
         | I'm surprised to see Rimworld and Elden Ring lumped together. I
         | thought Rimworld was a nicer looking Dwarf Fortress.
         | 
         | Is the simulation just extremely CPU-intensive?
        
           | ozarker wrote:
           | Yeah it's notoriously cpu bottlenecked and doesn't do much
           | multithreading
        
           | Narishma wrote:
           | It's just poorly coded and doesn't take advantage of the CPU
           | power on hand. A lot of games are like that, both indie and
           | AAA.
        
       | classichasclass wrote:
       | Interesting in those tables that VIA isn't even a rounding error
       | anymore.
        
       | ramshanker wrote:
       | I am waiting for DDR5 prices to normalise. Simple.
        
         | downrightmike wrote:
         | Given how many times the MFGs have been caught colluding to
         | keep them high, good luck!
        
         | eertami wrote:
         | With a 5800X3D, I'm waiting for DDR6 prices to normalise ;)
        
       | jostmey wrote:
       | The best chip production is going to smartphones. PC processors
       | drain my battery and are slow compared to a GPU. So why would I
       | spend money on a new PC or new PC processor?
        
         | unpopularopp wrote:
         | Good luck playing Hogwarts Legacy on your smartphone.
        
           | downrightmike wrote:
           | HL == ew
        
           | CameronNemo wrote:
           | I assume you picked that game just to stir shit up.
        
         | akmarinov wrote:
         | Them sweet FPS
        
       | witheld wrote:
       | I can't see myself needing a CPU upgrade for a long time. I used
       | Ivy Bridge for a decade, and my Ryzen is far far more powerful.
       | 
       | Right now the only compute upgrade I need is a GPU and I think
       | that applies to a lot of people.
        
         | zamadatix wrote:
          | That era marked the lowest level of competition in CPU
          | performance, and it really showed in how relatively lame an
          | upgrade from the market leader of the time (Intel) was. With
         | AMD's products being competitive and ARM CPUs no longer being
         | relegated to smartphone class performance there is real
         | competition again. Given that and the historical tendency that
         | software grows to use the available hardware I wouldn't bank on
         | every CPU upgrade lasting as long as they did in that period.
         | 
         | But damn if it hasn't been hard to get a good deal on a GPU
         | these last couple of years...
        
       | dv_dt wrote:
        | Post the WFH investments, major tech companies' layoffs
        | pushing a glut of used hardware onto the market would seem to
        | put a dent in CPU shipments.
        
       | mensetmanusman wrote:
       | The ending of Moore's law should result in a future where compute
       | devices last a century like furniture does.
        
         | kupfer wrote:
         | If it comes to that, I expect manufacturers to design
         | processors in a way that electromigration limits their
         | lifetime.
        
           | mensetmanusman wrote:
           | The new lightbulb :)
           | 
           | I wonder if it comes to that...
        
       | izacus wrote:
        | We're in for a wave of doomsaying "XXXX shipments see YYY
        | decline" articles after the 2020 boom, aren't we?
        | 
        | Gotta have those clicks coming. This is all just reversion back
        | to the pre-2020 normal.
        
       | krisroadruck wrote:
       | Just built a new rig last week. This isn't at all surprising.
       | Prices are so out of whack right now. $1000 used to be the
       | ceiling for the highest end video cards (like a 1080 Ti), now
       | $1000 is the floor. There are no good current gen (or even
       | previous gen) cards available for under $1000. The 4080 is a
       | horrible value and still regularly listed at $1200-$1400. The
       | 4090 is overkill and sits around $1700-$2200. Even 2 year old
       | tech 3080's are regularly selling for near a grand. AM5
       | motherboards are insanely priced. Want a 10Gb onboard NIC? Be
       | prepared to shell out $1000 just for the motherboard. Add to all
       | of that, this latest batch of CPUs are just stupid power hungry -
       | like 240w+ under load (except for the non-x variants of AMD 7000
       | series, just released last month).
       | 
        | It used to be that you could buy a lot of computer for $2-3K;
        | now that figure is closer to $5K. Combine these prices with the
        | folks who just went through this pain 2 years ago during the
        | pandemic, and you aren't going to see stuff flying off the
        | shelves any time soon.
        
         | jb1991 wrote:
         | The Mac Studio is quite a powerful machine at a relatively
         | reasonable price.
        
           | ChuckNorris89 wrote:
           | How many games run on it?
        
             | davely wrote:
              | Quite a lot, at least based on my experience using Steam
              | and playing various games I've bought for my actual
              | gaming PC that also have builds available for multiple
              | platforms.
              | 
              | As long as they're 64-bit builds, macOS's Rosetta
              | translation layer does a great job of running them
              | without a hitch.
             | 
             | Apple Silicon is a beast. I wish more developers would take
             | advantage of it.
        
               | amelius wrote:
               | > Apple Silicon is a beast. I wish more developers would
               | take advantage of it.
               | 
               | It's consumer electronics. Not very useful for people who
               | want to build things containing a CPU/GPU.
        
               | krisroadruck wrote:
               | This is only true if your primary game preference is
               | casual games. The mac studio lacks a discrete graphics
               | card. Ignoring the OS, the whole ARM vs x86, and Rosetta,
               | wine, whatever stumbling blocks - just lack of a discrete
               | GPU is enough to make it a no-go as a serious gaming rig.
               | This isn't just me talking out of my neck either. I have
               | 3 rigs I keep around my desk, and one of them is a mac
               | studio. Great for dev and video editing, but for gaming
               | not so much.
               | 
               | Edit: "Why do you need 3 computers?" - I regularly switch
               | between Windows, Ubuntu & MacOS, and I don't like dinking
               | with switching my monitor inputs, dual booting or
               | remoting in. Rather just swivel my chair. Yes I fully
               | realize how ridiculous this is. Some people like nice
               | cars. I like wasting money on computers apparently.
                | ¯\_(ツ)_/¯
        
               | bitL wrote:
               | No worries, I have 12 computers around my desk...
        
         | Whinner wrote:
         | $1000 as the floor? Maybe if you're talking 4K gaming but
         | that's the very high end.
         | 
          | The Intel Arc A750 is under $300 and is a decent card for
          | 1080p that does well at 1440p. DX9 support has greatly
          | improved since release.
          | 
          | Going up a little in price, AMD's 6650 and 6750 are $300-400.
          | 
          | 6800 XTs are under $600.
        
           | krisroadruck wrote:
           | I haven't bought a non-4K monitor in over 6 years. I honestly
           | don't know anyone who is still using 1080p monitors as daily
           | drivers if they are also using the machine for productivity
           | or media work. But your point is not invalid.
        
             | IntelMiner wrote:
              | Hi, I use a pair of 1080p monitors on my primary
              | machine, for work, leisure, and my hobby of video editing
              | for youtube.
        
             | pprotas wrote:
             | https://store.steampowered.com/hwsurvey/Steam-Hardware-
             | Softw...
             | 
             | > 60% of Steam users have a 1080p monitor as their primary
             | display
        
               | interstice wrote:
                | How recently have they upgraded?
        
               | Nullabillity wrote:
               | What motivation is there to upgrade? Even if you're
               | replacing the PC itself completely there's not much of a
               | reason to not just reuse the old screen.
        
               | alyandon wrote:
               | Apparently, I'm a real oddball with my dual 2408WFP
               | setup:                 1920 x 1200  0.70%   +0.01%
        
             | bryanlarsen wrote:
             | Modern games let you set separate rendering and display
             | resolutions so you can get most of the benefit of a 4k
             | display without requiring a video card that can render
              | every pixel. The new upscaling techs are really good.
        
             | TaylorAlexander wrote:
             | Still using my old Dell 1440p 27" monitor to edit my 4k
             | youtube videos. I briefly considered buying a 4k monitor
             | this year but I spent my money on a NAS instead. I use
             | three monitors on my desktop, the other two being 1080p. I
             | use the 27" for my main monitor and the others are for docs
             | and videos. I haven't bought a monitor in like 10 years
              | because these things just keep on going. I do have a 4k
              | monitor at work and it's nice, but it doesn't feel
              | significantly different from my old 1440p monitor. If I
              | had more money I probably wouldn't think much about an
              | upgrade, but I work for a tech nonprofit and live in the
              | Bay Area, so I am not out buying new stuff that often.
              | The NAS was a
             | long needed upgrade to serve as a backup for my important
             | media!
        
             | dvngnt_ wrote:
             | 1440p gang
        
             | anthomtb wrote:
             | I am on a pair of 1080p's. I stare at text all day. What is
             | the benefit of upgrading to 4K?
        
               | acdha wrote:
               | The text isn't fuzzy? I have a work-provided 1080p
               | display and it's really noticeable switching between that
               | and a Retina display unless I'm across the room, even
               | without my glasses.
        
               | anthomtb wrote:
               | Nope, not at all.
               | 
                | Maybe it is one of those things where, once you switch
                | to 4K, you can't go back.
        
               | E39M5S62 wrote:
                | I was on ~100dpi monitors for years. I just picked up
                | a 26" 4k Dell for a steal - it's noticeably more crisp
                | than my 1440p screens. I'm not getting rid of the 1200p
                | and 1440p screens on my workstation, but... 4k is nice.
        
               | acdha wrote:
               | Quite possibly. I do this multiple times a week and it's
               | quite noticeable but that's definitely after training my
               | baseline expectations.
               | 
               | I have found it seems to be better for eye strain but
               | that's a single anecdote, not science.
        
               | giantrobot wrote:
               | Your text can be _much_ sharper and easier on the eyes.
               | Newer displays also can have much better dynamic range
               | which also improves text readability.
        
               | FpUser wrote:
               | I run single 30" 4K monitor at 100% scaling. Main reason
               | for 4K is - I usually have 2-3 editing windows arranged
               | side by side when coding. It fits a lot of text
               | vertically and I like that.
               | 
               | As an extra benefit: I am a sucker for good photos and
               | viewing those on large 4K is way better in my opinion. 4K
               | Youtube and Netflix also looks better.
        
             | tbrownaw wrote:
             | On in-office days I'm stuck with a pair of 1080p (at home,
             | yes it's a pair of 4k). It's kind of annoying.
        
               | ClumsyPilot wrote:
                | I understand when people don't want to spend their own
                | money on displays, but as a business you are literally
                | losing money by using substandard equipment; research
                | shows it's around 10% of salary, much more than a
                | monitor costs.
        
               | lazide wrote:
               | Different budget line items. Silly huh?
        
             | ChuckNorris89 wrote:
             | At my job we all got 1080p monitors for dev work.
        
             | krisroadruck wrote:
             | @ChuckNorris89 man... why do they hate their devs? That's
             | just mean =/ Hopefully they don't have you all on a bunch
             | of $400 dell optiplexes too. Pouring one out for you
             | brother.
        
             | cammikebrown wrote:
             | 144Hz is way more important to me than resolution. I have a
             | 4K TV I can hook my computer up to if I really want.
        
             | scarface74 wrote:
             | This is so out of touch with reality it might as well be
             | "do people still watch TV? I haven't owned a TV in 10
             | years"
        
             | goosedragons wrote:
             | 1440p monitors aren't that expensive either and for a lot
             | of people 4K at 100% scaling is too small to be comfortable
             | at a typical 27" size. For PC gaming a high refresh rate
             | 1080p or 1440p display is a better buy than a 60Hz 4K one
             | at roughly the same price.
        
           | elabajaba wrote:
           | The cheapest 3050 in Canada is $389. It is ~10% slower than a
           | 1070ti.
           | 
           | 4 years ago you could get a 1070ti in Canada for $400 new.
           | 
           | The exchange rate is about the same as well.
        
             | qball wrote:
             | >The cheapest 3050 in Canada is $389. It is ~10% slower
             | than a 1070ti.
             | 
             | Fortunately, used 1070s and RX580s are 150CAD (100USD).
             | 
             | The combination of board prices being completely out of
             | whack, and the performance delta of the 3080 over the
             | 3070/2080Ti (made worse by the fact that the 4090 is that
             | leap _again_ over the 3080) being as large as it is, means
             | the value proposition in the middle has disappeared.
             | 
              | It's not that the 3080 isn't worth 700USD, or that the
              | current 4000 series cards don't have a similar
              | price/performance ratio - they very much are priced
              | appropriately; it's that the cheaper new cards
              | (especially the 3070) have a far worse ratio than the
              | expensive ones do. This is also partially why the most
              | popular gaming GPU on Steam is the GTX 1650.
              | 
              | And with current-generation console games targeting the
              | equivalent of that 1070/2060, buying anything less than a
              | 3080 is an objectively bad decision, unless you're
              | someone who plays a lot of competitive shooters and can
              | benefit from an intermediate card for high-framerate
              | reasons. (The fact that most of those games aren't
              | particularly fun to play also hurts the hardware
              | industry.)
        
               | sokoloff wrote:
               | I've got a couple of machines here (mine and the kids')
               | with RX480/580s in them.
               | 
               | At this point, I think a buyer might as well get a new
               | 6650XT (~US$260 plus tax) rather than trying to chase a
               | 5-year old, used, half-as-powerful 580.
        
         | vitaflo wrote:
         | Let's also remember when the 1080 first launched it was a
         | massive upgrade and the retail price was $600.
        
           | ricardobayes wrote:
           | And was considered really expensive, at the time
        
           | checkyoursudo wrote:
           | I bought a 1080 quite a while after they launched. I still
           | have it. I have thought about building a new PC lately, and
           | it seems to me that newer cards are ... not as much as an
           | upgrade as I might have been led to believe? Especially given
           | the cost. If I did build a new PC today, I am not entirely
           | sure I would buy a new video card. The performance/value
           | ratio doesn't seem as appealing as jumping to the 1080 did at
           | the time back then.
        
         | arecurrence wrote:
          | Reading this really drives home what a killer deal
          | GeForce Now is. I truly don't understand why more publishers
          | don't allow their games on the platform.
        
           | wonnage wrote:
           | Having what feels like 100ms of input delay at a minimum is
           | pretty awful. Idk if there's some magic way to speculatively
           | render extra frames so that input is resolved locally
        
           | eric__cartman wrote:
           | It's not a killer deal for the consumer. That's just Nvidia
           | fucking you over either way and still profiting off it. The
           | market is not only bad in the high end, but there aren't any
           | ~$200 value oriented graphics cards that provide significant
            | upgrades over past generations. Just now the GTX 1060 has
            | stepped down from being the most popular card in Steam's
            | hardware survey, only to be replaced by the GTX 1650, a
            | much newer card that costs about the same and performs
            | about the same. And with less VRAM!
        
           | dfghjjbhvg wrote:
            | because they all learned from stadia and know how crappy
            | and useless remote streaming of input/graphics is for most
            | games.
            | 
            | this tech is bad. awful. only worthwhile for casual games,
            | which are already well served by common hardware locally.
            | 
            | it's all a ruse to sell idle capacity from their gpu farms
            | because the AI model training hype never materialized
        
             | 0-_-0 wrote:
             | I tried geforce now, it's amazing, lag is almost
             | imperceptible with the 8ms ping I had.
        
           | geraldhh wrote:
           | i would suspect that they either fear a loss of profit (b/c
           | platform cut) or reputation (because latency/jitter from bad
           | connection will be wrongly blamed on the game rather than the
           | platform)
        
           | pdntspa wrote:
           | I am so glad this notion of renting your games is dying.
           | People need to own their bits.
        
         | Spooky23 wrote:
         | Enterprise drives shipments and the PC industry is fucked
         | because: the surge of Windows 7 migration is over, they've
         | become appliances, and everyone bought thousands of laptops in
         | 2020/1.
         | 
         | The only reasons to replace business desktops are swollen
         | batteries and Microsoft. My teams are responsible for 250k
         | devices globally. Failure rates are <3%, and 75% of failures
         | are batteries and power supplies. With the transition away from
          | spinning rust complete, we have more _keyboard_ failures
          | than desktop PC failures. I'm taking the PC refresh savings
          | and buying Macs, iPads and meeting room hardware.
        
           | Baeocystin wrote:
           | Speaking from the smaller scale side of IT, in the past ~6
           | months or so, I've deployed more BeeLinks (
           | https://smile.amazon.com/Beelink-Graphics-Desktop-
           | Computers-... ) or even smaller, Atom-powered boxes (
           | https://www.amazon.com/GMKtec-Nucbox5-Desktop-Computer-
           | Windo... ) than Lenovos or Dells. And my clients are really
           | happy with them, too. These are, of course, still technically
           | PC shipments, but the amount of money involved for the
           | manufacturers is absolutely minimal. And most office workers
           | don't need more.
        
           | slowmovintarget wrote:
           | Yep, tech work, and especially remote software work, is all
           | done with laptops and docks.
           | 
           | I'm writing this from my home gaming rig, which is an old,
           | not-cool-enough-for-Win-11 (thank god), desktop. I don't know
           | what I'll be replacing it with when it keels over and dies.
           | Maybe a Mac tower? Maybe a Linux rig. But it'll be my PC, not
           | Microsoft's if I can help it.
        
             | layoric wrote:
              | I think this is largely correct. I've been working
              | remotely for the better part of 10 years, and used
              | laptops for around 6. The portability was great when I
              | was split between multiple jobs.
             | 
             | Changed to a desktop about 3 years ago and wouldn't go back
             | unless I really needed that portability again. I had
             | forgotten how much faster a well spec'd desktop machine
             | actually is. And upgrading parts I find a lot better than
             | replacing the whole laptop every 2-3 years.
             | 
             | Currently on a Ryzen 5950x, 64gb ram, multiple gen 4 ssds,
             | and a workstation GPU. The only laptops to beat such a
             | setup in most tasks weigh a lot, cost more than double the
             | desktop, and are 2 years newer.
        
             | downrightmike wrote:
             | mac mini is probably enough these days
        
         | Tuna-Fish wrote:
         | There are complex reasons related to patent licensing why it
         | makes no sense to put 10Gbe on a motherboard right now. If you
         | want 10Gbe, don't get a $1000 motherboard for it, get a
         | motherboard with a free pcie 4.0 slot and get an adapter.
         | That'll cost an extra ~$100 today, and an extra ~$20 in August.
        
           | robin_reala wrote:
           | Why $20 in August?
        
             | Tuna-Fish wrote:
             | Patents expire.
             | 
             | The cost of actually making a 10gbase-t network card is
             | really not much anymore. All of the cost is licensing.
        
           | Flockster wrote:
            | Are there substantial patents running out this year, or
            | why else the huge expected price drop?
        
             | Tuna-Fish wrote:
             | Yes.
        
             | chx wrote:
             | US7164692B2 and US6944163B2 at least. There might be even
             | more. The former expires 2023-07-18 the latter on
             | 2023-08-08.
        
               | FpUser wrote:
               | Lovely. Might actually switch to 10Gb network after
               | prices fall.
        
           | krisroadruck wrote:
           | Fair enough - I'm clearly paying a premium to keep a cleaner
           | interior on the build. I tend to run exactly one card - the
           | video card, and aim to get everything else onboard, but as
           | you pointed out that's definitely not the only way to go, and
           | certainly not the most cost efficient way.
        
             | willis936 wrote:
             | Vanity can cost as much as a market is willing to pay.
             | 
             | From a function standpoint: I'm happy that multiple PCIe
             | slots is still the standard. If I didn't have functional
             | wants such as "more accelerators and more I/O" then I'd go
             | with a cute ITX build.
             | 
             | I did get a compact case recently because 5.25", 3.5", and
             | 2.5" bays are no longer an interesting use case for my
             | daily driver, but now I find that even if I did want to
             | shell out for a new high-end GPU the only model that would
             | fit in my case is a perpetually sold out AIO model.
        
               | myself248 wrote:
               | Aye. I keep two desktops around for this reason.
               | 
               | One is my new Ryzen APU in a small (sub-7-liter) case,
               | which sits half empty because I haven't even dropped a
               | GPU into it. All my games and CAD run just fine on the
               | APU. All the included peripherals are more than I've ever
               | needed.
               | 
               | The other is my old Athlon64x2 4850e in a bigass M3N78
               | Pro board, with drives and I/O out the wazoo. It's old
               | enough to have floppy and PATA ports, but new enough to
               | have SATA and USB too (and even a PCIe slot), so it's my
               | media mule. I power it up whenever I need to do CD
               | ripping, floppy archiving, that sort of stuff. I actually
               | picked up the fastest chip that would fit the board (a
               | Phenom II X4 945), just for giggles because they're $30
               | on eBay now, but promptly swapped back to the 4850e
               | because 125 watts in a CPU is unconscionable when 45 does
               | the job just fine.
               | 
               | The latter, of course, is chock-full of cables like
               | they're goin' out of style. I made a few long floppy and
               | PATA ribbons for working with external disks so they sit
               | crammed up in the bottom when not in use, etc. And the
               | non-modular PSU has like a dozen cables all splattered
               | everywhere. It's the opposite of vanity, and I love it.
        
           | 14u2c wrote:
           | What's happening in August?
        
           | bauruine wrote:
        | You can get dual-port 10 Gbit cards from ebay for less than
        | $20 right now.
        
             | Tuna-Fish wrote:
                | The only ones I see are SFP+ cards, and those also
                | require transceivers (at ~$50 each) to work. The
                | cheapest 10gbase-t cards I can see are refurbished PCIe
                | x16 cards with active cooling for ~$40; you probably
                | don't want those either. Really, as a consumer you want
                | either a PCIe x4 or ideally a PCIe 4.0 x1 card, with a
                | modern, much more power-efficient chip.
        
               | bauruine wrote:
                | SFP+ uses less power, has less latency, and DAC /
                | Twinax cables cost less than $30 for 6 meters. The only
                | downside is that DAC is limited to 6 meters, I think.
        
           | chasd00 wrote:
           | Posted the basic question in another post but I'll ask here
           | too. What are you using to saturate a 10gbe nic? Inet??? I
           | find a true 10gbe inet connection unusual but I've been out
           | of that game for a while.
        
         | docfort wrote:
         | There's been a lot of ink spilled about the decline of Moore's
         | Law and how it hasn't yet exactly fallen for all aspects of
         | computer engineering. I think it's fallen for customers,
         | though. The economics of exponential speed improvement in
         | traditional CPU design have gone away, and the
         | capability/complexity ratio of software collapsed with the fall
         | of Dennard scaling. No fundamentally new applications have come
         | out (save ML, which is not particularly suited to CPUs or even
         | GPUs), so consumers are happy to keep chugging along with their
         | current setup or move more load to their phones.
         | 
         | Even if the increase in hardware cost stays at parity with
         | inflation, it's tremendously more expensive than it used to be,
         | when waiting six months could get you more machines for your
         | budget.
         | 
         | Gaming, a previous driver of high-end consumer growth, has
         | split into truly mobile, consoles, and very high end PCs. But
         | complex games take more capital and time to develop, so
         | recouping costs is important (except if the studio is part of a
         | conglomerate like Microsoft that can weather temporary market
         | forces). I'd imagine that places pressure on game developers to
         | aim for maximum compatibility and a scalable core. So too bad
         | for the Mac, great for phones, and great for consoles
          | (especially with monetizing the back catalog). And new PCs
          | will have to fight against good-enough, with lower demand to
          | fund new hardware and software.
        
         | dotnet00 wrote:
         | Agreed, I don't mind spending extra money on GPUs (I have 2
          | 3090s sitting next to me) because the improvements are still
         | worthwhile for my use case, but the CPU prices have been
         | unjustifiable, especially on the AMD side. Increasing CPU
         | price, absurd motherboard price AND needing to buy new RAM, all
         | for an improvement that isn't really too meaningful unless in
         | very specific tasks, is not really worth it. I instead just got
         | a 5900x for my computer and moved its 3900x into a server,
         | retiring its 1600x (which was also sufficient for its work,
         | although at least the 1600x was noticeably slower for
         | transcoding, the 3900x is proving more than sufficient).
        
         | chx wrote:
          | I am sorry, but the "rule of thumb" website,
          | https://www.logicalincrements.com/ disagrees with you,
          | heavily so. You can still buy an awful lot of computer for
          | $2k, and it shows you exactly how. The prices are real; they
          | link -- yes, with affiliate links -- to real sales on
          | Amazon/Newegg/etc. To quote what you can expect from their
          | "outstanding" tier at 1628 USD:
         | 
         | > This expensive tier has the highest possible cards that still
         | maintain a reasonable performance/price. Sure it is pricey, but
         | it is luxurious!
         | 
         | > Expect high performance at 1440p, and solid performance at 4K
         | even in the most demanding games.
         | 
         | For 1865 USD:
         | 
         | > Max most titles at 1440p@144Hz, and solid framerates in 4K,
         | even with max settings.
         | 
         | And if you want to note the 6900XT card for $720 they used here
         | is out of stock then let me note a $700 6950XT:
         | https://www.newegg.com/asrock-radeon-rx-6950-xt-rx6950xt-ocf...
         | which makes it an even better bang for your buck.
        
           | waboremo wrote:
           | This is misleading. For nearly $2000, you better be getting
           | 4K with ray tracing (max settings), AKA, a top of the line
           | device. 1440p/144 is midrange now.
           | 
            | Which that $1865 build does not provide. Even before ray
            | tracing, most games struggle to hit the 144 fps you're
            | aiming for, so with ray tracing that drops down to ~40
            | (Cyberpunk 2077, for example, on that 6900XT card). You
            | have to enable workarounds like DLSS/FSR to make those
            | games playable.
           | 
           | The only way you're getting good framerate at 4k is without
           | ray tracing, but you're paying $2000 to have to worry about
           | still disabling settings? Ridiculous!
           | 
           | So yes, they are overpriced. For $2000 you should not be
           | worried about having to enable FSR.
           | 
           | The usual excuse when this is brought up is "well just don't
           | play those games, they seem unoptimized" to which again, the
           | question is, why are you spending two thousand dollars to
           | avoid playing certain games? How absurd.
        
             | operatingthetan wrote:
             | >This is misleading. For nearly $2000, you better be
             | getting 4K with ray tracing (max settings), AKA, a top of
             | the line device. 1440p/144 is midrange now.
             | 
             | You're both just setting an arbitrary baseline of
             | performance at $2k. Arbitrary comparisons cannot be
             | misleading.
        
             | mistrial9 wrote:
             | 4k displays are like giant televisions right? do coders
             | really buy 4k displays ..
        
             | mosquitobiten wrote:
             | Well the website is called logical increments, it's really
             | hard to call ray tracing a logical increment. You are
             | sacrificing too much to gain so little, at least for now.
        
               | breckenedge wrote:
               | This is how I see it now too. The manufacturers are
               | capitalizing on people needing to have the best by making
               | even more expensive top tier components. You don't need
               | 4K or ray tracing to enjoy a triple-A game. Last year, I
               | got a 3070-equipped system, force feedback driving wheel,
               | and a 4k TV as a monitor for under $2k. The games are
               | still quite gorgeous, and I have no idea what I'm missing
               | by not having something more expensive.
        
           | bialpio wrote:
           | > For 1865 USD:
           | 
            | You can also save on the case (cheaper options should be
           | available), and grab a Ryzen 7900 which should have similar
           | perf & comparable price point to Intels they used, and comes
           | with stock cooler, shaving off additional ~$100. I'd also
           | probably skip the HDD and grab 32GB RAM.
        
             | lastLinkedList wrote:
              | That's what I'm thinking of doing when I build PCs for
              | my partner and me later this year. I've been looking at
              | benchmarks, and I'm not as worried about the top end of
              | performance as I am about Intel being ready to release a
              | new socket design next refresh.
        
         | bluedino wrote:
         | I don't keep up with PC component prices but I thought crypto
         | crashing flooded the market with cheap cards? (Then again I
         | guess BTC is back into the $20k's)
        
           | thomastjeffery wrote:
            | You can only flood a desert for a moment.
           | 
           | BTC hasn't been viable on GPUs for a while, either. It's the
           | Ethereum Proof-of-Stake change that was the most exciting,
           | but it doesn't seem to have had a significant effect,
           | especially with newer (3-4yr old) cards.
        
             | svnt wrote:
             | They are good heaters. Wait til summer.
             | 
             | Or maybe they just sell them to generative AI farms.
        
               | thomastjeffery wrote:
               | It's a generally global market, and it's always summer
               | somewhere.
        
         | jmyeet wrote:
         | > There are no good current gen (or even previous gen) cards
         | available for under $1000
         | 
         | The 4070 Ti is <$1000.
         | 
         | > Want a 10Gb onboard NIC? Be prepared to shell out $1000 just
         | for the motherboard.
         | 
         | Why not just a PCI-e 10GbE NIC on a regular motherboard?
         | 
         | > now that figure is closer to $5K.
         | 
         | I see plenty of prebuilt PCs (eg CyberpowerPC) with 3070 Tis
         | for $2k. 4070 Tis for $2.5k.
        
         | adrian_b wrote:
          | The current prices are indeed high, but if I were to upgrade
          | my desktop right now with the best components that I can
          | find, it would cost only $1500.
         | 
         | The $1500 would pay for an AMD 7950X, an ASUS MB with ECC
         | support and a decent configuration of the expansion slots
         | (Prime X670E-Pro), a Noctua cooler suitable for 7950X (e.g.
         | NH-U12A) and 64 GB of ECC DDR5-4800 memory (which for 64 GB
         | costs $100 more than the non-ECC variant).
         | 
         | For the rest, I would continue to use the case, the SSDs, the
         | 10 Gb/s NIC and the GPU that I am currently using.
         | 
          | If I also wanted to upgrade the GPU, I would buy an RTX 4070
          | Ti, for three reasons. It provides enough extra performance
          | to make an upgrade worthwhile; it has the best performance
          | per $ among
         | all recent GPUs (at MSRP, a 7900 XTX would have better
         | performance per $, but it can be found only at much higher
         | prices), and lastly, 4070 Ti is the only recent GPU for which
         | the vendor specifies the idle power consumption and the power
         | consumption during light use (e.g. video decoding) and the
         | specified values are small enough to be acceptable in a desktop
         | that is not intended for gaming.
        
         | nickstinemates wrote:
         | > Want a 10Gb onboard NIC?
         | 
         | Just buy an Intel PCI-E 10g card. They're like $100. Slots on
         | motherboards are meant to be used.
        
         | teeray wrote:
         | I would have expected the crypto implosion to have a depressing
         | effect on graphics card prices (certainly in the secondary
         | market). Any theories why they remain elevated? Is it just
         | supply chain stuff that everything is experiencing?
        
           | neogodless wrote:
           | In comparison, they are much less expensive. (Though people
           | who refuse to switch brands might have a harder time.)
           | 
           | For example, The Radeon 6700 XT 12GB was commonly $900+
           | during the crypto boom, but is regularly around $350 now.
           | That's a pretty big drop.
           | 
           | "Current generation" - only very expensive high end models
           | have been announced (and some aren't selling as low as MSRP
           | yet.)
           | 
           | RTX 4070 Ti $800 | RX 7900 XT $900 | RX 7900 XTX $1000 | RTX
           | 4080 $1200 | RTX 4090 $1600
           | 
           | You have to stick to last generation for excellent
           | performance with less insane pricing.
        
           | [deleted]
        
         | [deleted]
        
         | sylens wrote:
         | Not only are the prices out of whack, but the newest games
         | coming out all seem to have some sort of technical issue on the
         | PC. Shader stutter is nearly a universal thing in most new
         | releases, or the developer doesn't optimize for the platform at
         | all (Callisto Protocol and to a lesser extent, Hogwarts
         | Legacy). So not only are you paying more money than ever,
         | you're experiencing certain issues that just aren't there on
         | consoles.
        
           | slowmovintarget wrote:
           | I recall Jonathan Blow talking about how it's basically
           | impossible to eliminate stutter on Windows now due to a
           | number of design decisions in the OS driver system itself.
           | 
           | I'm wondering if this is the moment for Linux gaming. Valve
           | has certainly taken it a long ways from where it was.
        
         | chasd00 wrote:
          | Unless your inet connection can sustain 10gb, what's the
          | point of a 10gb nic? I have gbit fiber that rarely gets above
          | 6-700mbit. Is a full 10gb inet connection that common?
         | 
         | Even on LAN do you have I/O that can deliver 10gb/sec to the
         | wire?
        
         | gruez wrote:
         | >Add to all of that, this latest batch of CPUs are just stupid
         | power hungry - like 240w+ under load (except for the non-x
         | variants of AMD 7000 series, just released last month).
         | 
         | That's because in the race to get the highest benchmark scores,
         | both companies have set the stock clocks to a level that's way
         | beyond what's optimal (eg. adding 100W of power consumption to
         | get 5% higher benchmark scores). The CPUs themselves are fine,
         | you just have to adjust the power/clock limit lower.
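          | 
          | On Linux with an Intel chip, that can be done from software
          | via the RAPL powercap interface. A sketch (assumes the usual
          | sysfs path and root access; the exact path varies between
          | systems):
          | 
          |   # cap the package long-term power limit to 125 W
          |   LIMIT = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"
          |   with open(LIMIT, "w") as f:
          |       f.write(str(125 * 1_000_000))  # the file takes microwatts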
        
           | BoorishBears wrote:
           | Honestly you don't have to adjust the limit: the "power under
           | load" angle just gets completely overblown because people go
           | based on a reviewer's definition of load
           | 
            | They might be 240W under _extreme_ load, but I can play
            | AAA titles on my i9 at 240hz while barely cracking 50% CPU
            | load. And that's with a 3090, so not exactly a mismatched
            | CPU/GPU situation.
           | 
           | At those types of loads the CPU doesn't even try to hit boost
           | clocks most of the time, so you're nowhere near the figures
           | you often see touted based on benchmarks.
        
             | karamanolev wrote:
             | By "CPU load", loading the CPU is implied. The fact that
             | your AAA titles don't load the CPU, because they're GPU-
             | bottlenecked is irrelevant. Load the CPU properly, e.g.
             | compiling, and you'll see large power consumptions.
        
             | 015a wrote:
             | Yeah; games to this day are still pretty bad at utilizing
             | multiple cores. Any idiot can do the math; the i7 13700k
             | has 16 cores, and an advertised max TDP of 250W. That's a
             | ton of power; but if you're only really stressing 2 or 3
             | cores the real power draw isn't that high. These chips are
             | so powerful that your bottleneck is almost always GPU,
             | unless you're playing CS:GO at 1080p and aiming for 800fps,
            | so realistically it's common to see 25-60% utilization on 1
             | or 2 P-cores, and the rest just running Windows background
             | shit.
             | 
             | This is proven by any outlet which takes the time to
             | measure Performance Per Watt (e.g.
             | https://www.hwcooling.net/en/intel-
             | core-i7-13700k-efficient-...). Intel has consistently
             | driven higher PPW with every generation, when you're
             | comparing like-for-like binned chips. AMD, on the other
             | hand, has been a bigger victim of what the OP is
             | describing; while their raw PPW is generally higher than
             | Intel, so they have room to fall, their Ryzen 7x chips
             | aren't consistently posting higher PPW numbers over Ryzen
             | 5x.
             | 
             | In other words; both Intel and AMD are doing well here. If
             | you're stressing every core on the CPU at 100%, you're
             | going to draw a lot of power, but you're also going to be
             | completing workloads much faster than on 12th or 11th gen,
             | so your aggregate power draw will be lower. The low-tier
             | media outlets that post "OMG 250W" aren't doing research,
             | and also don't care to, because they get clicks from tons
             | of people like the OP who eat outrage at face value.
        
               | BoorishBears wrote:
               | It's not just low-tier media outlets (at least in terms
               | of reach), this is largely driven by large names like LTT
               | and Gamer's Nexus
               | 
                | People don't realize that mid-range mobile chips from
                | both Intel and AMD outperform top-of-the-range desktop
                | SKUs from a few years ago, because they're constantly
                | bombarded by hot takes based on things like _CPU_
                | rendering and unzipping files...
               | 
               | At this point I'm convinced it's just an informal cycle,
               | where if they actually reported in a realistic manner, no
               | release would be exciting. If they didn't pair bottom of
               | the line CPUs with top end GPUs and odd settings
               | configurations under the guise of "not wanting to reflect
               | a GPU bottleneck", it'd be a lot clearer how badly needs
               | have stagnated vs the speed of these new SKUs
        
               | gruez wrote:
               | >At this point I'm convinced it's just an informal
               | cycle, where if they actually reported in a realistic
               | manner, no release would be exciting. If they didn't
               | pair bottom of the line CPUs with top-end GPUs and odd
               | settings configurations under the guise of "not wanting
               | to reflect a GPU bottleneck", it'd be a lot clearer how
               | much actual needs have stagnated vs the speed of these
               | new SKUs.
               | 
               | Hardware unboxed did a video explaining why reviews do
               | that: https://youtu.be/Zy3w-VZyoiM
               | 
               | tl;dw: you don't care about the performance today; you
               | care about the performance a few years from now, once
               | you've bought a faster GPU.
        
           | chx wrote:
           | ^^^^ this
           | 
           | Someone on reddit did a power analysis of the 13900K and I
           | reposted it here:
           | https://news.ycombinator.com/item?id=34404683
           | https://www.reddit.com/r/hardware/comments/10bna5r/13900k_po...
           | It shows the CPU capped at 100W delivering 75% of the
           | performance at 40% of the power consumption...
        
             | oblak wrote:
             | It's still pretty bad compared to Ryzen 7000. The non-X
             | models are just the regular guys at 65W, like my good ol'
             | 3700X.
             | 
             | That said, Intel sure has some magic dust given how
             | competitive they are despite their node disadvantage.
             | Serves them right, though. What a horrible company.
        
               | reisse wrote:
               | Their node disadvantage is mostly good marketing from
               | TSMC though. Intel 10 (10nm, but that's not the real
               | nanometers) is ~100M transistors per mm2, while TSMC 5
               | (5nm, but again, that's not the real nanometers) is ~130M
               | transistors per mm2.
               | 
               | Sure, 30% better, but not a 2x improvement as marketing
               | suggests.
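               | 
               | The arithmetic, using those published density figures:
               | 
               |   intel_10 = 100e6  # transistors/mm^2, Intel 10
               |   tsmc_n5 = 130e6   # transistors/mm^2, TSMC 5
               |   print(f"TSMC advantage: {tsmc_n5 / intel_10 - 1:.0%}")
               |   # -> 30%, vs the 2x the "10nm -> 5nm" naming implies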
        
         | scarface74 wrote:
         | I doubt the decline of PC shipments is because of the
         | statistically small number of people building gaming PCs.
        
         | maerF0x0 wrote:
         | If you care about price, the 3060, RX 6700 XT, or Intel Arc
         | A770 16GB are your best bets.
         | 
         | xx80s and xx90s are in the class of "If you need to ask the
         | price, you can't afford it."
        
         | neogodless wrote:
         | > There are no good current gen (or even previous gen) cards
         | available for under $1000
         | 
         | Wait, what? Yes there are! An AMD Radeon RX 6800 will set you
         | back $480.
         | 
         | What are your criteria for "good" here?
         | 
         | My computer would be, at today's prices, about $1400.
         | 
         | AMD Ryzen 9 5900X 12 core | 32GB DDR4-3600 | 2 x 1TB PCIe gen 4
         | | Radeon RX 6700 XT 12GB | (Corsair case, PSU and AIO, MSI
         | X570)
        
         | shmerl wrote:
         | _> Add to all of that, this latest batch of CPUs are just
         | stupid power hungry - like 240w+ under load_
         | 
         | Eco mode is a thing for AMD CPUs. There's really no reason not
         | to use it by default: the benefits of running unrestricted are
         | marginal while the power cost is disproportionately huge, and
         | AMD does it mostly for marketing reasons, to gain some
         | single-digit percentages in benchmarks.
         | 
         | So just enable it (the 105W one) and enjoy far more reasonable
         | and efficient power usage with basically the same performance.
         | 
         | See some details here:
         | https://www.youtube.com/watch?v=W6aKQ-eBFk0
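         | 
         | For reference, a rough sketch of what the 105W eco preset
         | means in actual socket power, using AMD's published
         | PPT = 1.35 x TDP relationship (treat the exact numbers as
         | approximate):
         | 
         |   def ppt_watts(tdp_w: float) -> float:
         |       # AM5 package power tracking limit is spec'd at ~1.35x TDP
         |       return tdp_w * 1.35
         | 
         |   for tdp in (170, 105, 65):
         |       print(f"TDP {tdp:3d}W -> PPT ~{ppt_watts(tdp):.0f}W")
         |   # 170W -> ~230W (stock 7950X), 105W -> ~142W (eco), 65W -> ~88W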
        
         | AussieWog93 wrote:
         | I wonder how much of this increase is due to social media.
         | 
         | 10-15 years ago, people would buy graphics cards to play the
         | latest games on. Epeen was a thing, but limited to just some
         | text in your forum signature.
         | 
         | Now, it seems like half the reason people buy any sort of
         | "status" item is for the clout on Insta. It was exactly the
         | same thing with the PS5 two years ago - people clamouring for
         | a useless object just to show other people they have it.
        
           | Clubber wrote:
           | I believe bitcoin mining contributed a lot to the demand side
           | as well.
        
           | tedivm wrote:
           | 10-15 years ago people were posting their rigs, adding liquid
           | cooling, ridiculous lighting, massive overclocking. It was
           | definitely more than a forum signature.
        
         | geraldhh wrote:
         | > Want a 10Gb onboard NIC? Be prepared to shell out $1000 just
         | for the motherboard.
         | 
         | A quick search turns up options for around 500 euros.
         | 
         | But don't be fooled: 10GbE over copper is a power-hungry mess.
         | Go with lower N-BASE-T speeds if you just want some progress
         | after twenty years of stagnant consumer networking, or invest
         | in optical and get a real upgrade (20/40/100 Gbps).
        
           | flangola7 wrote:
           | What is the use case of 10Gb Ethernet for the regular PC
           | user? Or even the enthusiast?
           | 
           | I have a sprawling homelab and home theater and scarcely need
           | regular 1G. Last summer I transferred a media archive of
           | ~10TB to new hardware, which completed overnight across Cat5.
           | Is there some hot new hobby that generates multi-gigabit
           | bursts that I don't know about?
        
             | [deleted]
        
             | lazide wrote:
             | Uh, at maximum speed on a gigabit network, it takes 22
             | hours, 13 minutes to transfer 10 TB.
             | 
             | That's not overnight unless you're a hell of a sleeper.
             | 
             | On a 10Gig network, assuming no other bottlenecks (harder
             | to do), it's 2 and a quarter hours.
             | 
             | Personally, I use 10 gig because 100MB/s is really slow.
             | Even 2.5Gb is much, much better and makes it a lot more
             | likely something besides the network is the bottleneck.
             | 
             | I personally move around enough data that slow transfers
             | bother me enough to make it worth it.
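             | 
             | The math, for anyone checking (assumes ideal line rate, no
             | protocol overhead, and disks that keep up):
             | 
             |   def transfer_hours(tb: float, gbps: float) -> float:
             |       # decimal terabytes -> bits, divided by line rate
             |       return tb * 1e12 * 8 / (gbps * 1e9) / 3600
             | 
             |   for speed in (1, 2.5, 10):
             |       print(f"10 TB @ {speed}Gbps: "
             |             f"{transfer_hours(10, speed):.1f} h")
             |   # 1 Gbps: 22.2 h, 2.5 Gbps: 8.9 h, 10 Gbps: 2.2 h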
        
             | sokoloff wrote:
             | Editing video files. My kids are both into making videos
             | and even though most of them are crap, I still want to get
             | them into good habits of "store important files on the
             | storage server" and we don't have any redundancy on local
             | file storage. The difference between 1 and 10Gbps ethernet
             | is noticeable there. (They only have 2.5Gbps to the
             | desktop switch, which has a 10Gbps link to downstairs, so
             | they only get ~250MB/s transfers, but they each get that
             | simultaneously if they want.)
        
             | progman32 wrote:
             | My only 10G link is between my main workstation and my
             | NAS. Between the SSD in my workstation and the two-SSD
             | array in the NAS, I easily saturate 1G. It makes a big
             | difference for video work and certain dev tasks (e.g.
             | generating disk images). Granted, I spent less than $100
             | for this capability using older Solarflare cards and
             | optics.
        
           | krisroadruck wrote:
           | AM5 with PCIe 5 and DDR5 support and 10GbE? I've already
           | eaten the cost, but I would love a link to the board you
           | found. I saw a lot in that price range with 2.5 or 5GbE but
           | didn't run into any with 10GbE. That said, I may have missed
           | one. If nothing else it may be useful for when the wife's
           | gaming rig gets an upgrade in a few months.
           | 
           | As to why I'm sticking with 10GbE - I have a 24-port 10GbE
           | switch for the house, so I'm going with the kit that matches
           | the network I already have.
        
           | cptnapalm wrote:
           | I got an older Supermicro off eBay with 6 10Gb ports for $89.
        
       | nickpeterson wrote:
       | Is AMD completely out of the ARM chip game? I know they had
       | interest a few years back but seemed to abandon it. I'd really
       | like the option to buy an ARM CPU and motherboard from someone
       | who will support it. Basically something in between an RPi and a
       | MacBook: $400-500 with upgradable RAM, storage, and GPU.
        
         | opencl wrote:
         | At least for now AMD doesn't have anything ARM. There are ARM
         | systems available but nothing anywhere close to that price
         | range.
         | 
         | The Honeycomb LX2 is probably the closest thing to that
         | currently, but it launched for $750 and has since gone up to
         | around $920. Performance is not remotely competitive with x86
         | systems at that price point.
         | 
         | There are some systems based on the Ampere Altra chips, but
         | nobody sells the motherboards/CPUs on their own and a full
         | system will run you at least ~$6000.
        
           | tambre wrote:
           | Adlink sells a COM-HPC motherboard [0] and a corresponding
           | Ampere Altra module [1].
           | 
           | [0] https://www.adlinktech.com/products/Computer_on_Modules/COM-...
           | [1] https://www.adlinktech.com/Products/Computer_on_Modules/COM-...
        
         | dehrmann wrote:
         | Googling "ARM ATX motherboard" has some results, but these are
         | so uncommon, everything's going to be an uphill battle.
        
       | varelse wrote:
       | [dead]
        
       | andrewstuart wrote:
       | I wonder if Nvidia and AMD are partly to blame.
       | 
       | They are artificially keeping GPU prices high, so people don't
       | want to buy GPUs.
       | 
       | And if they don't want to buy a GPU then they don't want the
       | thing that the GPU goes in.
        
       | than3 wrote:
       | The prices are simply too high for the marginal benefit they
       | offer.
       | 
       | Marginal costs outweigh the benefits, so why would people buy?
       | This is simple economics and they know it, but they still price
       | fix because they must meet their minimum profits, whatever those
       | may be.
       | 
       | It's a common problem with monopolies: as soon as the
       | marketplace shrinks to only a few players, where the means of
       | production has been concentrated, those players start dictating
       | prices and may collude without even needing a conspiratorial
       | agreement.
       | 
       | Many people also ignore the fact that Intel ME/AMT and the AMD
       | equivalent (features that cannot be disabled, are not
       | documented, and are prime attack targets) are becoming more well
       | known, and in general people don't want them.
       | 
       | Businesses may find value in those features, but individuals
       | find cost (i.e. their privacy, and greater future risks that are
       | unquantifiable).
       | 
       | They've broken their own market, and the rot will only get worse
       | for them since it's unlikely they will right the course. Many IT
       | people wonder if there isn't some secret government mandate
       | requiring these companies to embed management co-processors. It
       | clearly offers only minimal value to IT, and it's seen as a cost
       | by individuals who know about it.
       | 
       | They really need to reconsider their actual market instead of
       | the fairy-magic-kingdom type thinking they have been following.
        
         | xen2xen1 wrote:
         | You're putting the fact that modern computers have AMT up
         | there with Covid, supply shortages, and crypto crashes in
         | terms of sales loss??? You really, really need to get out of
         | whatever bubble you're in.
        
       | kibwen wrote:
       | _> McCarron shines a glimmer of light in the wake of this gloom,
       | reminding us that overall processor revenue was still higher in
       | 2022 than any year before the 2020s began._
       | 
       | This suggests a correction precipitated by panic-buying during
       | the supply-chain chaos of the pandemic era. It's too soon for
       | doom and gloom for the PC market just yet. Mobile devices have
       | been dominating PCs since long before 2020, and if revenues were
       | still growing over the past decade, there's nothing to suggest
       | that this moment is suddenly the inflection point where the
       | whole thing comes tumbling down, even if you do believe
       | something like that is inevitable.
        
         | Finnucane wrote:
         | Yeah, everybody upgraded their WFH office setups in the prior
         | two years; now no one needs a new PC. We're going to be good
         | for a while.
        
           | ricardobayes wrote:
           | Games also don't really require that good a PC these days;
           | you can pretty much still play everything on a decent 3-4
           | year old machine.
        
             | D13Fd wrote:
             | Everybody targets console specs (and sometimes last gen
             | console specs...), so if your PC exceeds that bar, it's
             | often wasted.
        
             | lostmsu wrote:
             | I am hoping that more deep-learning-based stuff gets
             | integrated, and that pushes hardware further.
        
           | giuliomagnifico wrote:
           | > excluding ARM
           | 
           | These figures exclude ARM CPUs; another possibility is that
           | lots of people are switching to or buying ARM devices.
        
             | coffeebeqn wrote:
             | Aren't Apple's high-end computers the only ones? Maybe
             | some Chromebooks.
        
               | giuliomagnifico wrote:
               | Yes, there aren't many, but they're growing very fast:
               | https://www.techspot.com/news/97571-arm-cpus-forecast-captur...
        
           | nordsieck wrote:
           | > Yeah, everybody upgraded their WFH office setups in the
           | prior two years, now no one needs a new pc. We're going to be
           | good for a while.
           | 
           | Also, it feels like phones have entered that "Core 2 Duo" PC
           | stage where upgrades don't really matter as much any more. I
           | know software support can still be an issue, but at least on
           | the iPhone side, I don't feel like I need to upgrade before
           | my phone loses OS support.
        
             | downrightmike wrote:
             | I'm waiting on USB-C
        
             | ThrowawayTestr wrote:
             | I upgraded from an S10 to an S22 and I can barely tell the
             | difference.
        
               | elcapitan wrote:
               | Makes it even more painful that you HAVE to upgrade
               | nowadays because you just won't get any updates anymore.
        
               | ThrowawayTestr wrote:
               | I had to upgrade because the battery couldn't hold a
               | charge and the two replacements I bought were worse than
               | the original.
        
               | coffeebeqn wrote:
               | Exactly, and it's the same with laptops. If you know
               | anything about decent specs, don't buy something with a
               | ridiculous bottleneck, and do a fresh install, you can
               | find a 5+ year old laptop that is amazing for anything
               | other than extreme workloads. I mean things like
               | rendering scenes - not VSCode and Slack or something.
        
               | ghaff wrote:
               | I use an Apple Silicon MacBook Pro for multimedia and
               | some other things. But most of what I do runs just fine
               | on a 7+ year old MacBook and iMac.
               | 
               | I suspect there's also something of a generational
               | thing, with many younger students not even using PCs.
        
               | robryan wrote:
               | Apple is going to have a hard time getting people with
               | M1 Macs to upgrade any time soon.
        
               | tyre wrote:
               | I had a 2015 MBP that was falling apart and waited for
               | the M2 refresh. I expect that I won't get another laptop
               | until 2030.
               | 
               | I have an iPhone 11 (fall 2019) and I see no reason to
               | upgrade.
               | 
               | It does feel like we've hit a plateau. The only step up I
               | can see would be on-device ML/"AI" models. Removing the
               | latency and improving offline capabilities of something
               | like Siri would open some doors.
        
               | ghaff wrote:
               | Yeah. On-device ML. Cameras are also still improving YoY
               | as more people abandon standalones. But while I'm usually
               | on a 3 year cycle, I'd have to be convinced with this
               | year's model.
               | 
               | And I may slide a Mac Mini/Studio in place of my iMac at
               | some point; I'm not really in a hurry in spite of being
               | out of OS update support. It's basically a browser
               | machine given I have a newish laptop.
        
             | bushbaba wrote:
             | Hence apples shift into the services segment. Pushing Apple
             | Music, Apple TV, iCloud, etc.
        
               | dfghjjbhvg wrote:
               | And payments.
               | 
               | Visa and Mastercard will not last another decade if
               | hardware sales don't pick up soon.
        
             | EamonnMR wrote:
             | Great way to put it. My current phone feels as useful now
             | as the day I got it, whereas previous smartphones started
             | to feel sluggish as apps and sites got heavier, and were
             | outclassed by newer cameras.
        
             | mosquitobiten wrote:
             | They also entered the extreme bloatware phase; see the new
             | Samsung phones with 60GB occupied out of the gate.
        
         | newsclues wrote:
         | So you had a few factors.
         | 
         | COVID/WFH panic purchasing for work and school at home. COVID
         | cash also put money into people's hands to buy stuff, like
         | computers. Crypto mining and the GPU shortage were also a
         | factor, as people were buying systems and parts for
         | speculation, and prebuilt computers just to mine on or to
         | strip for the GPU. Scalpers made everything worse, messing
         | with parts in the supply chain.
         | 
         | So you had supply and demand factors, extra money in
         | consumers' pockets, and crypto speculation: a perfect storm.
        
         | q1w2 wrote:
         | This is why year-over-year numbers are not good to look at.
        
         | kurthr wrote:
         | I agree with the sentiment, but mobile devices have also seen
         | a plunge in sales. Many in the industry expected 2B
         | units/year, but the market maxed out around 1.6B. The last few
         | years have seen volatility between 1.2B and 1.4B. Last year
         | was the worst since 2016, and the next worst was 2020.
         | 
         | Only Apple has been relatively flat, and probably only they and
         | Samsung are very profitable.
         | 
         | Most of the profitability in "PC" silicon (GPU/CPU) has really
         | been in datacenter for five years. Again, Intel executed
         | badly, but the decision to focus on datacenter was right.
        
           | throwaway5959 wrote:
           | Now all the cloud vendors are making their own chips. This
           | had to have been obvious given how a small company in the UK
           | could build the Raspberry Pi.
        
             | gruez wrote:
             | >This had to have been obvious given how a small company in
             | the UK could build the Raspberry Pi.
             | 
             | According to wikipedia they use chips from Broadcom. I'm
             | not sure how a small company being able to make SBCs using
             | chips from a massive multinational is indicative of how
             | easy it is for cloud vendors to "make their own chips".
        
       | nathants wrote:
       | i recently built my first pc and moved my daily driver from being
       | a thinkpad to a custom desktop.
       | 
       | i was already sitting at a desk, so ergonomically it's identical.
       | 
       | now i can compile blender in 20 seconds and fly around with
       | eevee. i can compile linux with custom lsm modules.
       | 
       | dual ssd makes it easy to dual boot. reboot into windows and i
       | can have a magical evening.
       | 
       | 7950x, 4090, 990 pro. it would be great if these were cheaper,
       | then more people could afford to use them. it's also ok that
       | they are overpriced. c'est la vie.
       | 
       | to anyone spending a majority of their life on a computer and
       | making money, the cost of your primary computer doesn't matter
       | unless it's ludicrous.
       | 
       | the opportunity cost is far higher. what might you have learned
       | had you tinkered with blender or a kernel when you were bored?
        
       ___________________________________________________________________
       (page generated 2023-02-12 23:01 UTC)