[HN Gopher] Intel Core i7-12700K Review
___________________________________________________________________
Intel Core i7-12700K Review
Author : ItsTotallyOn
Score : 69 points
Date : 2021-11-20 14:46 UTC (1 day ago)
(HTM) web link (www.tomshardware.com)
(TXT) w3m dump (www.tomshardware.com)
| jbverschoor wrote:
| At the current energy prices, Intel needs to up their game.
| HelixEndeavor wrote:
| Feels to me like we're starting to hit the brick wall with
| x86-64. The only way we can squeeze more performance out of
| these chips is to make them physically larger, squeeze the
| components into tighter spaces, and suck more and more power,
| generating more and more heat.
|
| As much as I dislike Apple's overall business practices, they are
| 100% correct about ARM being the future of high performance, high
| efficiency computing.
|
| And we need high efficiency in a world that is becoming starved
| of energy as more and more people get connected every day.
| monocasa wrote:
| It's Intel's stagnant process nodes you're seeing here more
| than x86_64.
| gsnedders wrote:
| At least to me, it always seems like the differences in gaming
| performance between mid-to-high end CPUs are relatively small,
| and frequently dominated by the differences between GPUs. Given a
| fixed budget, does it actually make sense from a gaming point-of-
| view to put the money into a high-end CPU any more?
|
| (Aside from gaming, there's definitely plenty of wins from going
| to higher end CPUs, but I'm curious about the gaming case
| specifically.)
| TheGuyWhoCodes wrote:
| It depends on how CPU-heavy the game is; single-core
| performance especially makes a big difference in some games.
| High-end CPUs tend to have higher single-core performance.
| belval wrote:
| You are pretty much right on that point: neither a 5600X from
| AMD nor a 12600K from Intel will bottleneck a 3080 Ti or 3090
| when gaming.
| JaimeThompson wrote:
| It depends on what sort of games you will be playing. For some
| simulation and related games such as Factorio and the like the
| CPU can make a large difference.
| Macha wrote:
| At the high end, factorio is one of those games (the other
| being flight sim) that also gains a lot from faster memory
| iirc.
|
| Though at the low end, factorio is also a really well
| optimised game and can run a 100spm factory at 60ups pretty
| comfortably on a mid range cpu. It's only the large
| multiplayer games or megabases where it starts to hit
| performance limits.
|
| Cities: Skylines is probably the best example of a game where
| the average player could get big performance gains in regular
| play from a CPU upgrade.
| adgjlsfhk1 wrote:
| I think you're underestimating factorio significantly.
| There are 20kspm bases that are run at 60 ups. I doubt most
| people will even feel the game performance start to degrade
| before 1kspm.
| Macha wrote:
| > There are 20kspm bases that are run at 60 ups.
|
| There are, but megabase (>1k spm) territory is where
| players start making design decisions around ups impact
| like avoiding large logistics zones or avoiding heat
| pipes.
| LordKeren wrote:
| Your assumption lines up with the general PC gaming advice.
|
| There are a few games that do strongly benefit from a better
| CPU, but those are the exception, and even they show
| significant diminishing returns.
| Drew_ wrote:
| Not really true in my experience for multiplayer games.
| Pretty much every multiplayer game is CPU and memory bound
| and any GPU limitation can be mitigated by just lowering
| settings. Nothing can be done to compensate for weaker CPU
| performance unfortunately. If I had to skimp on GPU or CPU I
| would pick GPU every time.
| hmottestad wrote:
| I had the understanding that once you move from 1080p to 4K
| then the GPU becomes the bottleneck for most AAA games. Pro
| gamers will probably stick to 1080p and a 144hz monitor.
| redisman wrote:
| Esports games are heavily optimized to run at high frame
| rates. You don't need a 3080 to get infinite FPS in CSGO.
| bufferoverflow wrote:
| Most pro FPS gamers have switched to 240Hz. I see a few on
| 165Hz, some on 300+.
| op00to wrote:
| Very nice, but you can't buy ddr5 right now so ... enjoy that
| fancy processor?
|
| Edit: Holy cow, you can use ddr4!
| Shosty123 wrote:
| You can get Z690 DDR4 boards. That's what I had to resort to so
| we could test our software on the i5-12600k. Getting a CPU
| cooler with an LGA 1700 compatible bracket was actually the
| hardest part.
| cyber_kinetist wrote:
| One downside of this current Alder Lake release might be the
| fact that the Z690 motherboards are much more expensive than
| AMD's current mainstream motherboards. The CPU might be a tad
| cheaper, but overall it might not be worth it (well, until
| more affordable motherboards appear).
| InvaderFizz wrote:
| Alder Lake supports DDR4 and the performance difference is
| minuscule.
|
| The downside of LGA 1700 is that budget boards don't exist
| right now. You can pick up a B550 board any day of the week for
| under $100, sales as low as $60. Z690 is all $200+.
|
| The Z690 is a nicer platform for sure, especially for
| peripheral connectivity.
| 5e92cb50239222b wrote:
| > the performance difference is minuscule
|
| I thought DDR5 brought a 2x increase in throughput? It didn't
| help that much, it seems?
| Dylan16807 wrote:
| Memory throughput doesn't have a huge impact on performance
| in most desktop scenarios, and right now DDR4 is easily
| 3600MHz while DDR5 is 4800/5200.
|
| 4800MHz DDR4 even seems to be cheaper than 4800MHz DDR5.
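|
| A quick ratio check on those figures (data rates only;
| latency and timings ignored):
|
|     # DDR5 kits at 4800-5200 MT/s vs readily available DDR4-3600.
|     ddr4 = 3600
|     for ddr5 in (4800, 5200):
|         print(f"DDR5-{ddr5}: {ddr5 / ddr4:.2f}x DDR4-3600")
|     # Prints 1.33x and 1.44x - well short of the headline 2x.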
| redisman wrote:
| Eventually yes but the kits available right now are not
| very impressive. Early adopter tax definitely. You'll
| probably get at least 2x faster memory kits eventually, I'm
| not sure how high DDR5 can scale
| kllrnohj wrote:
| It's less than 2x when you factor in mature DDR4 that greatly
| exceeds JEDEC specs vs. immature DDR5 that's sort of at the
| bare minimum.
|
| Eventually DDR5 will outpace DDR4, but right now it's the
| awkward transition time. Not unlike early DDR4 vs. DDR3.
| InvaderFizz wrote:
| The timings on DDR5 are very loose right now. Don't expect
| much real gain for another year or two as it matures.
|
| If you have a specific task that is extremely memory
| bandwidth constrained, it's great. But then again, you are
| probably looking at something with 8 channels if it is that
| big of an issue to you.
| jquery wrote:
| I'm using 8-channel DDR4-3200. Would I get any benefit
| from 4-channel DDR5-5xxx? Is 8-channel DDR4-3200 roughly
| equivalent to 4-channel DDR4-6400, or am I missing
| something?
| InvaderFizz wrote:
| You're not missing much. Alder Lake (2ch DDR5-4800)
| achieved pretty impressive memory bandwidth, but it is
| still eclipsed by 4ch DDR4-3200 on Threadripper. If you
| could get 2ch DDR5-6400, it would be the same bandwidth
| as the TR4.
|
| What I don't understand well enough to know the impacts,
| is how latency plays into specific workloads.
|
| 2ch Alder Lake is about 70GB/s[0]
|
| 4ch Threadripper is about 85GB/s
|
| 8ch/64c Threadripper Pro/Epyc is about 140GB/s[1]
|
| 0: https://hothardware.com/reviews/intel-12th-gen-core-
| alder-la...
|
| 1: https://www.anandtech.com/show/16805/amd-threadripper-
| pro-re...
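|
| A rough sanity check on those numbers (theoretical peak =
| channels x transfer rate x 8 bytes per transfer; real systems
| land below this, and the config labels come from the figures
| above):
|
|     # Theoretical peak bandwidth in GB/s: channels * MT/s * 8 bytes.
|     def peak_gbs(channels, mts):
|         return channels * mts * 8 / 1000
|
|     for name, ch, mts in [
|         ("2ch DDR5-4800 (Alder Lake)", 2, 4800),
|         ("4ch DDR4-3200 (Threadripper)", 4, 3200),
|         ("8ch DDR4-3200 (TR Pro/Epyc)", 8, 3200),
|     ]:
|         print(f"{name}: {peak_gbs(ch, mts):.1f} GB/s peak")
|     # Prints 76.8, 102.4, and 204.8 - same ordering as the
|     # measured ~70/85/140 GB/s, with efficiency eating the rest.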
| op00to wrote:
| Woah! I can use my existing DDR4 memory? I didn't even
| research it after seeing DDR5 was unobtainium! Thank you!
| miohtama wrote:
| Does K stand for Kelvins? :)
| VortexDream wrote:
| Wow, this seems like an impressive step forward for the CPU
| industry and the kind of change we've been hoping for since Ryzen
| started giving Intel a run for its money. I can only hope that
| AMD continues to stay competitive and keep Intel on its toes.
| xwdv wrote:
| They are still far behind AMD.
| jjcon wrote:
| Not according to these and other benchmarks - what makes you
| think that?
| xwdv wrote:
| These are lagging benchmarks; you don't think AMD has been
| working on the next big thing?
| ed25519FUUU wrote:
| These benchmarks suggest they are _much_ slower at bringing
| the chips to market than AMD, but they're able to match or
| beat performance dollar-for-dollar.
| devonkim wrote:
| Pricing as a measure of performance per dollar is a bit
| complicated here: with the current Ryzen CPUs, AMD raised
| prices relative to the previous generation, while Intel,
| given its current position, is lowering prices to remain
| competitive. So AMD is cutting prices accordingly, and it's
| not clear what the margins look like (the 5800X in particular
| is an awkward chip to produce and may now be forced to sell
| at a loss).
|
| There are many factors to consider in TCO, like performance
| per watt, combined with the motherboard ecosystem, which has
| been a historical AMD weak point. So it's tough to compare
| CPU pricing on an apples-to-apples basis even if the CPUs
| were otherwise exactly the same in performance and price.
| tehbeard wrote:
| > motherboard ecosystem which has been a historical AMD
| weak point
|
| Is this referring to pre AM4 socket? Or the faff around
| Ryzen 5000 not supporting the earlier AM4 motherboards?
| devonkim wrote:
| AM4 _and_ prior. Motherboard quality has varied considerably
| across manufacturers compared to Intel's partners. Another
| recent annoyance is the launch situation with B450 and X570,
| and now with B550 and X570S existing years later. Gigabyte is
| one manufacturer that has been swapping out parts for lower-
| spec ones; they don't get away with that in Intel's partner
| network, but somehow AMD isn't penalizing Gigabyte, at least
| publicly, to keep this from happening.
| R0b0t1 wrote:
| For a long time AMD motherboards just had fewer features.
| Unsure if he means that specifically, but it was a pain
| point when I was building an AM4 system.
| jacquesm wrote:
| Assuming the chips and motherboards are actually available
| (at normal prices), and that the production systems will
| perform in the same way as the benchmark systems.
| Sponge5 wrote:
| what are normal prices anymore?
| vondur wrote:
| It looks like to get the best performance out of these newer
| CPUs, you need to run Windows 11. (Linux will have support
| for them in the near future, I believe.)
| tehbeard wrote:
| 125 - 190W thermals.... That seems like a lot, almost into server
| territory.
|
| Sacrificing ~5% "game performance" (such a nebulous number given
| how varied the CPU/GPU load can be between games) for being able
| to sit comfortably in the same room seems like a no brainer.
|
| This still feels like a halo product? Except does the i5 or such
| compete well enough with Ryzen for that to work?
|
| I'm curious to see how AMD responds; I wonder if their
| chiplet method lends itself well to launching their own
| hybrid E/P-core architecture alongside "3D V-Cache".
| jpalomaki wrote:
| Maybe we should start looking at placing the computer away from
| the gamer.
|
| There seem to be, for example, optical TB3 cables; maybe
| something like this could be part of the solution.
|
| https://www.macrumors.com/2020/03/26/optical-thunderbolt-3-c...
| ramshorst wrote:
| Murdered by words.
| cyber_kinetist wrote:
| I don't think most consumers really care about power
| consumption at full load. From looking at some benchmarks,
| Intel seems to have okay thermals for gaming or light
| productivity tasks (thanks to the P/E-core split). If you're
| doing any sort of rendering or number-crunching you're in
| trouble (unless you have a liquid cooler, which to be honest
| really sucks), but most people aren't doing that kind of
| work.
| asdfasgasdgasdg wrote:
| Plenty of air coolers can dissipate 180W. For example, almost
| all graphics cards come with air coolers that are capable of
| dissipating their full TDP, which is often much higher than
| 180W. Something like this:
|
| Noctua NH-D15 chromax.Black, Dual-Tower CPU Cooler (140mm,
| Black) https://www.amazon.com/dp/B07Y87YHRH/ref=cm_sw_r_apan_
| glt_fa...
|
| Would be able to dissipate 180W probably without even
| throttling up the fans that much.
| ben-schaaf wrote:
| An NH-U14S is able to cool an (overclocked) 3990X at a
| reported 450W.
|
| https://youtu.be/3GqqQQxdtUM
| cyber_kinetist wrote:
| Wow, guess I've underestimated the glorious power of
| Noctua.
| asdfasgasdgasdg wrote:
| Not just Noctua! Heat pipes are an amazing technology and
| there are a bunch of companies that have made really
| exciting thermal management products with them.
| vladvasiliu wrote:
| > Would be able to dissipate 180W probably without even
| throttling up the fans that much.
|
| That's very likely. I have an NH-D14 with only the center
| 140 mm fan installed on an i7-3930k overclocked to 4.3 GHz,
| and it barely ramps up the fan. The loudest fan in my
| computer is the PSU (an old 600 or 650 W Seasonic). It runs
| in a "silence-oriented" Define R3 case with closed door.
|
| Intel announces 135 W TDP for that CPU. Under load, OCCT
| says 160 W.
| tehbeard wrote:
| > Intel seems to have okay thermals for gaming or light
| productivity tasks
|
| That's not what you buy an i7 for though, is it?
|
| An i7 is typically for "I have work to do when not gaming" or
| content creators starting out, whereas an i5 is more the
| category for gaming and light productivity / "the family PC".
|
| Getting an i7 exclusively for gaming seems more for bragging
| rights on "fastest CPU".
| asdfasgasdgasdg wrote:
| Depends on the type of game. There was a period of time in
| the 2010s where CPUs were rarely the bottleneck in gaming.
| But with the advent of better graphics techniques and more
| powerful graphics cards, along with more ambitious scenes
| and simulations, there are many cases today where a strong
| CPU is required for gaming.
|
| A recent example is Battlefield 2042. When I first got the
| game I had an AMD 3700X, which is no slouch of a CPU. But
| it could only drive the game at about 70fps, no matter the
| resolution. After an upgrade to a 5900X, I can run the game
| at 110-120fps. The strongest i5s would likely struggle to
| hit a stable 60fps on this game.
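|
| The "no matter the resolution" part is the classic CPU-bound
| signature. A toy frame-time model (the millisecond costs are
| made up for illustration, not measured):
|
|     # Each frame costs max(cpu_ms, gpu_ms); only the GPU cost
|     # scales with resolution.
|     def fps(cpu_ms, gpu_ms_1080p, pixel_scale):
|         return 1000 / max(cpu_ms, gpu_ms_1080p * pixel_scale)
|
|     for res, scale in [("1080p", 1.0), ("1440p", 1.78), ("4K", 4.0)]:
|         print(res, round(fps(cpu_ms=14.0, gpu_ms_1080p=4.0,
|                              pixel_scale=scale)))
|     # Prints ~71, ~71, 62: with a 14 ms CPU cost the frame rate
|     # barely moves until the GPU finally becomes the limit at 4K.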
| gambiting wrote:
| So what you're saying really is that the 3700X had
| absolutely no issues whatsoever running that game. Yes
| it's nicer to play in 120fps, but it's not like you
| couldn't play the game because of the CPU. I know someone
| who had to upgrade from an older 4-core i7 to a modern
| CPU because Horizon 5 was actually stuttering. But the
| CPU "limiting" you when you're comfortably above 60fps,
| and you aren't playing competitive eSports, is not really
| a limit, just like how my car accelerating really poorly
| past 150mph is not really a limit in any real sense of
| the word.
| asdfasgasdgasdg wrote:
| > So what you're saying really is that the 3700X had
| absolutely no issues whatsoever running that game.
|
| Perhaps to specifications that would satisfy someone else. My
| requirements are ~120fps and 4K. For those requirements, no
| existing i5 would pass muster.
|
| > But the CPU "limiting" you when you're comfortably
| above 60fps
|
| I never used the word "limiting" in my comment. Not sure
| what you're quoting from. That being said, the 3700X was
| objectively limiting my framerates. It's not a value
| judgment, it's just an objective fact that such CPUs are
| inadequate to satisfy my preferences, and those of many
| other PC gamers. If you have different preferences,
| that's fine, but it's not really relevant when I'm
| talking about my own.
| gambiting wrote:
| Of course, and I myself play at 144Hz and would probably
| do the same. I think I just took issue with the statement
| that most gamers are CPU limited nowadays - and while in a
| strictly technical sense that's true, I don't think that a
| few-years-old CPU being able to run games at
| solid 60fps+ is a problem in any sense. Again, it's not
| like it physically can't run the game, it just doesn't
| run it "well enough" for some people.
| asdfasgasdgasdg wrote:
| > I just took an issue with the statement that most
| gamers are CPU limited nowadays
|
| I didn't say most gamers are CPU limited, though. I said
| "Depends on the type of game . . . there are many cases
| today where a strong CPU is required for gaming." I
| didn't say most cases.
|
| Conversely, it would be fine to observe that many console
| gamers have historically been satisfied with 30fps, and
| therefore PC gamers ought to only "require" a strong i5 from
| two generations back. While you're at it, you could also say
| that console gamers game at 1080p, so PC players should
| be satisfied with that as well. And you'd be right, under
| a certain configuration of preferences, and a certain
| interpretation of the word "require."
| satvikpendem wrote:
| I mean, that's your personal opinion on >60 FPS gaming,
| it's not universally shared. I for one enjoy even single
| player games at the highest FPS I can get, which is why I
| have a 1080p 240 hz monitor. If the parent likes their
| games higher than 60 FPS then to them their 3700x really
| was a limiter in their enjoyment, something which the
| 5900x would not be for them.
| alasdair_ wrote:
| All the games I play are CPU bound. I have a 3090 and I still
| can't play World of Warcraft (a 16-year-old game) with the
| settings on max without serious FPS drops in major cities,
| because most of the game is bound far more by single-threaded
| CPU performance than by the GPU.
| Negitivefrags wrote:
| There are rumours going around that the GeForce 4090 is going
| to have a max TDP of 650W.
|
| It seems insane but if true then consider that the combination
| of CPU and GPU is going to be 840W, and that is before the rest
| of the system is taken into account.
|
| At some point we are going to need to start worrying about
| installing dedicated venting for computers to the outside of
| the house.
|
| At some point you need to start integrating desktop computers
| into your HVAC designs.
|
| At some point you need to start having dedicated electric
| circuits installed like for your oven.
|
| My current computer (5950X with 3080) was already hot enough
| when gaming that I found it physically uncomfortable to have it
| under my desk and had to move it out from under there. My legs
| were burning up.
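|
| A quick back-of-the-envelope on the circuit point, using the
| rumoured 650W GPU and the 190W CPU figure from this review
| (the 150W for the rest of the system and the 90% PSU
| efficiency are my own guesses):
|
|     # Back-of-the-envelope system power at the wall.
|     gpu_w, cpu_w, rest_w = 650, 190, 150   # rest_w is a guess
|     load_w = gpu_w + cpu_w + rest_w
|     wall_w = load_w / 0.9                  # assume ~90% efficient PSU
|     print(f"{wall_w:.0f} W at the wall")   # ~1100 W
|     print(f"{wall_w * 3.412:.0f} BTU/hr")  # ~3750 BTU/hr, space-heater
|                                            # territory
|     # A US 120V/15A circuit is 1800W (1440W continuous), so one
|     # such PC plus a monitor eats most of a circuit.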
| oblak wrote:
| Sadly, not rumours. Next-gen video cards from AMD/Nvidia are
| going to cost a lot of money and eat many, many watts. I
| mean, we can live with that waste heat during the winter, but
| I'm not looking forward to having a 1 kW-spewing PC next to
| me in my gaming room.
| Chris_Newton wrote:
| This seems crazy to me. I run an entire home office,
| sometimes with a reasonably powerful main workstation
| (enough for things like 3D graphics work), a second PC or
| laptop, and various server and networking equipment all
| running at once, and unless we're also using something
| power-intensive like a laser printer the entire room
| doesn't draw 1kW!
| baybal2 wrote:
| > There are rumours going around that the GeForce 4090 is
| going to have a max TDP of 650W.
|
| You will need a small DC welding machine for a power supply.
| gambiting wrote:
| I mean, 1000W power supplies are really not a big deal at
| all, but yeah, that's a lot of heat going into my room that
| I'd rather not have.
| ant6n wrote:
| Put the pc in a different room, run some cable for
| monitor, peripherals, drives. I wonder whether there's a
| way to extend the power button.
| righttoolforjob wrote:
| > I wonder whether there's a way to extend the power
| button.
|
| Any pair of wires will do...
| [deleted]
| tomcam wrote:
| > At some point you need to start integrating desktop
| computers into your HVAC designs.
|
| I think you have that reversed
| Jaygles wrote:
| Undervolting the 3080 can help a lot with thermals and
| sacrifices minimal performance up to a point. I undervolted
| my 3090 and it improved thermals by 5-10 °C depending on the
| workload, which lowered the fan noise and, as a bonus, the
| coil whine. I don't have numbers on the power draw, but it
| should be a significant difference.
| redisman wrote:
| I ran a beefy (3 power inputs) 3080 undervolted; you don't
| really lose more than a few percentage points of performance,
| and it ran great on my 650W PSU with a 3700X. I much prefer a
| little performance hit to loud fans.
| nicolaslem wrote:
| Undervolting can even improve performance: my stock 5700 XT
| runs at 1.2 V and reaches a 110 °C hotspot. It then drops the
| frequency a few hundred MHz to stay below the temperature
| threshold.
|
| By running it at 1.1 V it reaches a 90 °C hotspot, below the
| 110 °C threshold, so it runs at maximum frequency constantly.
|
| Many GPUs (and CPUs) have their stock voltage quite high
| because it improves yield for the manufacturer. Unless a
| GPU is really bottom of the barrel regarding silicon
| lottery, it can usually be undervolted without issues. This
| is however less true for flagship models which are already
| pushing the silicon as far as it can.
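|
| The reason a 0.1 V drop helps so much: dynamic power scales
| roughly with V^2 (times frequency). A rough sketch that
| ignores static leakage and the other terms in the real power
| equation:
|
|     # Dynamic power ~ C * V^2 * f; hold frequency constant and
|     # drop the voltage from 1.2 V to 1.1 V.
|     v_stock, v_uv = 1.2, 1.1
|     savings = 1 - (v_uv / v_stock) ** 2
|     print(f"~{savings:.0%} less dynamic power")  # ~16%
|     # If the card was thermally throttling before, the undervolt
|     # can also raise the sustained clock, so performance goes up
|     # while power goes down.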
| jmnicolas wrote:
| At some point we need to realize that wasting so much
| electricity for gaming isn't reasonable.
| kitsunesoba wrote:
| It really is insane, and limits upgradability of even brand
| new machines.
|
| Case in point: I have a newly built Ryzen 5900X tower that
| reuses a GPU from an old build. I chose a high quality 80+
| Platinum 750W PSU thinking that would give more than enough
| headroom for upgrading the GPU at some point, but if the next
| generation of GPUs are as power hungry as rumored I'll be
| limited to current generation cards unless I buy a new PSU.
|
| I hope efficiency comes to be a focal point again soon,
| because if the trend continues you'll need a 1kW PSU just to
| comfortably accommodate current enthusiast parts and have any
| hope of upgrading in the future, which is ridiculous.
| jacquesm wrote:
| With what the GPU will cost you will not see the price of a
| better PSU as more than a speedbump. You can also sell your
| old one to recoup some of it.
|
| Even so, 1 kW+ personal computers are a bit strange; back in
| the day the very largest workstations (think IRIX fridges)
| were in that domain.
|
| What would be nicer would be if manufacturers aimed for
| minimal power consumption with the same performance as a
| previous generation to give people the option.
| mise_en_place wrote:
| FWIW I consider the thermals a feature, not a bug. Helps save
| on the heating bill, plus my house is already very well
| insulated. I have noticed that my Radeon 5500 XT has better
| thermals when passed thru to a Windows VM. It would seem that
| Linux kernel 5.15 has better thermals/fan control for amdgpu,
| I'm on a slightly older version.
| mastax wrote:
| I'm not into cryptocurrency, but I'm mining on my desktop
| right now because it's like running a space heater that
| pays for itself.
| spullara wrote:
| I currently have a couple machines in a room and it heats up
| so much I had to get a separate window AC in addition to
| central air.
| Macha wrote:
| Something like a QX9770 + SLI 680s could use similar power to
| a 12900K + 4090, which occupies similar "halo product" market
| positioning now vs. 10 years ago. The difference now is that
| more and more people who grew up gaming are in the employed-
| professional category, so more people are looking at what
| would have been the ultimate top-end setups back then.
| AnthonyMouse wrote:
| > There are rumours going around that the GeForce 4090 is
| going to have a max TDP of 650W.
|
| This is kind of a move of desperation. GPUs are massively
| parallel so it's straightforward to make performance scale
| linearly with power consumption. They could always have done
| this.
|
| Nvidia is used to not having strong competition. Now AMD and
| Intel are both gunning for them and the market share is
| theirs to lose. Things like this are an attempt to stay on
| top. But the competition could just do the same thing, so
| what good is it?
| AussieWog93 wrote:
| More interesting than this is what's caused the performance
| expectation of gamers to rise faster than Moore's Law.
|
| It took us almost 10 years to move the standard from 1024x768
| to 1920x1080 (~2.5x increase in pixels/sec), yet in the past
| 5 years we've gone from 1920x1080@60Hz to 4k@120Hz (~8x
| increase).
|
| I'm not sure if it's because graphics aren't naturally
| improving the same way they were in the 00s and early 2010s,
| or streaming culture showing things off with the highest
| resolutions and framerates possible, but it's absolutely a
| modern phenomenon and I'm not sure how long it can last.
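|
| The two ratios are just pixels-per-second arithmetic
| (assuming 60 Hz for both of the older standards):
|
|     # Pixels per second = width * height * refresh rate.
|     def pps(w, h, hz):
|         return w * h * hz
|
|     old = pps(1024, 768, 60)    # early-2000s standard
|     mid = pps(1920, 1080, 60)   # the 1080p60 era
|     new = pps(3840, 2160, 120)  # 4K @ 120 Hz
|     print(f"{mid / old:.1f}x")  # ~2.6x over ~10 years
|     print(f"{new / mid:.1f}x")  # 8.0x over ~5 years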
| kungito wrote:
| I'm not sure which setup and games you are thinking about,
| but if I'm not mistaken, we are still struggling with
| 1440p@60hz for Ubisoft games and other AAAs.
| jpalomaki wrote:
| Maybe this is driven by the increase in disposable income.
|
| Many avid gamers likely want the best possible rig (in
| terms of computing power) they can comfortably afford.
| Hardware vendors try to meet this demand.
|
| Powerful setups are useless without games that benefit from
| power. This creates a demand for super highres graphics,
| even though those might not add much to the actual gameplay
| experience.
| Macha wrote:
| I don't think 4k120 today is in the same market position as
| 1080p60 5 years ago. 5 years ago 1080p60 was the bar, if
| your device couldn't do 1080p60, it wasn't a gaming
| product, it was a business product. I don't think 4k120 is
| there, I think we're just at the bar where 1440p60 is
| considered at that level. In that time, the higher end
| target has moved from 1440p60 or 4k30 to 1440p120 or 4k60,
| which means the halo products now have to promise 4k120 to
| keep ahead of the merely high end.
|
| I think the current increase in interest in high refresh
| rate gaming came from the shift away from TN panels. Once
| IPS became reasonably available, companies with TN products
| couldn't push them as premium products from an image quality
| perspective, but high refresh rate was something they could
| still market with TN, and there was a window where IPS and VA
| could not match it. So they did; gamers got exposed to high
| refresh rates and then carried those expectations over to
| other products.
|
| Also tech like gsync/freesync meant that if your monitor
| had a higher refresh rate than your hardware could produce,
| you no longer had to deal with tearing.
|
| I think monitor tech is leading GPU tech at the moment,
| with 1440p240 and 4k120 monitors. There are certainly _some_
| games where those resolution/framerate combinations are
| achievable, mostly esports titles and older/smaller titles,
| but for the most part, no (also for me at least, past like
| 90hz you're into diminishing returns territory). But now
| people have these monitors capable of X resolution and Y
| refresh rate, people want to have their cake and eat it,
| wanting games that have modern effects for higher fidelity
| and yet also higher fps to take advantage of their
| monitors.
| dlevine wrote:
| I just upgraded my PSU from 500W -> 750W when I got my RTX
| 3070 (I'm running a 5600X).
|
| Seeing this makes me worried that maybe it wasn't quite
| enough.
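|
| Rough spec-sheet math says 750W should still be fine for that
| combo (the figures below are approximate nameplate values,
| and the transient rule of thumb is just that, a rule of
| thumb):
|
|     # Approximate draws: RTX 3070 ~220 W, 5600X ~76 W package
|     # power, plus a guess for board/RAM/drives/fans.
|     gpu_w, cpu_w, rest_w = 220, 76, 100
|     steady = gpu_w + cpu_w + rest_w   # ~400 W steady-state
|     transient = steady + gpu_w        # Ampere can spike roughly
|                                       # 2x GPU power for milliseconds
|     print(steady, transient)          # 396 616 - inside 750 W
|     # A rumoured 450-650 W next-gen card is what would actually
|     # break this budget.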
| monocasa wrote:
| It's really concerning for Intel. Their high margin parts are
| in the data center space, which is extremely sensitive to
| perf/watt. If they're getting perf by throwing extra watts at
| the problem, are the data center parts going to be competitive?
| If not, it doesn't bode well for Intel, as that gravy-train
| money is part of what lets them spend a stupid amount on R&D.
| They could starve themselves out of the investment money they
| need in an increasingly competitive market.
| ant6n wrote:
| Well, they already plan to move to the next node, "Intel 4",
| in H2/22, so hopefully performance/watt will improve then:
| https://www.anandtech.com/show/16823/intel-accelerated-
| offen...
| monocasa wrote:
| TSMC is set to have full production of N3 by then too, and
| they have actually been meeting their public statements
| about process timelines. Intel unfortunately has an
| increasingly uphill battle.
| boyadjian wrote:
| Just buy the Core i7-12700 non-K version: it is cheaper, has
| lower thermals, and is almost as fast in most workloads.
| piyh wrote:
| Non-K SKUs don't boost forever.
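|
| Roughly: non-K parts have a lower sustained power limit (PL1)
| and only hold the higher PL2 boost for a time window tau,
| while K parts like the 12700K ship with PL1 = PL2. A
| simplified sketch; the wattages are typical published values
| that boards often override, and real chips use a rolling
| average rather than a hard cutoff:
|
|     # Simplified Intel turbo model: draw PL2 for roughly tau
|     # seconds of sustained load, then fall back to PL1.
|     def power_at(t, pl1, pl2, tau):
|         return pl2 if t < tau else pl1
|
|     for t in (5, 30, 120):
|         k = power_at(t, pl1=190, pl2=190, tau=56)     # 12700K
|         non_k = power_at(t, pl1=65, pl2=180, tau=28)  # 12700
|         print(f"t={t}s: 12700K {k} W, 12700 {non_k} W")
|     # After tau the non-K part sustains 65 W, which is why long
|     # renders run slower on it while short bursts feel identical.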
| SmellTheGlove wrote:
| This was hard to read. I felt like the content was a little
| repetitive - maybe for SEO purposes? And full of over the top
| superlatives, mostly focused on the performance per dollar, and
| mostly ignoring the increased costs of motherboards (DDR5 aside)
| over the prior generation or the current generation of the
| competitor's product. [0] Not to mention focusing on an OS that
| basically no one uses right now.
|
| I'm sorry but it reads more like an ad. And just because it often
| needs to be said, I have no brand loyalty here, and mostly think
| brand loyalty in this space is kind of dumb.
|
| [0]: An example:
|
| ```Given its more amenable $409 price tag, it is quite shocking
| to see the Core i7-12700K deliver such a stunning blow to the
| $549 Ryzen 9 5900X in threaded work, highlighting the advantages
| of the x86 hybrid architecture.```
|
| I'd love to save $140 and get better performance, but where am I
| getting a motherboard that doesn't eat up most of that savings?
| Maybe it exists, but it's not mentioned. That's fine, it's a CPU
| review, but then leave the dollar figures out of it if they
| aren't complete.
| systemvoltage wrote:
| TomsHardware has been doing this ever since I remember getting
| on the internet.
| jeffbee wrote:
| The writing really is offensive, but at least it's all on a
| single page instead of spread across 75 pages like Anandtech.
| zitterbewegung wrote:
| Those are for ad impressions, and they do a much better job
| of testing and staying neutral.
___________________________________________________________________
(page generated 2021-11-21 23:02 UTC)