[HN Gopher] Intel plans spinoff of FPGA unit
___________________________________________________________________
Intel plans spinoff of FPGA unit
Author : ChuckMcM
Score : 109 points
Date : 2023-12-21 18:17 UTC (4 hours ago)
(HTM) web link (www.networkworld.com)
(TXT) w3m dump (www.networkworld.com)
| ChuckMcM wrote:
| Apparently they announced this in October. They are kicking
| Altera, which they acquired for $16.7 billion, back out into
| the world to live or die on its own. Which feels a bit weird
| given AMD's relative success with Xilinx.
|
| It also dampens their "US developed" technology pitch (which they
| had been pushing pretty hard vs other solutions that were fabbed
| at TSMC). I wonder if they will also give up on third-party use
| of their fabs.
|
| Intel was the first "real" job I ever had and it was during Andy
| Grove's tenure as CEO and his "paranoid" attitude was really
| pervasive. I had been working on a graphics chip, which was the
| next big thing until the chip recession, and then it wasn't.
| And it followed a long series of Intel sending out tendrils
| outside its core microcomputer (eventually x86/x51 only) winners
| only to snap them back at the first sign of challenges.
|
| I wonder if we can get better open source tool support from the
| new entity. That would be a win.
| AceJohnny2 wrote:
| > _only to snap them back at the first sign of challenges._
|
| It's really quite impressive how bad Intel's track record has
| been. Would you say it's a broad cultural problem within Intel?
| Bad incentives?
|
| (I'd be interested if someone could come up with a list of all
| of Intel's failed ventures. StrongARM, Optane, Altera...
| Obviously some of these are par for the course for such a large
| company as Intel, but Intel still seems to stand out)
|
| (I know one reason Apple is so secretive about its ventures is
| because it knows many of them won't pan out, and it knows that
| due to its size anything it does has huge repercussions, and
| wants to avoid that)
| 1-6 wrote:
| When Intel does something like this, it reveals how tight
| their operating margin is. This is not a good sign. Intel has
| never been able to push away from the fact that they'll
| always be a hardware-first company.
|
| It's a very tight market out there. TSMC was able to make it
| with very tight partnerships with OEMs and I doubt Intel can
| pull off something like that.
| ChuckMcM wrote:
| I agree with 1-6 that they have always been really focused on
| their margins, they were never a company that wanted to go
| into debt for something that was 'unproven.'
|
| They have had lots of things they stopped doing, the graphics
| chip I was involved in (82786) was one, the speech chips, the
| iAPX 432, the modem chips, their digital watch was an early
| one.
|
| I always saw it as an exceptionally low tolerance for risk.
| When I was there the 386 had just fabbed and I was one of the
| people watching when the very first chip was powered up and
| it actually worked (which amazed pretty much everyone). Sun
| Microsystems was a little startup in Mountain View using the
| 68000 and getting some traction. I suggested to Intel that if
| they used the graphic chip I was working on and the 386 they
| could build an Intel branded "Workstation" and both replace
| the VAXStations they were using for CAD work internally and
| sell them as part of Intel's initiative to become a "systems
| company." I got a response to my memo[1] from Dave House who
| was later CEO but at the time just the head of Microprocessor
| Operation (MIPO) Marketing explaining to this lowly NCG (new
| college grad) that Workstations were not much of a market and
| Intel was only interested in pursuing the "big"
| opportunities. Six months later I joined Sun (just after they
| IPO'd sadly) and didn't look back. :-)
|
| [1] Yes memo, as in typed up and printed out and put in a
| manila envelope, where the last line on the front had it
| coming from Dave and being addressed to me. And yes, it
| started as a memo I had printed out and given to my manager
| because while Intel had "heard" of electronic mail they
| preferred the "durability" of actual mail and forcing people
| to use actual mail cut down on "frivolous" communications.
| FirmwareBurner wrote:
| _> I always saw it as an exceptionally low tolerance for
| risk._
|
| Big laugh at the claim of Intel having an exceptionally low
| tolerance for risk when they're the only semi company left
| still running their own fabs, decades after AMD sold off
| their own for being too risky of a venture to keep running
| by themselves. And every fab company out there will vouch
| for how risky that line of business is. Even more so to
| pour billions into a business like this while knowing
| Samsung/TSMC might beat you.
|
| More accurate would be that Intel has a low risk tolerance
| for things that are way outside of their core
| competencies, which have traditionally been designing x86
| CPUs and fabbing them, and that's mostly it.
|
| I commend them for not giving up on advancing their fab
| nodes despite the massive issues and setbacks (they could
| have spun it off like AMD did), and for building the Arc
| GPUs and continuing to improve them despite low sales.
| Grazester wrote:
| I thought AMD had to spin off their fabs due to cost and
| cash flow issues because Intel was too busy playing dirty?
| FirmwareBurner wrote:
| _" Real men have fabs!"_ - AMD founder and CEO at the
| time, Jerry Sanders [1]
|
| He said this some time in the 80's to early 90's as a
| defence for their expensive commitment to still be
| running their own fabs when most big semi companies like
| Motorola were selling off their fabs and going fabless
| because it was eating their resources and it was
| impossible to compete with the likes of TSMC on node
| shrinks and value.
|
| So even without Intel screwing with them in anti-
| competitive ways, they could not have kept their fabs
| competitive much longer. The writing was already on the
| wall way back then but Jerry Sanders was just being
| stubborn.
|
| [1] https://archive.is/igC4r
| kimixa wrote:
| I mean... Maybe? But Intel proved the success of owning
| fabs throughout the 90s and 2000s; only 20-30 years after
| that statement was Intel even considered to be possibly
| disadvantaged by owning its fabs. But that was also a
| very different market.
|
| So I don't think it's as direct a link as you seem to
| claim. Maybe it was the right decision even if AMD had
| the money to keep it going. But it will likely forever be
| one of those questions that has no solid answer either
| way.
| phkahler wrote:
| >> I thought AMD had to spin off their fabs due to cost
| and cash flow issues because Intel was too busy playing
| dirty?
|
| IMHO Ruiz cut back on engineering and increased
| marketing. Then purchased ATI which I though was an odd
| or even dumb move (it was not). But that expensive
| purchase along with the failure of bulldozer almost
| killed the company. That's why they spun off Global
| Foundries - to stay alive.
| oumua_don17 wrote:
| >> when they're the only semi company left still running
| their own fabs
|
| Yes and even within that they are risk averse, no need to
| look beyond 10nm and their reluctance to EUV.
|
| So they are really risk averse and they just want their
| success to continue indefinitely. In fact, contrary to Pat
| G claiming that Nvidia is lucky, it's really Intel that
| was lucky.
|
| >> commend them for not giving up on advancing their fab
| nodes despite the massive issues and setbacks (they could
| have spun it off like AMD did), and for building the Arc
| GPUs
|
| Even if the turnaround time in the semi market is longer,
| the way they faltered on their Arc timelines, with no
| encouraging signs from their new fab nodes, is not
| commendable!
| FirmwareBurner wrote:
| _> Even if the turnaround time in the semi market is
| longer, the way they faltered on their Arc timelines,
| with no encouraging signs from their new fab nodes, is
| not commendable!_
|
| Sure, Intel fucked up along the way, and it's easy to
| point fingers and laugh from the comfort of the armchair
| at Intel tripping over massively complex engineering
| challenges, but if designing and building successful GPUs
| and sub-7nm fabs was easy, everyone would be doing it.
| ethbr1 wrote:
| > _Yes and even within [fab] they are risk averse, no
| need to look beyond 10nm and their reluctance to EUV._
|
| Wasn't there an article on HN 2 days ago about Intel
| pushing 2 new technologies simultaneously in their next
| node? (ribbon gate and backside power?)
| quercusa wrote:
| > _They have had lots of things they stopped doing_
|
| I have wondered if leadership didn't understand the Gartner
| Hype Cycle, because a lot of projects seem to get killed
| after the investments but before significant success was
| even possible (the Trough of Disillusionment).
| thijson wrote:
| I worked at Intel for 10 years, mainly the Otellini
| years. To myself I called it Corporate Attention Deficit
| Disorder. I think it's a symptom of bad management,
| always moving on to the next shiny rock, no long term
| vision.
|
| During my tenure I witnessed several failed initiatives.
| Itanium, Wimax, Digital Home, x86 phone (android), LTE
| modem. Imagine the billions wasted.
| senderista wrote:
| Surely Itanium was a massive risk (that obviously didn't
| pay off)?
| drjasonharrison wrote:
| Wikipedia has a partial list:
| https://en.wikipedia.org/wiki/List_of_mergers_and_acquisitio...
| What it is missing is the
| eventual disposition of the acquisition (product brought to
| market, unit pushed out, unit closed down, ...).
| AceJohnny2 wrote:
| > _I wonder if we can get better open source tool support from
| the new entity. That would be a win._
|
| Tragically, I wouldn't bet on it. In fact, I'd be shocked if
| that were to happen.
| raverbashing wrote:
| I think Intel is living their Nokia moment, except their lunch
| will be eaten much more slowly by the competition
| burnte wrote:
| They do this so often: buy a company, sell it a few years
| later. So weird.
| lisper wrote:
| Sometimes they don't even bother to sell it. In the case of
| Barefoot Networks, they bought it, spent three years
| developing the next-generation product, got it to within
| 80-90% of being ready for tapeout, and then just pulled the
| plug.
| andrewia wrote:
| One of my relatives is a high level Barefoot employee who
| was integrated into Intel leadership after the acquisition.
| I was told that it was killed because of worries about
| internal competition with Intel IPUs. It still seems like a
| ridiculously foolish idea, considering how much money Intel
| spent acquiring Barefoot. Not that the founders and CEO
| care, they got their money!
|
| The whole thing definitely seems symptomatic of Intel's
| extreme caution. I wonder why they didn't apply the same
| caution when buying Barefoot. Intel obviously didn't have a
| great idea on how to leverage them.
|
| This is all in stark contrast to AMD buying Pensando, a
| company that my SO works for. It makes sense for AMD to
| acquire Pensando to expand their data center offerings and
| compete directly with Intel. I think AMD made a smart buy.
| lisper wrote:
| I was a part-time contractor for Barefoot, and came along
| for the ride during the acquisition, so I have some
| first-hand knowledge of this. I am very much out of the
| managerial loop so I have no insight into the actual
| motives for killing the project, but I can tell you that
| at least one of the founders cared very much and fought
| tooth and nail to keep it alive.
|
| > It still seems like a ridiculously foolish idea,
| considering how much money Intel spent acquiring Barefoot
|
| Acquiring companies in order to kill them is a horrible
| business practice but not unusual. The thing that makes
| no sense to me is why they kept it going for three years
| before killing it. If they bought it in order to kill it,
| they should have done that _before_ spending another
| hundred million on it.
| ithkuil wrote:
| Perhaps having plausible deniability that they didn't
| acquire a competitor just to kill it was worth that extra
| hundred million?
| lisper wrote:
| Why on earth should Intel care about having plausible
| deniability about that? Buying a company in order to kill
| it is not illegal. It isn't even considered unethical
| except by a few utopian idealists. It's a common and
| accepted practice in the business world.
| crote wrote:
| Antitrust laws are a thing, you know. Buying out your
| competition in order to get rid of them is indeed
| illegal, once you get big enough.
| touisteur wrote:
| They might have discouraged alternatives from popping up.
| 'Oh Intel has this in the pipeline, it's hopeless to
| compete, let's invest in something else'. I've heard this
| so many times, just to see Intel kill said product or
| product line...
|
| At this point, if it's not about x86 cpus, listening to
| Intel's roadmaps seems foolish.
| ethbr1 wrote:
| > _The thing that makes no sense to me is why they kept
| it going for three years before killing it._
|
| Internal politics?
|
| Presumably the internal team(s) with overlap were against
| the acquisition.
|
| But it was likely easier to see if the integration failed
| on its own before spending the political capital to kill
| it.
|
| Folks forget executives at large companies usually
| optimize for "my career" over "the company."
| 7speter wrote:
| I don't know, I don't have the credentials to be an engineer at
| Intel, but I think they know it's a bad idea to end or spin off
| their foundry service, especially now that they have a new
| client that will know the ins and outs of their fabrication
| process.
| UncleOxidant wrote:
| > I wonder if we can get better open source tool support from
| the new entity. That would be a win.
|
| Unfortunately, this seems unlikely. The FPGA vendor tools are a
| complete shit show and open source tools would be greatly
| welcomed - they're doing really well in the Lattice/ICE40 space
| but those are small FPGAs. But Altera and Xilinx don't seem at
| all inclined to encourage the development of open source
| alternatives.
| chrsw wrote:
| That's not where the money is. They don't care about
| hobbyists, startups or schools. The FPGA world is about big
| contracts, big design wins. Networking, defense, space,
| industrial control etc. It's why the design tools are so bad
| too: there's no reason for the FPGA vendors to invest in
| those tools when the design wins come from size and
| performance. In fact, giving customers an escape hatch from
| vendor lock in would be a Very Bad Idea.
| UncleOxidant wrote:
| Indeed. I was working at a startup doing some FPGA work
| about 10 years ago. We ran into several bugs in the Xilinx
| software. Initially we could submit bug reports, but about
| 6 months in Xilinx suddenly changed their policy and
| decreed that only tier 1 customers (definitely not us)
| could submit bugs - everyone else had to look for help on
| the forums. We spent a lot of our time just working around
| the bugs in their tools. At that point I determined that,
| though FPGA development was kind of fun and interesting, I
| would not go into a field where I was going to be dependent
| on such buggy software and went back to the software side
| (where pretty much all the development tools are open
| source and if you run into trouble you're going to be able
| to get help).
| crote wrote:
| It does raise the question of whether this is simply a
| chicken-and-egg problem.
|
| When it is almost impossible to develop for, you only get
| big contracts because nobody else has the resources to
| design products for it. On the other hand, with good and
| free design tools the toy projects done by hobbyists and
| schools can serve as the catalyst for using it in medium-
| scale projects.
|
| There are plenty of applications imaginable for something
| like Intel's SmartNIC platform - but you're not going to
| see any of them unless tinkerers can get their hands on
| them.
| mook wrote:
| Of course, that's kind of also why nVidia is trouncing AMD
| in ML/GPGPU. AMD chased after the big iron, but nVidia got
| stuff working on consumer hardware... (their early lead
| definitely helped too, of course)
| phkahler wrote:
| >> The FPGA vendor tools are a complete shit show and open
| source tools would be greatly welcomed - they're doing really
| well in the Lattice/ICE40 space but those are small FPGAs.
|
| Thanks for the stock tip. The size of a chip can change, and
| will. Stupidity really tends not to. Great dev tools are
| incredibly important, and the people using them actually
| _can_ influence decisions on which parts to use.
| wslh wrote:
| Side question, how do you see the future of Intel? Do you share
| the doom perspective or think that they continue to have
| opportunities to catch up? Thanks!
| ethbr1 wrote:
| The answer to that is the intersection of thousands of state
| of the art fab technical issues and management decisions.
|
| Recent hindsight with TSMC makes it look "easy", but it's
| incredibly uncertain simply due to the complexity and number
| of pieces that have to align.
| vachina wrote:
| So what're they gonna name it this time? Altera? Hope they kept
| the original documentation.
| shrewm wrote:
| I'm thinking Ctrlera(tm).
| krallja wrote:
| I like how you think. Could tie into VR/metaverse hype by
| naming it after the Meta key, though.
| 1-6 wrote:
| This was actually a bright-spot in Intel's lineup that had a
| chance to be a moonshot. I don't know why Intel is giving up this
| early, especially when AMD acquired Xilinx recently. Is Intel
| trying to emulate Nvidia? I don't think they should try to become
| another Nvidia. They should be building programmable AI chips
| with FPGA. Heck, LLMs are able to code Verilog. There are many
| many possibilities.
| tester756 wrote:
| Why "giving up"?
|
| It's just a spinoff, and their people are at executive levels.
| brucethemoose2 wrote:
| > They should be building programmable AI chips with FPGA
|
| Eh, people have been saying this for over a decade, and I think
| that opportunity has passed.
|
| FPGAs are not going to beat GPUs anytime soon due to the
| software ecosystem (among other things), and ultimately they
| are not going to outrun ASICs that are now economically viable
| (especially in the embedded space).
| crotchfire wrote:
| > FPGAs are not going to beat GPUs anytime soon due to the
| software ecosystem
|
| Completely true as long as the chip companies stonewall open-
| source toolchains.
|
| If Xilinx or Altera published the bitstream format for their
| highest-volume high-end chip (i.e. one or two generations
| back from the bleeding edge) you'd see the effect on the
| entire AI space within twelve months. I'm not kidding.
|
| The interest in this kind of access from developers is
| enormous, and the problems in this space are extremely
| regular. There are massive opportunities for FPGAs to avoid
| spilling to DRAM _or even to SRAM_ -- you have an ocean of
| tiny register-speed memories in LUTRAM mode.
|
| But it will never happen. And so FPGAs will continue to be
| trinkets for weapons manufacturers and not much else.
| MBCook wrote:
| They gave up on XScale right before mobile demand shot to the
| moon.
| contrarian1234 wrote:
| They acquired Altera in 2015, which in the tech world is when
| the dinosaurs roamed the earth. How much more time would they
| need to see a profitable synergy? Bearing in mind they can
| probably see what's in the pipeline for the next couple of
| years.
|
| I'm not an expert... But in my naive opinion it seems entirely
| reasonable to expect an acquisition to start paying off within
| ten years?
|
| Would love to hear counterexamples
| crote wrote:
| A big issue is that sticking an FPGA onto the _CPU die
| itself_ doesn't really make sense. Those are some really
| expensive transistors, and you're essentially wasting them
| when they are not being used for cores or cache. Besides,
| you'd be spending significant amounts of money building a
| niche product.
|
| It would be a lot more viable to use a chiplet approach: bake
| the FPGAs on a slightly-cheaper node, and just glue them
| right next to the CPU itself. Unfortunately Intel is still
| _miles_ behind AMD when it comes to chiplets, and their tech
| is only just now barely starting to ship.
| adrr wrote:
| Because Intel messes up everything with their bureaucracy. Look
| at the NUC, they made it extremely hard to get because of their
| convoluted distribution strategy. They could have easily done
| direct to consumer and made it easy to get.
|
| Till they fix their culture problem, they should be unloading
| these business units and profit by holding majority stakes in
| them.
| JoshTriplett wrote:
| This is sad news; I was hoping one day we'd see chips with
| substantial on-die FPGA fabrics, ideally in ways that we could
| program with open tools. This announcement makes that less
| likely.
| rwmj wrote:
| Red Hat supported a lot of research into this and there's some
| really interesting stuff, but nothing that is very compelling
| for commercial use. What uses do you have in mind?
|
| To my mind the more interesting stuff is the PCIe FPGA boards
| like https://www.xilinx.com/products/boards-and-kits/alveo/u200.h...
|
| One particularly interesting research project was using the
| FPGA fabric to remap addresses, allowing database tables to be
| "virtually" rearranged (eg. making a row-major data source into
| a column-major source for easier searching).
| https://disc.bu.edu/papers/edbt23-relational-memory
| https://arxiv.org/pdf/2109.14349.pdf
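| The layout idea behind that project can be sketched in plain
| NumPy (a software analogy only; the paper's contribution is
| doing the remap in FPGA hardware, without copying the table):

```python
import numpy as np

# A table stored row-major (C order): scanning one column strides
# across memory, touching a cache line per row.
table = np.arange(12, dtype=np.int64).reshape(3, 4)

# The remapping idea: present the same values column-major
# (Fortran order) so a column scan becomes a contiguous read.
col_major = np.asfortranarray(table)

# Same logical table, different physical layout.
assert (table == col_major).all()
assert table.flags["C_CONTIGUOUS"]
assert col_major.flags["F_CONTIGUOUS"]
```

| Here NumPy materializes a copy; the FPGA version remaps
| addresses on the fly, which is what makes it interesting.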
| thereisnospork wrote:
| As (very much) a layman I've hoped to see something like
| dynamic hardware acceleration. E.g. for a game that has
| advanced hair simulation, or 3D sound simulation: reconfigure
| the FPGA and offload the calculations. Maybe even more
| pedestrian things like FPGA-driven bullet trajectory physics
| might be implemented at the game engine level.
|
| More optimistically something along the lines of hands-off
| offloading, where if the scheduler sees enough of the same
| type of calculation (e.g. sparse matrix multiplication) it can
| reconfigure the FPGA and offload.
| magicalhippo wrote:
| I was thinking it would be a good fit for crypto. New
| algorithms could be implemented with better-than-software
| performance, constant-time algorithms could be ensured etc.
| adrian_b wrote:
| True.
|
| For instance, SHA-3 is quite slow on x86 CPUs, which,
| unlike the recent ARM CPUs, do not have any hardware
| instructions to accelerate it.
|
| If there had been an included FPGA, it would have been
| easy to implement a very fast SHA-3, as its hardware cost
| is very low.
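| For reference, the software path being compared against is just
| a library call on the CPU (Python's hashlib shown as a neutral
| example):

```python
import hashlib

# SHA-3 in pure software on the CPU: the slow path discussed above,
# since x86 (unlike recent ARM) has no SHA-3 instructions.
digest = hashlib.sha3_256(b"").hexdigest()
print(digest)
# a7ffc6f8bf1ed76651c14756a061d662f580ff4de43b49fa82d80a4b80f8434a
```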
| rwmj wrote:
| I think the problem is this misunderstands what FPGAs are
| good at. They're actually very bad (or at least painfully
| slow) at calculations like your examples. GPUs are good at
| that.
|
| FPGAs excel in parallelizing very simple operations. One
| example where FPGAs are good is where you might want to
| look for a specific string in a network packet. Because
| FPGAs are electronic circuits you can replicate the "match
| a byte" logic thousands of times and have those comparisons
| all run in parallel and the results combined with AND & OR
| gates into a yes/no decision. (I think the HFT crowd do
| this sort of thing to preclassify network packets before
| forwarding candidate packets up to software layers to do
| the full decision making).
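| The replicated match-a-byte idea can be mimicked in software
| with one vectorized comparison (a sketch: on the FPGA every
| comparator fires in the same clock cycle, whereas NumPy merely
| evaluates them in one pass):

```python
import numpy as np

def packet_contains(packet: bytes, pattern: bytes) -> bool:
    """Check every start offset 'in parallel', like replicated
    byte comparators on an FPGA feeding an AND/OR reduction."""
    data = np.frombuffer(packet, dtype=np.uint8)
    pat = np.frombuffer(pattern, dtype=np.uint8)
    if len(pat) == 0 or len(pat) > len(data):
        return len(pat) == 0
    # One window per start offset; compare all bytes at once,
    # AND across the pattern, OR across the offsets.
    windows = np.lib.stride_tricks.sliding_window_view(data, len(pat))
    return bool((windows == pat).all(axis=1).any())

print(packet_contains(b"GET /index.html HTTP/1.1", b"index"))  # True
```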
| soulbadguy wrote:
| > FPGAs excel in parallelizing very simple operations.
|
| I don't think that's a good characterization. FPGAs are
| good at what they end up being programmed for. And in the
| final analysis, everything a chip does is broken down
| into simple operations.
|
| The FPGA selling point has always been around perf/W and
| efficiency with regard to a "set of tasks". An ASIC will
| always be faster for a specific task, and a CPU will
| always be faster on average at everything. However, when
| considering say "compression" or "checksum" as a general
| class of algorithm, an FPGA with a set of predefined
| configurations could be better and cheaper.
| 70rd wrote:
| Good at what they're being programmed for is a bit of a
| tautology.
|
| FPGAs being selected for performance per watt is only a
| fairly recent phenomenon, from when they were deployed at
| semi-large scale as password crackers/cryptocurrency miners.
|
| Their real strength is ultimately real time processing
| (DSP or networking), with reconfigurability often quite
| valuable for networking applications. For DSP
| applications it's usually because a MOQ of custom silicon
| can't be justified.
| JoshTriplett wrote:
| > What uses do you have in mind?
|
| Acceleration of new hashing and cryptographic algorithms, new
| compression algorithms, new codecs, and many other similar
| things, without adding special-purpose instructions for them.
|
| Implementation of fast virtual peripherals. Accurate
| emulation of special-purpose hardware.
|
| Implementation of acceleration modules for well-established
| software. Imagine if popular libraries or databases or other
| engines didn't just come with acceleration using SIMD and
| other instruction sets, they also came with modules loadable
| on an attached FPGA if you have one.
| vlovich123 wrote:
| The problem is that these kinds of applications suck for
| multi-tenant clouds, which FPGAs turn out to be poorly
| suited for (high costs to switch out programs), and are too
| expensive for the consumer. So the applications are quite
| niche and limited to traditional use-cases of prototyping
| ASIC designs rather than actual algorithm accelerators
| which benefit more from dedicated circuitry/instructions in
| terms of adoption.
| rkagerer wrote:
| Anything where you need cycle-perfect timing.
| Laaas wrote:
| The new Ryzen chips already have this as "XDNA" AFAIU
| wmf wrote:
| XDNA is not an FPGA.
| wtallis wrote:
| I think that's just a neural accelerator, not an FPGA. The IP
| block happens to be from the Xilinx side of the company.
| blackguardx wrote:
| This is pretty funny. Intel bought Altera, which forced AMD to
| buy Xilinx with all the zero interest rate money floating around.
| AMD's purchase of Xilinx made less sense because AMD is fabless,
| but Intel didn't end up doing anything with Altera. It's not clear
| if Altera even started using Intel fabs for its chips. AMD's
| Xilinx has been comparatively more successful, but I don't think
| that had anything to do with AMD.
|
| Maybe we can look forward to all the ZIRP semiconductor
| consolidations to unwind.
| pclmulqdq wrote:
| The only "synergy" that has come from AMD-Xilinx is that AMD
| took a (relatively simple) DSP for machine learning that Xilinx
| had built and put it into their newer CPU lines. That's still
| better than Intel-Altera, which basically didn't integrate at
| all, despite having grandiose plans.
| kjs3 wrote:
| _AMD 's purchase of Xilinx made less sense because AMD is
| fabless_
|
| Xilinx was fabless before the acquisition. I'm missing how that
| made less sense.
|
| _with all the zero interest rate money floating around_
|
| Hehe...as one of my finance-world pals said: "everyone's doing
| M&As like drunken sailors".
| 0x457 wrote:
| Well, it "made" sense for intel because this allowed Altera
| to switch to Intel fabs. Not saying it's a huge strategic
| value or anything, just saying that's the difference.
| kjs3 wrote:
| Altera switched to Intel fabs from TSMC in 2013, so yeah it
| makes sense for Intel to pick up the folks you don't have
| to do much to integrate into your manufacturing process.
| But AMD and Xilinx both already being fabless meant it was
| a wash...no process integration to be done.
| 1-6 wrote:
| I hope Nvidia comes along and scoops up Altera. That would
| teach Intel a lesson.
| ak217 wrote:
| Yeah, it's interesting to compare Nvidia's strategy to Intel's.
| I'm sure there are quite a few Nvidia projects that have been
| cancelled or even acquisitions liquidated, but they all seem to
| be small. Every significant part of Nvidia that I can remember
| is something they are committed to, sometimes over multiple
| decades, even when the market is not there and sales are near
| zero. This seems to come from actually having a consistent,
| stable long-term vision and buy-in all the way up to Jensen
| Huang serving as a driving force behind acquisitions and
| projects, unlike Intel where the driving force seems to be bean
| counting and market domination.
|
| To give credit to Pat Gelsinger, his stated goal is to shed
| non-essential units and refocus on the fundamentals. But I'm
| not sure how well that's going.
| no_wizard wrote:
| Part of this I'm sure is management style and culture (Nvidia
| famously has a very strong get-it-done-at-all-costs culture
| for example), but I think the reason these ventures
| persisted is that the founder (Jensen Huang) is still CEO
| and controls the majority of the company.
|
| Now, without Andy Grove and the founders steering the
| culture,[0] it seems there is no one at Intel able to steer
| the ship past quarterly results anymore.
|
| AMD suffered decades of mishaps and near bankruptcy before
| finally finding competent leadership in recent years, which
| turned the company around.
|
| [0]: After all, one core virtue that Andy Grove had was "only
| the paranoid survive", which in context, was all about never
| getting too comfortable with your market position. Granted,
| they also engaged in many illegal practices as essentially a
| monopoly in the 90s. I guess they lost his second virtue, a
| _Competitive Mindset_, which as Grove saw it _viewed
| competition as the key driver of innovation and progress_.
| This was out the window by then, though I don't know if he
| was effectively leading the company by '96. He stepped down
| in '98, but he had health problems before that - eventually
| diagnosed with prostate cancer. This is all to say that
| Intel's culture is a mixed bag of unhealthy paranoia and
| being used to being #1 in their categories, which is all
| incoming executives and major shareholders see when they
| think Intel. There's no one to faithfully steer the ship
| back to clear waters, it seems.
| soulbadguy wrote:
| I think you might be attributing too much agency to the CEO
| and management of both companies. IMO there are a lot of
| external factors too.
|
| It might be the case that Andy Grove's "only the paranoid
| survive" was really a factor. Or simply the fact that under
| Andy, the tech sector was really different: x86 was the
| dominant and growing platform of compute, and they pretty
| much had a strong set of patents around x86. The initial
| military and gov contracts gave them enough runway money to
| build a moat around having a fab.
|
| Also let's not forget that despite all those advantages,
| Intel was also fined several times, for more than a billion
| each time, for very anti-competitive practices. It's unclear
| to me that Intel was ever the great company they think they
| are.
|
| Fast forward to now: x86-64 is no longer growing as fast,
| having a fab is no longer a moat with TSMC around (might
| even be seen as a liability), and the Wintel monopoly
| doesn't have the strength it had... etc. The game is just
| much harder now.
|
| For Nvidia, they had a triple market boom, with first PC
| gaming, then crypto, and now AI... Hard to distinguish CEO
| performance from market performance.
| no_wizard wrote:
| Companies tend to graft their culture from the founders
| establishing it with early employees and outgrowth from
| there, it does say _something_.
|
| Now, the market moving to mobile ARM chipsets definitely
| damaged Intel, though Intel had _years_ to come up with a
| viable alternative and ways to re-capitalize its fab
| infrastructure etc. They let their own moat run dry.
| Theories abound as to why, and since I'm not CEO nor do I
| have access to Intel's internal records, I can only
| speculate. However, from the available information I've
| been able to read, it seems that they became _more_ risk
| averse as time went on, in part because there is immense
| pressure for any _new thing_ to quickly produce margins
| similar to those of their main CPU business. Which of
| course is a conundrum: barring a major breakthrough that
| puts them well above any alternative, these sorts of
| efforts to diversify into strong businesses often take a
| lot of time.
|
| Compare that to Nvidia: market dynamics are what allowed
| them to capitalize on two booms that ultimately leveraged
| GPUs for compute over traditional processors, but the
| _foundation_ for all this was laid around 2007 (when CUDA
| first came out), and Nvidia had to both improve CUDA over
| time and invest in research around parallelized computing.
| If it were not for their long-horizon investment in this,
| they would not have been primed to take advantage of both
| crypto and AI.
|
| I'd argue that gaming, while no doubt lucrative, is
| comparatively modest next to what they are raking in from
| crypto and AI, all the while still investing in integrated
| CPU/GPU chipsets[0] and CPUs tailored to massive data
| center requirements around AI specifically[1], none of
| which they would really be able to take advantage of if
| not for their long-term investment in things like CUDA and
| parallelized computing. I suspect quite strongly that
| without the founder at the helm, there would have been
| little internal cover to continue investing in these
| things at the rate Nvidia was prior to the boom times.
|
| [0]: https://www.reuters.com/technology/nvidia-make-arm-
| based-pc-...
|
| [1]: https://www.nvidia.com/en-us/data-center/grace-cpu/
| jonnycoder wrote:
| Every year Intel announces a new failure, and every year I feel
| more ashamed for having Intel as the bulk of my software
| engineering experience on my resume. I saw the signs when I was
| still there and attending quarterly updates. It felt like
| in one breath they admitted to missing the mobile boat and
| in another said only gamers need powerful discrete GPUs.
| jvanderbot wrote:
| Intel can and does produce phenomenal technical products. Their
| compilers and HPC toolsets have done enormous good. Their chips
| are second to none (or perhaps tied for first, depending on who
| you ask). They are not a shameful company. They are just
| _yesterday's giant_, and as such are due for a Microsoft-
| like reinvention.
|
| Intel is my daily driver for gaming, due to its (sadly
| cancelled) Extreme Compute Element form factor with its
| incredible customization.
|
| What they're struggling with is adapting to a changing
| world and finding new lines of business. But this is a
| hardware/software world built on Intel, even if it changes
| daily.
| hedora wrote:
| Which intel chips are second to none, and by what metric?
| jvanderbot wrote:
| Trying to be measured here: Depending on who you ask,
| you'll find that for gaming, AMD or Intel alternate pretty
| regularly on technical benchmarks. At the moment, AMD is
| most power efficient, on average, and can eke out higher
| overall benchmark performance (as of late 2023).
| Throughout 2023, AMD and Intel were both listed as "best"
| on Tom's Hardware, PC Mag, etc.
|
| https://www.pcmag.com/picks/the-best-cpus-for-gaming
|
| For server marketshare, Intel vastly dominates (80%?). For
| gaming marketshare, Intel still has 60-70% of the market.
|
| Even if it's a mixed bag, given an Intel (or AMD)
| processor, there's no way I would throw it in the trash,
| and for a current-gen chip from either vendor you can find
| metrics where it beats the other.
|
| Both are top-tier accomplishments. AMD has a significant
| fan following, though, which tends to distort the story a
| little.
| greenknight wrote:
| > Their chips are second to none (or perhaps tied for
| first, depending on who you ask).
|
| > For server marketshare, intel vastly dominates (80%?).
| For gaming marketshare, Intel still has 60-70% the
| marketshare.
|
| Marketshare is one thing, but saying they are the best
| when the marketshare is 60-70% is a totally different
| metric.
|
| It's like saying Toyota has the best cars because they
| ship the most... but a Lambo / Ferrari can't be as good
| because they don't ship the most.
|
| In terms of server CPUs, if you are looking at raw
| performance per socket, Intel is nowhere near AMD --
| https://www.phoronix.com/review/intel-xeon-
| platinum-8592/10 . This is based on Emerald Rapids /
| Bergamo.
|
| In terms of gaming CPUs, they definitely are neck and neck
| on performance.
|
| But yes, you are right, Intel's chips aren't something you
| would throw in the trash. They are good chips. They need
| to stick around. It's just that you can't say marketshare
| makes them the best.
| jvanderbot wrote:
| I didn't say marketshare makes them the best. I said there
| were lots of metrics, that AMD and Intel frequently trade
| first place on any given metric, and that marketshare
| implies there's still something special about Intel.
| kouteiheika wrote:
| > Their chips are second to none (or perhaps tied for first,
| depending on who you ask).
|
| No. They're definitely behind AMD. AMD has significantly
| better power efficiency[1], better performance in gaming for
| most of the titles[2], and vastly more cores and better
| multithreaded performance (with Threadrippers[3]).
|
| [1]: https://gamersnexus.net/megacharts/cpu-power
|
| [2]: https://gamersnexus.net/cpus/intels-300w-core-i9-14900k-
| cpu-...
|
| [3]: https://gamersnexus.net/cpus/best-cpus-2023-intel-vs-
| amd-gam...
| kyrra wrote:
| I would argue that AMD is not the reason they are more
| power efficient; it's TSMC.
|
| Intel bet on a fabrication process that did not succeed. I
| will bet that within the next two years Intel will be on
| par with AMD again.
| soulbadguy wrote:
| > I would argue that AMD is not the reason they are more
| power efficient, it's TSMC.
|
| Why?
| allie1 wrote:
| Because the area where Intel stumbled wasn't design, it
| was their manufacturing being behind -- something TSMC
| does for AMD.
| jvanderbot wrote:
| > Based on the numbers we've given you, from our data, and
| the prices we have today, the decisions for the most part
| are pretty clear: The best gaming CPU is the 7800X3D
| (that's an objective fact), the most efficient part is the
| 7980X, the 5800X3D is the best upgrade path, and Intel
| makes the strongest showing in the i5-13600K or 14600K
| (whichever is cheaper) for a balanced build, or the 12100F
| for an ultra-budget build.
|
| Compare to Tom's Hardware's 2023 listing from earlier in
| the year:
|
| https://www.tomshardware.com/reviews/best-cpus,3986.html
|
| Overall Best CPU for Gaming: Intel Core i5-13400
| (alternates: AMD Ryzen 5 7600, Ryzen 5 5600X3D)
| High Performance Value Best CPU for Gaming: AMD Ryzen 7
| 7800X3D (alternates: Intel Core i7-14700K, Ryzen 7
| 5800X3D)
| Highest Performance Best CPU for Gaming: AMD Ryzen 9
| 7950X3D (alternate: Intel Core i9-13900K)
| Mid-Range Best CPU for Gaming: Intel Core i5-13600K
| (alternate: AMD Ryzen 5 7600X)
| Budget Best CPU for Gaming: Intel Core i3-12100F
| (alternate: AMD Ryzen 5 5600)
| Entry-Level Best CPU for Gaming: AMD Ryzen 5 5600G
|
| (Summarizing: Intel takes the overall, mid-range, and
| budget picks, so it wins 3 of the 6 categories. And what
| does it mean to give AMD both "High Performance Value" and
| "Highest Performance"?)
|
| This is a neck-and-neck race, with AMD a fan favorite and
| year-over-year changes in leadership. No way is Intel out
| of this as a viable competitor of equal-ish standing. I
| don't mean to denigrate AMD in my original post, but I do
| mean to say that Intel is still producing top-tier
| results.
|
| https://www.tomshardware.com/news/amd-and-intel-cpu-
| market-s... In particular, Intel still has a 5x lead on
| AMD in the server market, which, of course, is a pretty
| big market.
| soulbadguy wrote:
| > This is a neck-and-neck race, with AMD a fan favorite,
| and year over year changes in leadership.
|
| > (summarizing: Intel best CPU, Best mid and budget
| entries, so winner in 3/5 categories. And what does it
| mean to give AMD "Best high performance" and "Highest
| performance?").
|
| Intel has much more brand recognition than AMD, so calling
| AMD the fan favorite is kinda strange to me. There are
| only two x86-64 chip designers right now, and from a pure
| market-strategy perspective it doesn't make sense for AMD
| to offer much more value to customers than what they can
| get from the only competition in town. If you want to
| really understand how Intel is struggling, you have to dig
| into the execution speed of both companies, the profit
| margin on each SKU, etc.
|
| Intel is only neck and neck with AMD if you don't look at
| things like power efficiency and how much money they are
| actually making per chip.
|
| > In particular, Intel still has a 5x lead on AMD in
| server market, which, of course, is a pretty big market.
|
| This is a function of market inertia more than anything.
| From a technical perspective, AMD's Zen 4 workstation and
| server offerings seem to be much better these days.
| smolder wrote:
| Tom's has always had a _slight_ bias towards Intel in
| their assessments, I think. Their picks are pretty
| reasonable, but the "highest performance for gaming"
| category with the 7950X3D and 13900K doesn't make a lot
| of sense to me, when the 7800X3D, which won the "High
| perf" category, beats the 13900K significantly in most
| game benchmarks while drawing much less power. Intel
| has some great value offerings, but the 7800X3D is the
| real champ in gaming right now. The fact they have Intel
| winning 3/5 of their categories seems like a
| demonstration of their subtle bias. Other sources like
| Anandtech have historically done a better job of neutral
| reporting.
| J_Shelby_J wrote:
| I've been on the extreme of maxing out fps in games for a
| decade.
|
| Gamersnexus is great! But they and the other YouTube
| benchmarkers don't tell the whole story. When you are
| playing games that are bottlenecked by single-thread
| performance, memory speed becomes important. This is very
| difficult to benchmark because most of the games where it
| matters (CoD Warzone) don't have an easy way to benchmark.
| When I was testing my OC I would literally change the RAM
| frequency in the BIOS, drop into a PUBG match, screen-
| record my fps, and manually add it to a spreadsheet.
|
| So take those benchmarks and then add the gains from
| memory overclocking; I (and others) have been able to get
| meaningful fps improvements beyond just overclocking the
| CPU. 5-10% more fps is not amazing, but it's definitely
| worth it to me for games like Warzone or Valorant.
|
| AMD's support for high-end RAM has always lagged. I'm
| happy to use AMD for non-gaming, but if you're trying to
| build "the best gaming rig" I'm not sure AMD can take that
| crown. And until people are willing to benchmark single-
| thread-bottlenecked games like Warzone at different RAM
| speeds, I doubt this argument will be settled.
| bee_rider wrote:
| Intel's engineering is pretty solid, I think anyone who looks
| down on Intel's engineers because they "only" managed to
| overcome their management's wasteful dithering for... like...
| 30 years is not worth working for.
| kjs3 wrote:
| That's a pretty personal take; don't take that shame on
| yourself. Stick to the usual, comfortingly impersonal
| "Obviously, Intel is dead and buried and all their management
| sucks and they ruined the world" whenever Intel has announced a
| setback any time in the last 50 years. No one has a blanket "we
| don't hire losers from Intel" policy; it's not like you worked
| at CA or Oracle.
| boshalfoshal wrote:
| These sound largely like management/product-direction
| issues and are not really indicative of bad engineering
| quality. I wouldn't be concerned about the quality of your
| resume.
| trynumber9 wrote:
| Intel bought Altera in 2015 when it still thought 10nm would be
| on time and an advanced node. That did not work out. The idea was
| to get a better FPGA and have more customers to justify fab build
| out expenses. Gelsinger more recently said he does not want to
| force products to be on Intel fabrication. Use Intel processes
| where it makes sense. No reason to push Altera FPGA to Intel
| 10/7/4. No reason to push NICs to Intel 10/7/4. And so on.
| jasoneckert wrote:
| The first thing this reminded me of was when Intel got rid of
| StrongARM/XScale because they didn't think it would amount to
| much in the long run. Hopefully they don't regret this particular
| spinoff in the future.
| reachableceo wrote:
| One would presume Intel will get a decent chunk of the stock in
| any IPO and capture the upside value.
|
| That does seem to be how these kinds of deals are usually
| structured. Spinco is 60% owned by the parent, or
| whatever.
| lawlessone wrote:
| Ten years from now..
|
| "Intel should have dominated this space but Xilinx etc.
| got lucky"
| somethoughts wrote:
| It'd be interesting if some of the funds from the sale
| were used for AI software development, to provide a
| better-coordinated response to CUDA.
| mardifoufs wrote:
| How's the FPGA market at the moment? Has Altera been able to keep
| up with Xilinx (or vice versa) under Intel ownership?
| dboreham wrote:
| FPGAs have never made sense. They're way too expensive to use in
| volume. There's no practical use case for "cool, I can reprogram
| the chip in the field to implement different functionality".
| Nobody has figured out how to usefully integrate them with
| a CPU to make a low-volume SoC. CPUs became so fast that
| most applications don't need custom hardware. Regular gate
| arrays are cheaper and faster above minimal volume.
|
| They seem to only have been useful for prototyping and military
| applications (low volume and infinite budget).
| vatys wrote:
| I see them used in pro/prosumer audio equipment, synthesizers,
| and effects, which is relatively low volume and medium-to-high
| budget. FPGAs (and CPLDs, uC+AFE, etc) are great for these
| applications because they have great capabilities you might
| otherwise need a pile of discrete components or a custom chip
| for, but it doesn't make sense to design fully custom silicon
| if you're only ever going to sell about 50-500 of something.
|
| So sure, prototyping and military, but there are other uses as
| well. But none of them are super high-volume because once
| you're selling millions of something you should be designing
| your own chips.
| aleph_minus_one wrote:
| > CPUs became so fast that most applications don't need
| customer hardware.
|
| When complicated realtime signal processing is to be done,
| FPGAs shine - in particular if there exists no DSP that is
| competitive for the task.
| Bluebirt wrote:
| Consumer applications and FPGAs are an oxymoron. FPGAs are
| used in applications requiring special interfaces, special
| computing units, or other custom requirements. If there is
| enough demand, SoCs are developed for these applications, but
| this is only useful in mid to high volume production. Areas
| like the ones you gave and many more are making heavy use of
| FPGAs. I work in medical for example. We are using custom
| designed chips for special detection purposes. But when it
| comes to data processing and interfacing with computers, we use
| FPGAs.
| crotchfire wrote:
| The problem is that FPGA companies are really CAD tool
| companies who see their chips as copy-protection/payment-
| assurance schemes for their software.
|
| Unfortunately their CAD tools suck, but that's beside the
| point.
| soulbadguy wrote:
| Large acquisitions rarely seem to pan out well in the tech
| sector, especially when big companies try to acquire their
| way into an adjacent market.
|
| Also, some companies seem to be significantly worse than
| others; Microsoft and Dell come to mind. My suspicion is
| that those types of acquisitions are mainly driven by
| C-level executives as a way to hide the real struggles of
| the company.
|
| Is there a report analyzing big tech acquisitions over,
| say, the last 30 years, and their economic impact? That
| would be an interesting read.
|
| Maybe it's time for a new form of regulation around
| acquisitions.
| Kon-Peki wrote:
| I take exception to the usage of the word "spinoff". Intel is
| selling a portion of Altera. If this was a true spinoff, Intel
| shareholders would get shares in the new entity.
|
| Intel needs the cash, so this is understandable.
___________________________________________________________________
(page generated 2023-12-21 23:00 UTC)