[HN Gopher] The Rise and Fall of Silicon Graphics
___________________________________________________________________
The Rise and Fall of Silicon Graphics
Author : BirAdam
Score : 207 points
Date : 2024-04-05 16:42 UTC (6 hours ago)
(HTM) web link (www.abortretry.fail)
(TXT) w3m dump (www.abortretry.fail)
| neduma wrote:
| Irix OS was my second best OS exposure after Ubuntu during
| college days.
| HeckFeck wrote:
| What did you do with it?
|
| I've always wanted to play with Irix. The UI looks very
| intuitive.
|
| The boxes are hard to find, so I went for emulation but I
| couldn't get it any further than a boot screen in MAME.
| davidw wrote:
| I remember having access to an Irix box at my first job. It was
| seen as a real, professional, serious OS, not like the Linux box
| I set up. Pretty incredible how that all changed in a matter of
| years.
| kelsey98765431 wrote:
| A close friend of mine growing up had a parent that worked at
| SGI. I'm not trying to start a holy war, but I just have to let
| it be known that emacs was the recommended editor at SGI. Just
| saying. Maybe other things contributed to the fall, but in my
| heart I will always remember emacs.
| theideaofcoffee wrote:
| Oh how I lusted over the Challenges, the Octanes, the Indigo2s of
| the time. It was a revelation when I finally was able to sit down
| at a console of an Octane (with two, count 'em, TWO R14000s and a
| whopping 2.6G of RAM). Tooling around in IRIX via 4dwm was so
| much more satisfying than today's UIs. It was snappy and low-
| latency unlike anything I've used since.
|
| Later on, I was able to do some computational work on an Altix
| 3700 with 256 sockets and 512G of RAM spread over four full-
| height cabinets (with a nest of NUMAlink cables at the back), at
| the time running SuSE Linux, and it was wild seeing all 256
| sockets printed out by a cat /proc/cpuinfo. Now the same
| capabilities are available in a 4U machine.
|
| The corporate lineage story is also just as interesting as the
| hardware they made. Acquisition, spinoff, acquisition,
| rename, acquisition, shutter; now perhaps just a few books and
| binders and memories among the few remaining personnel at HPE are
| all that's left (via Cray, via Tera, via SGI, via Cray Research).
|
| RIP SGI
| bitbckt wrote:
| I still keep a maxed out Octane2 in running order for
| posterity. Occasionally logging in to it reminds me just how a
| desktop environment should feel. We truly have lost something
| since then.
| hypercube33 wrote:
| I really wish they'd do movies like the one they made for RIM
| about Cray, DEC, Compaq, SGI, Pixar. It sounds like these places
| were either wild or straight-up IBM culture, or some clash of
| both, inside or outside. Raven and id Software would be neat
| too. And Westwood Studios.
| ChrisMarshallNY wrote:
| I remember having a Personal Iris at the company I worked at,
| and, later, an Indigo. We never used them. I think they were
| really there to impress the visitors (they were in our
| showroom).
|
| I remember the colors as being very different from the photos,
| though.
|
| The Personal Iris was a deep purplish-brown, and the Indigo was
| ... _indigo_.
|
| Jim Clark sounds like my kinda guy. I made a hash of my teenage
| years and barely squeaked in with a GED, myself. It has all
| worked out OK in the end, though.
| randomdata wrote:
| _> We never used them._
|
| When I was in high school we had a lab full of SGI machines.
| They also never got used. Hundreds of thousands of dollars of
| computing equipment, and probably that much again in software
| licenses (at the commercial rate), just sitting there doing
| nothing. It was heartbreaking.
|
| On a happy note, the SGI bus (a semi-trailer full of SGI
| machines demoing their capabilities) came to school one time.
| As a teenage nerd, getting to play with a refrigerator-sized
| Onyx2 was a good day.
| mrpippy wrote:
| My goodness, at a high school? Like Indys, or O2s? Was this a
| private school?
| randomdata wrote:
| They were O2s. Rural public school.
|
| There were all kinds of toys, though. There was a dedicated
| classroom setup for video-based remote learning some 30
| years before COVID - that got used for one semester, from
| what I gather (was never used while I was there). The
| school was even host to a dialup ISP at one point.
|
| The administrators were all in on technology. The teachers,
| not so much...
|
| Eventually, in my last year, the government changed the
| funding model and the party ended.
| fuzztester wrote:
| >The Personal Iris was a deep purplish-brown, and the Indigo
| was ... indigo.
|
| Nice.
|
| I once worked at a startup that had a Cobalt Qube in the server
| room, and the Cobalt was ... _cobalt blue_.
|
| https://en.m.wikipedia.org/wiki/File:Cobalt_Qube_3_Front.jpg
|
| https://en.m.wikipedia.org/wiki/Cobalt_Qube
| jefflinwood wrote:
| I worked on the SGI campus as a consultant/vendor to them in
| 1999/2000 during the dot-com boom. I really wanted one of those
| 1600SW flat screens (everything was CRT back then), but they
| weren't really in use at the time.
|
| One of the neatest things is that they let us (Trilogy/pcOrder)
| put together a sand volleyball team to compete in their company
| intramurals.
|
| Their cafeteria was also top notch.
| dekhn wrote:
| That cafeteria went on to be known as "Charlie's" at Google and
| was the main HQ cafeteria (serving great food, and then later,
| extremely meh food). TGIF was also held there. If there ever
| was a place that was "Google central", that was it.
| fuzztester wrote:
| >"Charlie's" , and its chef, Charlie, are mentioned in the
| book called The Google Story:
|
| https://en.m.wikipedia.org/wiki/The_Google_Story
| tombert wrote:
| There's a few cases in the history of computers where it feels
| like the world just "chose wrong". One example is the Amiga; the
| Amiga really was better than anything Apple or Microsoft/IBM was
| doing at the time, but for market-force reasons that depress me,
| Commodore isn't the "Apple" of today.
|
| Similarly, it feels like Silicon Graphics is a case where they
| really should have become more standard. Now, unlike Amiga, they
| were too expensive to catch on with regular consumers, but I feel
| like they should have become and stayed the "standard" for
| workstation computers.
|
| Irix was a really cool OS, and 4Dwm was pretty nice to use and
| play with. It makes me sad that they were beaten by Apple.
| causi wrote:
| "Revolutionaries rarely get to live in the societies they
| created"
|
| I think it's a combination of complacency and of the fact that
| the skillset/culture needed to create a paradigm shift isn't the
| same one needed to compete with others on a playing field you
| built. It happens over and over. We saw it happen with RIM, and we're
| watching it happen right now with Prusa Research.
| itronitron wrote:
| Both Prusa and SGI are (and were) probably largely unknown to
| 90% of their potential market. The globally recognized
| companies tend to spend far more on marketing than anyone in
| a STEM field would consider remotely reasonable.
| fuzztester wrote:
| True. In the early to middle days of Java, I read that Sun
| spent millions of dollars on marketing it, and related
| stuff around it.
| hn_throwaway_99 wrote:
| > Similarly, it feels like Silicon Graphics is a case where
| they really should have become more standard. Now, unlike
| Amiga, they were too expensive to catch on with regular
| consumers, but I feel like they should have become and stayed
| the "standard" for workstation computers.
|
| I think you highlighted very correctly there, though, why SGI
| lost. It turned out there were cheaper options, which while not
| on par with SGI workstations initially, just improved at a
| faster rate than SGI and eventually ended up with a much better
| cost/functionality profile. I feel like SGI just bet wrong. The
| article talks about how they acquired Cray, which were
| originally these awesome supercomputers. But it turned out
| supercomputers essentially got replaced by giant networks of
| much lower cost PCs.
| tombert wrote:
| Yeah, I'm more annoyed about Amiga than SGI. They were priced
| competitively with Apple and IBM offerings.
|
| I guess it's just kind of impossible to predict the future. I
| don't think it's an _incompetent_ decision to try and focus
| entirely on the workstation world; there are lots of
| businesses that make no attempt to market to consumers, and
| only market to large companies/organizations, since the way
| budgeting works with big companies is sort of categorically
| different than consumer budgets.
|
| But you're absolutely right. Apple and Windows computers just
| kept getting better and better, faster and faster, and
| cheaper and cheaper, as did 3D modeling and video editing
| software for them. I mean, hell, as a 12 year old kid in
| 2003, I had both Lightwave 3D (student license) and
| Screenblast Movie Studio (now Vegas) running on my cheap,
| low-spec desktop computer, and it was running fast enough to
| be useful (at least for standard definition).
| mike_hearn wrote:
| Of course, the reason they got better so fast is volume.
| There was just way more investment into those platforms.
| Which means this explanation is somewhat circular: they
| were successful because they were successful.
|
| I think a more useful explanation is that people rate the
| value of avoiding vendor lockin extraordinarily high, to
| the extent that people will happily pick worse technology
| if there's at least two competing vendors to choose from.
| The IBM PCs were not good, but for convoluted legal reasons
| related to screwups by IBM their tech became a competitive
| ecosystem. Bad for IBM, good for everyone else. Their
| competitors did not make that "mistake" and so became less
| preferred.
|
| Microsoft won for a while despite being single vendor
| because the alternative was UNIX, which was at least sorta
| multi-vendor at the OS level, except that portability
| between UNIXen was ropey at best in the 90s and of course
| you traded software lockin for hardware lockin; not really
| an improvement. Combined with the much more expensive
| hardware, lack of gaming and terrible UI toolkits (of which
| Microsoft was the undisputed master in the 90s) and then
| later Linux, and that was goodbye to them.
|
| Of course after a decade of the Windows monopoly everyone
| was looking for a way out and settled on abusing an
| interactive document format, as it was the nearest thing
| lying around that was a non-Microsoft specific way to
| display UI. And browsers were also a competitive ecosystem
| so a double win. HTML based UIs totally sucked for the end
| users, but .... multi-vendor is worth more than nice UI,
| so, it wins.
|
| See also how Android wiped out every other mobile OS except
| iOS (nobody cares much about lockin for mobile apps, the
| value of them is just not high enough).
| bunderbunder wrote:
| Hypothesis:
|
| What smaller businesses are using will tend to be what takes
| over in the future, just due to natural processes. When
| smaller businesses grow, they generally prefer to fund the
| concurrent growth of the existing vendors they like using
| rather than switch to the established "industrial-grade"
| vendor.
|
| At the same time, larger organizations that can afford to
| start with the industrial-grade vendors are only as loyal as
| they are locked in.
| tombert wrote:
| I mean, there are corporations who _only_ sell to very
| large corporations and have had plenty of success doing so.
| Stuff like computational fluid dynamics software, for
| example, has a pretty finite number of potential clients,
| and I don't think I could afford a license to ANSYS even
| if I wanted one [1], since it goes into the tens of
| thousands of dollars. I don't think there are a ton of
| startups using it.
|
| But I think you're broadly right.
|
| [1] Yes I know about OpenFOAM, I know I could use that if I
| really wanted.
| 01HNNWZ0MV43FF wrote:
| I see the same trend in programming languages. Say a really
| solid career lasts from about 20 to 60, 40 years long. Say
| that halfway through your career, 20 years in, you're
| considered a respectable senior dev who gets to influence
| what languages companies hire for and build on.
|
| So in 20 years, the current batch of senior devs will be
| retiring, and the current noobies will have become senior
| devs.
|
| *Whatever language is easy to learn today will be a big
| deal in 20 years*
|
| That's how PHP, Python, and JavaScript won. Since
| JavaScript got so much money poured on it to make it fast,
| secure, easy, with a big ecosystem, I say JS (or at least
| TS) will still be a big deal in 20 years.
|
| The latest batch of languages know this, and that's why
| there are no big minimal languages. Rust comes with a good
| package manager, unit tester, linter, self-updater, etc.,
| because a language with friction for noobies will simply
| die off.
|
| One might ask how we got stuck with the languages of script
| kiddies and custom animated mouse cursors for websites.
| There's no other way it could turn out, that's just how
| people learn languages.
| chuckadams wrote:
| Back in the old days there was a glut of crappy bloated
| slow software written in BASIC. JS is the BASIC of the
| 21st century: you can write good software in it, but the
| low bar to entry means sifting through a lot of dross
| too.
|
| My take: that's just fine. Tightly crafted code is _not_
| a lost art, and is in fact getting easier to write these
| days. You're just not forced into scrabbling for every
| last byte and cpu cycle anymore just to get acceptable
| results.
| cmrdporcupine wrote:
| This betting wrong on specialization happened over and over
| again in the late 70s and 80s. The wave of improvements and
| price reduction in commodity PC hardware was insane,
| especially from the late 80s onwards. From Lisp machines to
| specialized graphics/CAD workstations, to "home computer"
| microcomputer systems, they all were buried because they
| mistakenly bet against Moore's law and economies of scale.
|
| In '91 I was a dedicated Atari ST user convinced of the
| superiority of the 68k architecture, running a UUCP node off
| my hacked-up ST. By the end of '92 I had a grey-box 486
| running early releases of Linux and that was that. I used to
| fantasize over the photos and screenshots of workstations in
| the pages of UnixWorld and similar magazines... But then I
| could just dress my cheap 486 up to act like one and it was
| great.
| kazinator wrote:
| Atari ST and Intel PC are not distant categories. Both are
| "'home computer microcomputer' systems". Not all home
| computer systems can win, just like not all browsers can
| win, not all spreadsheets can win, not all ways of hooking
| up keyboards and mice to computers can win, ...
| cmrdporcupine wrote:
| They were distant on market tier but most importantly on
| economies of scale. The Intel PC market grew
| exponentially.
| kazinator wrote:
| Sure, but the economy of scale came from the success. The
| first IBM PC was a prototype wire-wrapped by hand on a
| large perf board.
|
| When you switched to Intel in 1992, PC's had already
| existed since 1981. PC's didn't wipe out most other home
| computers overnight.
| gspencley wrote:
| I still dream of having a Beowulf Cluster of Crays.
|
| One day ...
| analognoise wrote:
| https://github.com/DarkwaveTechnologies/Cray-2-Reboot
|
| I'm on board for this project?
| hnhg wrote:
| The people that created the Amiga weren't the same people as
| the ones leading Commodore. Apple's success seems to have been
| heavily based on the company's leader being very involved in
| product development and passionate about it.
|
| Along the same lines, there is an alternate timeline where the
| Sharp X68000 took over the world:
| https://www.youtube.com/watch?v=OepeiBF5Jnk
| tombert wrote:
| I've actually seen that video!
|
| Yeah, I think that would also have been a better timeline;
| I'm just stuck in the anglo-world and thus my knowledge is
| mostly limited to what was released in the US or Europe.
| randomdata wrote:
| I'm not sure Apple did continue to succeed after its early
| success. It eventually gave up its name to NeXT, which is the
| company that found later success.
| samatman wrote:
| The standard quip here is that NeXT purchased Apple for
| negative $400 million.
| ip26 wrote:
| We've seen again and again that the high end of the computer
| market can't sustain itself; the mass market outruns it. The
| result is that the high end works best when leveraging the mass
| market instead of trying to compete with it.
|
| See the dominance of Threadripper in workstations, which is
| built on top of mainstream desktop and server parts bin. Or
| look at the Epyc based supercomputers, rumored to be the only
| supercomputers to turn a net profit for the suppliers, thanks
| to leveraging a lot of existing IP.
| prpl wrote:
| It's just a lesson in worse is (often) better. If you can do
| most of the job with something that is either cheaper,
| easier to build, or easier to iterate on, then it will often
| overtake a better engineered solution.
| qqtt wrote:
| My main problem with Silicon Graphics (& have the same problem
| with Sun Microsystems) is that they just tried to do too much
| in proprietary hardware and completely resisted standards.
| Microsoft & IBM "won" because they made computers with actual
| upgrade paths and operating systems with wide support across
| those upgrade paths. With SGI/Sun you were very much completely
| locked in to their hardware/software ecosystem and completely
| at the mercy of their pricing.
|
| In this case, I think the market "chose right" - and the reason
| that the cheaper options won is because they were just better
| for the customer, better upgradability, better compatibility,
| and better competition among companies inside the ecosystems.
|
| One of the most egregious things I point to when discussing
| SGI/Sun is how they were both so incredibly resistant to
| something as simple as the ATX/EATX standard for motherboard
| form factors. They just had to push their own form factors
| (which could vary widely from product to product) and allowed
| almost zero interoperability. This is just one small example
| but the attitude permeated both companies to the extent that it
| basically killed them.
| thisislife2 wrote:
| > _With SGI /Sun you were very much completely locked in to
| their hardware/software ecosystem and completely at the mercy
| of their pricing._
|
| How is that in any way different from Apple today with its
| ARM SoCs, soldered SSDs and an OS that requires
| "entitlements" from Apple to "unlock" features and develop
| on?
| mcculley wrote:
| Are there entitlements or unlockable features other than
| when talking about App Store distribution?
| Gracana wrote:
| You can buy a cheap Mac and easily write programs for it.
| You don't have to spend $40k on a computer, you don't have
| to buy a support contract, you don't have to buy developer
| tools.
| fuzztester wrote:
| >You can buy a cheap Mac and easily write programs for
| it.
|
| Interesting. How cheap? Never used Macs, only Windows and
| Unix and Linux.
| icedchai wrote:
| You can get a Mac Mini for $600-ish. Never get the base
| model though. (FYI, macOS is Unix.)
| cryptoxchange wrote:
| Every time I've checked over the last decade (including
| today), you can buy a mac mini that supports the latest
| macOS for under $250 on ebay. You can also test your app
| using github actions for free if your use case fits in
| the free tier.
|
| There is no way to do this for an IBM z16, which is the
| kind of vendor lock in that people are saying Apple
| doesn't have.
| CountHackulus wrote:
| Thanks to web browsers and web apps it's not QUITE as bad
| of a lock-in nowdays. At least from a general consumer
| point of view.
| dekhn wrote:
| The big exception here is that SGI took IrisGL and made it
| into OpenGL which as a standard lasted far longer than SGI.
| And OpenGL played a critical role preventing MSFT from taking
| over the 3D graphics market with Direct3D.
| pjmlp wrote:
| Except that OpenGL only mattered thanks to Carmack and id
| Software mini-GL drivers.
|
| It hardly matters nowadays for most game developers.
| dekhn wrote:
| When I say "hardware graphics market" I'm referring to
| high performance graphics workstations, not gaming. There
| is a whole multibillion dollar market there (probably
| much smaller than games, but still quite significant).
| It's unclear what Carmack's influence on the high
| performance graphics workstation environment is, because
| mini-GL left out all the details that mattered to high
| performance graphics (line rendering would be a good
| example).
|
| In my opinion, Mesa played a more significant role
| because it first allowed people to port OpenGL software
| to run on software-only cheap systems running Linux, and
| later provided the framework for full OpenGL
| implementations coupled with hardware acceleration.
|
| Of course, I still greatly enjoyed running Quake on
| Windows on my 3dfx card with OpenGL.
| pjmlp wrote:
| Well, put that way it is a market that runs on Windows
| with OpenGL/DirectX nowadays, or if using GNU/Linux, it
| is mostly with NVIDIA's proprietary drivers, especially
| when considering the VFX reference platform.
| JohnBooty wrote:
| If Amiga really "deserved" to win, I think they wouldn't have
| been eclipsed by the PC ecosystem in terms of performance.
|
| They leapt out ahead of the competition with an advanced OS,
| purpose-built for graphics and sound in a way that PCs and Macs
| weren't.
|
| Which was great. But they weren't really _better_ than the
| competition. They were just doing something the competition
| wasn 't. And when the competition _actually started doing those
| things_ they got eclipsed in a hurry.
|
| I wonder if Tesla will suffer the same fate. They were
| obviously around a decade ahead of the established players when
| it came to electric cars. But once the other established
| players actually got serious about electric cars, Tesla largely
| stopped being special, and upstarts like Lucid and Rivian are
| neck and neck with them (in terms of compelling products, not
| sales) as well.
| cduzz wrote:
| "the future is already here, it just isn't evenly
| distributed."
|
| This means there are products out there with futuristic
| features that will be seen as requirements for all things
| going forward and right now those features are niche elements
| of some product.
|
| The Amiga was a fantastic device but not a general purpose
| device. Lots of things are fantastic at a niche but not
| general, and those almost always fail.
|
| Is this also the "worse is better" truism?
| hinkley wrote:
| Tesla will also suffer a reverse cult of personality problem.
|
| I don't know anyone at Rivian so my opinion of them is
| neutral. Meanwhile Tesla is run by the jackass who ruined
| twitter.
| bluedino wrote:
| They were destined for eventually dying like the rest of the
| high end UNIX workstation market. Linux and x86 got better and
| better every year.
| tombert wrote:
| Yeah, and OS X more or less mainstream-ized consumer UNIX as
| well. It gave you access to the UNIX tools in the command
| line if you wanted them, had a solid UNIX core, but was a lot
| cheaper than an SGI and also easy to use.
| epcoa wrote:
| > the Amiga really was better than anything Apple or
| Microsoft/IBM was doing at the time
|
| At the _time_. A brief moment in time, and then they had no
| path forward and were rapidly steamrolled. Nothing was "chosen
| wrong" in this aspect.
| tombert wrote:
| Well, wait, the Amiga had preemptive multitasking way before
| Apple or Windows got it, like the mid 80s. I don't think
| Windows got it until Windows NT, and it didn't become
| mainstream until Windows 95. Macs had bizarre cooperative
| multitasking that would freeze if you just thought about it
| funny [1] all the way until OS X.
|
| There's other stuff too; they had better color graphics in
| the 80s while DOS was still dealing with CGA and EGA, and
| decent sound hardware. Even by 1990, the video toaster was
| released, well before it got any port to DOS.
|
| [1] I'm sure it got better, my first exposure to it was
| System 7 and that thing was an unholy mess. I didn't touch
| macOS again until OS X.
| epcoa wrote:
| Long before Windows 95 there was DOOM and DOOM would not
| run on an Amiga.
|
| > 80s while DOS was still dealing with CGA and EGA, and
| decent sound hardware.
|
| And then the 80s ended. What point did I make that you are
| contradicting?
|
| > Even by 1990, the video toaster was released,
|
| And if you wanted to do CAD? Would you use an Amiga?
| Probably not. What about desktop publishing? Pointing out
| that Amiga had carved out a niche (in video editing) when
| that was the norm back in those days doesn't make any
| strong comment about the long term superiority or viability
| of the platform.
|
| Also, I don't buy into the idea that just because a company
| had something "superior" for a short period of time with no
| further company direction that they didn't lose fair and
| square. That Amiga had something cool in the 80s but didn't
| or couldn't evolve isn't because the market "chose wrong".
| Commodore as a company was such a piece of shit it made
| Apple of the 80s look well run. Suffering a few more years
| with the occasional bomb on System 7 was not a market
| failure.
|
| > Macs had bizarre cooperative multitasking
|
| What was bizarre about it, compared to any other
| cooperative multitasking system of the time? Also you seem
| to be fixated on preemptive multitasking to the neglect of
| things like memory protection.
| tombert wrote:
| > Long before Windows 95 there was DOOM and DOOM would
| not run on an Amiga.
|
| Yeah fair. I do wonder if a port like the SNES version
| would have been possible if id had greenlit it,
| but that's a "what if" universe. Alien Breed 3D would run
| on a 1200, but IIRC it ran pretty poorly on that.
|
| > And then the 80s ended. What point did I make that you
| are contradicting?
|
| I mean, yes, VGA cards and Soundblaster cards were around
| in 1990, but they weren't really standard until several
| years later.
|
| > And if you wanted to do CAD? Would you use an Amiga?
| Probably not. What about desktop publishing? Pointing out
| that Amiga had carved out a niche (in video editing) when
| that was the norm back in those days doesn't make any
| strong comment about the long term superiority or
| viability of the platform.
|
| Also fair. I'll acknowledge my view is a bit myopic,
| since I don't really do CAD or desktop publishing, but I
| do some occasional video editing, and I do think Amigas
| were quite impressive on that front. You're right in
| saying it was a "niche" though.
|
| > Commodore as a company was such a piece of shit it made
| Apple of the 80s look well run.
|
| No argument here. Still think that the hardware was
| pretty cool though.
|
| > What was bizarre about it
|
| I guess "bizarre" was the wrong word. It was just really
| really unstable, and System 7 would constantly freeze for
| seemingly no reason and I hated it.
|
| > Also you seem to be fixated on preemptive multitasking
| to the neglect of things like memory protection.
|
| I feel like if Commodore had been competently run, they
| could have done work to get proper protected memory
| support, but again that's of course a "what if" universe
| that we can't really know for sure.
|
| I guess what frustrates me is that it did genuinely feel
| like Commodore was really ahead of the curve. I think the
| fact that they had something pretty advanced like
| preemptive multitasking (edit: fixed typo) in the mid 80s
| was a solid core to build on, and I do kind of wish it
| had caught on and iterated. I see no reason why the Amiga
| _couldn't_ have eventually gotten decent CAD and desktop
| publishing software. I think Commodore didn't think they
| had to keep growing.
| icedchai wrote:
| The Amiga OS was designed in a way that protected memory
| support was basically impossible. Message passing was
| used everywhere. How did it work? One process ("task",
| technically) sent a pointer to another, a small header
| with arbitrary data, which could contain anything,
| including other pointers. Processes would literally read
| and write each other's memory.
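|
| Roughly, a message looked something like this (a minimal sketch
| assuming the classic AmigaOS Exec headers; MyMsg and its payload
| field are invented for illustration):
|
|     #include <exec/ports.h>
|     #include <proto/exec.h>
|
|     struct MyMsg {
|         struct Message msg;  /* standard Exec header           */
|         char *payload;       /* raw pointer into sender memory */
|     };
|
|     /* Sender: no copy, no marshalling, just the pointer. */
|     void send(struct MsgPort *dest, struct MyMsg *m, char *buf)
|     {
|         m->payload = buf;
|         PutMsg(dest, (struct Message *)m);
|     }
|
|     /* Receiver: can scribble on the sender's buffer directly. */
|     void receive(struct MsgPort *port)
|     {
|         struct MyMsg *in;
|         WaitPort(port);
|         while ((in = (struct MyMsg *)GetMsg(port)) != NULL) {
|             in->payload[0] = '!';   /* the other task's memory */
|             ReplyMsg((struct Message *)in);
|         }
|     }
|
| Since every message is just a shared pointer, putting an MMU-
| enforced wall between tasks would have broken essentially all
| existing software.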
| logicprog wrote:
| > they had no path forward
|
| This is, I think, the premise that you and people like me (who
| think Amiga could have gone on to do great things) disagree
| on. Most Amiga fans would say that it totally had a
| path forward, or at least that there is no evidence that it
| didn't, and that the failure to follow that path therefore
| wasn't an inherent technical problem, but a problem of
| politics and management. Do you have any evidence to the
| contrary?
| sys_64738 wrote:
| Commodore's story is more about achieving the impossible with
| 1-2 engineers building each computer. Commodore was a company
| built around Jack Tramiel who wanted his widgets to ship in
| volumes to "the masses, not classes". When he left then it
| was a lifestyle sucking cash machine for Irving Gould who
| appointed incompetent CEO after incompetent CEO after
| Tramiel. The miracle is it staggered on ten years post-Jack.
|
| But the reality is the Commodore 64 kept Commodore going
| during most of that period rather than Amiga sales. It's
| similar to Apple, where the Apple II kept Apple afloat during
| the 80s and 90s until Steve returned.
| cmrdporcupine wrote:
| Times changed though, too, and Tramiel couldn't replicate
| his success with the C64 at Atari Corp, despite bringing the
| same philosophy (and many key engineers) over there.
|
| By the late 80s the "microcomputer" hobby/games market was
| dead and systems like the ST and Amiga (or Acorn
| Archimedes, etc.) were anachronisms. You had to be a PC-
| compat or a Mac or a Unix workstation or you were dead.
| Commodore and Atari both tried to push themselves into that
| workstation tier by selling cheaper 68030 machines than
| Sun, etc, but without success.
| snakeyjake wrote:
| >One example is the Amiga; the Amiga really was better than
| anything Apple or Microsoft/IBM was doing at the time
|
| Amiga was only better 1985-1988.
|
| I still have my original Amiga and A2000. I was an Amiga user
| for a decade. They were very good. I was platform agnostic,
| caring only to get work done as quickly and easily as possible
| so I was also an early Macintosh user as well as Sun and PA-
| RISC. And yes, I still have all of those dinosaurs too.
|
| By 1987 PC and Mac caught up and never looked back.
|
| But by 1988 the PS/2 with a 386 and VGA was out and the A2000
| was shipping with a 7MHz 68000 and ECS.
|
| By 1990 the 486s were on the market and Macs were shipping with
| faster 030s and could be equipped with NuBUS graphics cards
| that made Amiga graphics modes look like decelerated CGA.
|
| After the A2000 the writing was on the wall.
|
| Note: my perspective is of someone who has always used
| computers to do work, with ALMOST no care for video games so
| all of the blitter magic of Amiga was irrelevant to me. That
| being said when DOOM came out I bought a PC and rarely used my
| Amigas again.
|
| What I can confidently assert is that I upgraded my A2000 many
| times and ran into the absolute configuration nightmare that is
| the Amiga architecture and the problems with grafting upgrades
| onto a complex system with multiple tiers of RAM and close OS
| integration with custom chips.
|
| One more bit of heresy is that I always considered Sun's
| platform to be superior to SGI's.
| logicprog wrote:
| > Amiga was only better 1985-1988. By 1987 PC and Mac caught
| up and never looked back.
|
| Oh indubitably! I don't think even the most committed Amiga
| fan, even the ones that speculate about alternate histories,
| would deny that at all.
|
| The thing is, though, that only happened because Commodore
| essentially decided that since it had so much of a head
| start, it could just rest on its laurels and not really
| innovate or improve anything substantially, instead of
| constantly pushing forward like all of its competitors would
| do, and so eventually the linear or even exponential curve of
| other hardware manufacturers' improvements outpaced its
| essentially flat improvement curve. So it doesn't seem like
| IBM PCs and eventually even Macs outpacing the power of Amiga
| Hardware was inevitable or inherent from the start.
|
| If they had instead continued to push their lead -- actually
| stuck with the advanced Amiga chips that they were working on
| before it was canceled and replaced with ECS for instance --
| I certainly see the possibility of them keeping up with other
| hardware, and eventually transitioning to 3D acceleration
| chips instead of 2D acceleration chips when that happened in
| the console world, eventually perhaps even leading to the
| Amiga line being the first workstation line to have GPUs,
| and further cementing their lead, while maintaining
| everything that made Amiga great.
|
| Speculating even further: as we are seeing currently with the
| Apple M-series, a computer architecture composed of a ton of
| custom-made special-purpose chips is actually an extremely
| effective way of doing things; what if
| Amiga still existed in this day and age and had a head start
| in that direction, a platform with a history of being
| extremely open and well documented and extensible being the
| first to do this kind of architecture, instead of it being
| Apple?
|
| Of course there may have been fundamental technical flaws
| with the Amiga approach that made it unable to keep up with
| other hardware even if Commodore had had the will; I have
| seen some decent arguments to that effect, namely that since
| it was using custom vendor-specific hardware instead of
| commodity hardware that was used by everyone, they couldn't
| take advantage of cross-vendor compatibility like IBM
| PCs could, and also couldn't take advantage of economies of
| scale like Intel could, but who knows!
| pjmlp wrote:
| From retrogaming talks from former Commodore engineers, the
| issues were more political and management than technical
| alone.
| logicprog wrote:
| That's definitely how it seems to me, which is why I
| focused on Commodore's poor management decisions first and
| only mentioned the possible technical issues second.
| AnimalMuppet wrote:
| That's kind of typical, though, isn't it? When a company
| falls off, it's almost always not just technical.
| pjmlp wrote:
| It took until a bit after 1990, with 16-bit sound cards, Super
| VGA screens, and Windows 3.1 widely adopted, for the PC
| to outperform the Amiga, especially at European price points.
|
| My first PC was acquired in 1992, and still only had a lousy
| beeper, on a 386SX.
| geophile wrote:
| I was similar, not really interested in graphics, just a nice
| programming environment. PCs had that stupid segmented
| address space (which was not ignorable at the programming
| language level), expensive tools, and crappy OSes. My Amiga
| 2000 had a flat address space, a nice C development
| environment, and multitasking actually worked. It really was
| ahead of its time, in combining a workstation-like
| environment and an affordable price.
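|
| For anyone who never had to deal with it: on a 16-bit DOS
| compiler, segmentation leaked straight into your C. A minimal
| sketch (assuming a real-mode compiler such as Turbo C, where
| "far" and MK_FP are vendor extensions):
|
|     #include <dos.h>   /* MK_FP() in Turbo/Borland C */
|
|     void demo(void)
|     {
|         /* A far pointer carries an explicit segment:offset pair. */
|         char far *video = (char far *)MK_FP(0xB800, 0x0000);
|         video[0] = 'A';   /* poke text-mode video memory */
|
|         /* Near pointers are 16-bit offsets, so no single object
|            can exceed 64KB without "huge" pointers and a choice
|            of memory model (tiny/small/compact/large/huge). */
|     }
|
| On the Amiga's 68000, a pointer was just a pointer.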
| snakeyjake wrote:
| >My Amiga 2000 had a flat address space
|
| Chip ram, fast ram, cpu ram, expansion board ram, or slow
| ram? Did too much ram force your zorro card into the
| slooooooooooow ram address space (mine did)? Tough cookies
| bucko!
|
| Macintosh, pounding on table: "RAM is RAM!"
| logicprog wrote:
| As someone trying to get into Amiga retro computing as a
| hobby in today's day and age, I find keeping all the
| different types of RAM straight very confusing lol
| dylan604 wrote:
| We kept our A2000 viable longer by adding the CPU board with
| the 030 chip. We went from 7MHz to somewhere around 40MHz or
| whatever. It meant that my Lightwave render went from 24
| hours per frame to a few hours per frame.
| icedchai wrote:
| I think you are mostly right, I just think your timing is
| off. Those early 386 machines and Mac II systems were very
| expensive, at least 2 to 3x the cost of an Amiga. The average
| home user wasn't going to drop $8K on a PS/2 model 80 with a
| 386/16.
|
| By the early 90's the Amiga just wasn't competitive. The chip
| set barely evolved since 1985. ECS barely added anything over
| the original chip set. By around 1992 or 1993, 386 systems
| with SVGA and Soundblaster cards were cheap. Amiga AGA was
| too little, too late. Also consider the low end AGA system
| (Amiga 1200) was totally crippled with only 2 megs of slow
| "chip" RAM.
|
| I was an Amiga fan until 1993. Started with an A500, then
| A3000. Eventually I moved on to a 486 clone w/Linux. Later on
| I had a Sun SparcStation 10 at home, so I agree with you on
| Sun and SGI.
| mtillman wrote:
| The Amiga couldn't handle the performance requirements of Doom
| at the time (see the Game Engine Black Book: Doom). Workbench was
| more fun than Windows, and at least more fun than the install
| process that was early Linux.
|
| As much as I loved my O2 (my first professional computer), it
| was underpowered for the time for anything other than texture
| manipulation. The closed source nature of that time period and
| the hardware sales motion meant that you were paying through
| the teeth for compilers on top of already very expensive
| hardware. The Cray-linked Origin 200's ran Netscape web server
| with ease but that's a lot of hardware in a time period when
| everything went out of date very quickly (we donated ours!). Irix
| still looks better than the new macOS UIs IMO, but losing Motif is
| a small price to pay for far cheaper access to SDKs. Also,
| Irix was hilariously insecure due in part to its closed source
| nature. https://insecure.org/sploits_irix.html
| downut wrote:
| "... hardware sales motion meant that you were paying through
| the teeth for compilers..."
|
| For Fortran? My memory is hazy but at NASA NAS a bunch of us
| were using gcc/g++ starting ~1990. g++ was... an adventure.
| Building my own (fine!) compiler for free(!) got me hooked on
| OSS to the point that when Linux/FreeBSD launched I jumped in
| as fast as I could.
|
| I really loved my various SGI boxen. Magical times. I was a
| NASA "manager" so had the Macintosh "manager" interface box
| that I solved by keeping it turned off.
| axpvms wrote:
| >Also, Irix was hilariously insecure due in part to its
| closed source nature.
|
| That was in addition to having three default accounts with
| well known passwords and a telnet server.
| icedchai wrote:
| Some versions of IRIX (4.x, maybe?) also defaulted to
| having X11 authentication disabled. Anyone in the office
| could "xmelt" your screen... or worse.
| HarHarVeryFunny wrote:
| The reason SGI failed, and eventually Sun too, isn't because
| the world "chose wrong", but because their performance simply
| did not keep up with x86.
|
| When these RISC-based workstations were initially released
| their performance, especially at graphics, was well beyond what
| a PC could do, and justified their high prices. A "workstation"
| was in a class by itself, and helped establish the RISC
| mystique.
|
| However, eventually Intel caught up with the performance, at a
| lower price, and that was pretty much the end. Sun lived on for
| a while based on their OS and software ecosystem, but
| eventually that was not enough especially with the advent of
| Linux, GCC, etc, as a free alternative.
| hinkley wrote:
| Sun really struggled to make full use of their multicore
| systems. That m:n process model is coming back with fibers
| and libuv, but we have programming primitives and a deeper
| roster of experienced devs now than we did then. Back then
| they caused problems with scalability.
|
| There were times when Java ran better on Intel than on
| Solaris.
| sys_64738 wrote:
| Sun had the perfect opportunity with Utility Computing around
| the mid-2000s but when cloud took off we had Oracle buying
| SUNW. They killed Sun Cloud which had the opportunity to be
| big, vast, and powered by JAVA hardware.
|
| Sun Microsystems was a company like no other. The last of a
| dying breed of "family" technology companies.
| msisk6 wrote:
| I was at the MySQL conference when it was announced that
| Oracle was buying Sun. It just took all the life out of the
| conference. All the Sun folks were super pissed off. Truly
| the end of an era.
| icedchai wrote:
| I remember that time. It felt like Sun was on death's
| doorstep since the dot-com crash. On the hardware side,
| the market was flooded with used Sun hardware. On the
| software side, Linux was "good enough" for most
| workloads.
| hodgesrm wrote:
| I was there too. It certainly felt "timed" to maximize
| the sense of deflation for people working on MySQL.
| Perhaps it was just coincidence. IIRC Larry Ellison said
| that the crown jewel in the deal was actually Java.
| cduzz wrote:
| Ivan Sutherland described the reason [1] why PCs won a long
| time ago. Basically a custom tool may do a thing "better"
| than a general purpose tool for a while, but eventually,
| because more resources are spent improving the general tool,
| the generalized tool will be able to do the same thing as the
| specialty tool, but more flexibly and economically.
|
| [1] http://www.cap-lore.com/Hardware/Wheel.html
| jandrese wrote:
| SGI dug their own grave. Not only were the workstations
| expensive, but they demanded outrageously priced support
| contracts. This behavior drives people nuts and will ensure
| that they switch to a competitor the instant it becomes an
| option. Despite the high cost, the support contracts had a
| pretty lousy reputation as well, with long wait times for
| repairs from a handful of overworked techs. Even worse, the
| company turned away from its core competencies to focus on
| being an also-ran in the PC workstation market.
|
| There was a window in the mid-90s where it would have been
| possible for SGI to develop a PC 3D accelerator for the
| consumer market using their GE technology, but nobody in the
| C-Suite had the stomach to make a new product that would
| undercut the enormous profit margins on their core product.
| It's the classic corporate trap. Missing out on the next big
| thing because you can't see past next quarter's numbers.
| Imagine basically an N64 on a PCI card for $150 in 1996. The
| launch versions could be bundled with a fully accelerated
| version of Quake. The market would have exploded.
| grumpyprole wrote:
| > The market would have exploded
|
| Absolutely, they could have been where Nvidia is now!
| christkv wrote:
| Or they could have 3dfxed themselves.
| Keyframe wrote:
| I'd argue Nvidia is ex-SGI, and so is ATI. It was all
| their crew in the beginning.
| foobarian wrote:
| I wish we could have a debugging view of the universe,
| draw a diagram with clusters of people labeled with
| company names, and watch them change over time. :-)
| jmtulloss wrote:
| This view would certainly explain to people outside of
| Silicon Valley/ SF why the Bay Area has been so dominant
| in our industry for so many years.
| cduzz wrote:
| Ugh.
|
| Worked at a university in the early 90s.
|
| Maybe irix was okay to use if you were just sitting in front
| of it doing rando user / graphics things, but administering
| it was unbearable. The license fees to get OS updates were
| exorbitant; you'd have to get wacky new licenses to enable
| NFS or NIS and you'd need new kernels for just about
| anything.
|
| As far as I could tell they were a cursed company that hated
| their users. "Here's a pretty thing that does one thing well
| but is otherwise insane and will ruin you when you need it
| most."
|
| Good riddance.
| knorker wrote:
| Well, for SGI that's like saying the world "chose wrong" that
| long distance travel is not done by Saturn V rockets.
|
| The Saturn V was clearly a technical marvel better than any
| plane, and it'd get you anywhere much faster.
|
| If you spare no expense, you get a better product. Sure. I'm
| also not surprised that a $100k BMW is more comfortable than a
| Renault Clio.
| pjmlp wrote:
| Yes, Irix is one of the few UNIX based OSes that I actually
| find cool.
| hinkley wrote:
| It certainly got fewer complaints than HP-UX.
| pjmlp wrote:
| On HP-UX 10, back in 2000, the C compiler version I was
| using still wasn't fully ANSI C, and needed K&R C function
| declarations, but hey at least we had containers (HP
| Vault), and 64 bit file system access.
| fuzztester wrote:
| What were the complaints that HP-UX used to get?
|
| I used it for a while earlier at work, and don't remember
| many problems with it. One did have to apply OS patches
| fairly regularly to it, but IIRC, that process was somewhat
| smooth.
| hinkley wrote:
| In the time of SGI, I believe it had a lot of POSIX
| compliance problems.
|
| And if memory serves, the Bible
| (https://www.goodreads.com/book/show/603263.Advanced_Programm...)
| didn't cover it,
| which was a problem.
| sys_64738 wrote:
| People are always passionate about various UNIX systems and
| their derivatives like Linux. Windows is so utilitarian.
| pjmlp wrote:
| Outside Irix, Tru64, Apollo, Solaris with NeWS, NeXTSTEP,
| all other UNIXes are pretty meh.
|
| Regarding Windows, some time reading the excellent Windows
| Internals book series is recommended.
| cladopa wrote:
| I never had an Amiga, but I had friends that had it. It was a
| superior tech only for a very small period of time.
|
| What happened was Intel: they made great decisions, like
| automating the design of their processors, and this made them
| grow at an incredible pace. The Amiga depended on a different
| processor that stagnated.
| KerrAvon wrote:
| The 68k CPU lineup at the heart of the Amiga was competitive
| well into the 90's; the Amiga had run out of juice by 1989.
| The Amiga was only as good as the custom chips. If Commodore
| kept investing in R&D for the custom chips, they would have
| at least remained competitive.
| sys_64738 wrote:
| Intel never pulled ahead until the Pentium but by then
| Motorola weren't interested in the 68K series.
| cameldrv wrote:
| I used some SGIs in the mid-late nineties, and they did have
| cool 3D graphics capabilities. I found 4dwm to be kind of cool
| but mostly gimmicky and it was really slow on the Indy and O2.
| Windows 95/NT were much snappier on contemporary hardware.
|
| By '97 or so SGI actually had essentially given up competing
| when they shut down the team that was developing the successor
| to InfiniteReality.
|
| In a sense though, Silicon Graphics did become more standard,
| in that their original 3D framework was Iris GL, which then
| evolved into OpenGL, which became the main 3D graphics standard
| for many years.
| rongenre wrote:
| I played with SGI machines in college and they felt like.. the
| future. I really hoped they would hire me when I graduated.
|
| Incredible, though, how the relatively cheaper Windows NT
| machines and 3dfx cards and graphics software just killed them. I
| was a little sad when I wandered around the campus of an employer
| in Mountain View and noticed the fading sign that had what was
| left of the SGI logo.
| jandrese wrote:
| The awesome old cube logo or the new "we spent millions of
| dollars on a professional marketing department to design a new
| logo" that is just the initials in a boring font and off
| center?
|
| I co-oped for SGI onsite in the sales/marketing/support for a
| major ISP of the day back in the late 90s and the buzz around
| the office was that the company (at this point experimenting
| with overpriced Windows NT boxes and generic Linux servers) was
| experiencing massive brain drain to some brand new startup that
| was going to make something called a "GeForce" card for cheap
| PCs that was going to avoid the pitfalls of the then popular
| Voodoo cards. Apparently the engineers were unhappy with the
| direction the company was taking under the new leadership and
| thought that there was still an interest in graphics
| acceleration.
| mrpippy wrote:
| The "sgi" logo was a big step down from the cube, but it was
| a lot more attractive than the Rackable/Silicon Graphics
| International "sgi" logo that looked like a cheap knockoff of
| the previous one.
|
| https://en.wikipedia.org/wiki/Silicon_Graphics_International
| theideaofcoffee wrote:
| It really was a letdown when Rackable resurrected SGI and
| then brought about that ... thing of a logo. It just felt like
| it hollowed out the brand even more. Even if SGI itself was
| still making some interesting hardware at the time (namely
| the Altix 4700, UV, and ICE), the soul just wasn't there
| anymore.
| technothrasher wrote:
| > I played with SGI machines in college and they felt like..
| the future.
|
| I had a couple of Indigos that I supported while an
| undergraduate (I had a student job with the University's Unix
| group in their computing center), and the SGIs felt to me
| exactly like the Amiga- Really cool, but kind of lopsided. I
| tended to do most of my work on the SPARCstations and ignore
| the SGIs unless I specifically wanted to play with the graphics
| stuff.
|
| I actually still have an Indigo XS24 that I collected at one
| point over the years. Tried to get it to boot a bit ago but
| it's dead, unfortunately.
| nullindividual wrote:
| 3Dfx didn't play in the SGI space. But Matrox (for 2D), 3Dlabs
| (another RIP), Orchard (used 3Dlabs chip), STB (again, 3Dlabs
| chip...), and Diamond (uh... 3Dlabs!) did.
|
| 3Dfx grew up in the arcade market. They were always consumer-
| focused.
| davepeck wrote:
| I was there near the end. First, as a summer intern in 1998, and
| then in 1999 as a full time engineer on what is now Google's
| Mountain View campus. SGI had always been a dream company for me.
| I'd first learned about them in high school; now, right out of
| college, I'd somehow managed to land a dream job.
|
| SGI's hardware was cutting-edge and exotic. IRIX was killer
| (sorry Solaris). Cray was a subdivision. My coworkers used emacs,
| too. They put an O2 on my desk!
|
| The dream didn't last long. Major layoffs hit just a few months
| after I started full time. I wrote about the experience here:
| https://davepeck.org/2009/02/11/the-luckiest-bad-luck/
| mrpippy wrote:
| What did you work on at SGI during your brief stint?
| davepeck wrote:
| MineSet, their data mining and visualization package.
| oaktowner wrote:
| I worked at Google from 2013 to 2020. There were definitely
| employees (maybe a majority) who assumed that Google would
| _always_ be _the_ dominant force in technology. Those of us who
| were a bit older always understood that _everything_ changes in
| Silicon Valley.
|
| Those buildings represented that change to me. I can remember
| coming to concerts at the Shoreline in the 90s and looking at
| those Silicon Graphics buildings: they _looked_ so cool, and
| they represented the cutting edge of technology (at the time).
| And yet...it all disappeared.
|
| Same goes for the Sun campus which is where Meta/Facebook is
| now. Famously, the Facebook entrance sign is literally the same
| old Sun sign, just turned around! [0]
|
| So I always cautioned co-workers: this too, shall pass. Even
| Google.
|
| [0] https://www.businessinsider.com/why-suns-logo-is-on-the-
| back...
| dbreunig wrote:
| Meta still has the Silicon Graphics logos on a few glass
| conference room doors in building 16, I believe. At least
| they were there in 2012.
|
| Great memento mori.
| samatman wrote:
| Presumably you mean the Sun logo:
| http://www.logobook.com/logo/sun-microsystems/
|
| Which is one of the all-time greats IMHO. I'd keep it
| around too.
| ryandrake wrote:
| I graduated undergrad in 1998 and can confirm that SGI was
| _the_ company to go to. I felt so jealous of those few guys who
| had SGI offers, where I had to settle for a more generic PC
| graphics company. History is what it is, but SGI really had
| that luster that only a handful of companies ever boasted.
| alecco wrote:
| I had to support an open source library for all major unixes
| and the Irix compiler was by far the best one. It took years
| for the rest to catch up. But it took ages to compile with
| optimizations on. Good times.
| dxbydt wrote:
| > SGI had always been a dream company
|
| It was a dream company for pretty much every siggraph person at
| that time. I was in grad school, eagerly awaiting a very
| popular 3-semester course in computer graphics. It had been
| devised and taught by a young promising professor who had
| published some pioneering siggraph papers. I signed up for the
| course. On the first day of class, the head of the department
| walked in and said the professor had been recruited by his
| dream company SGI for an ungodly sum of money to work on some
| Jewish director's movie about a dinosaur themepark. I thought
| ok, whatever, someone else will teach the course. The bastards
| scrapped the entire 3-semester computer graphics module because
| there wasn't anyone else who could teach that. So we had to
| pick from one of the usual dumb options - databases, OS,
| Networks, Compilers. Since then I've always held a grudge
| against sgi.
| brcmthrowaway wrote:
| Jewish director? Hrmph
| Y_Y wrote:
| Spielberg had a bar mitzvah, what more do you want?
| ska wrote:
| > SGI's hardware was cutting-edge and exotic.
|
| This was their downfall, trying to scale out adoption with
| esoteric hardware.
|
| I remember being quoted $18k ish for memory upgrade on a O2 or
| origin, same amount of memory I had just bought for $500 for an
| intel Linux box at home.
|
| Sure, it wasn't apples to apples, but I remember thinking very
| clearly that this wasn't going to end well for SGI.
| assimpleaspossi wrote:
| I was a system engineer for SGI in 1992 working mainly with
| McDonnell-Douglas in St Louis. It was thrilling to be sitting in
| the cafeteria and have Jim Clark plop down next to me for lunch.
| Just one of the guys.
|
| As an outsider--cause I didn't live and work in California--this
| was the go-go atmosphere of such companies back then where they
| thought they could do no wrong. And the after work parties were
| wild (how the heck do you break off half a toilet bowl?).
|
| One of the buildings had plastic over the windows cause that's
| where they were working on the plugin GL card for the PC. (Ssh!
| No one's supposed to know that!)
|
| Being the first system engineer in St Louis, my eyes lit up when
| my manager told me he had ordered a 16-core machine for my
| office--just for me!
|
| I was hired as a video expert. The company re-org'ed and my new
| boss decided he needed a Fortran expert so that was the end of my
| job with SGI.
| johndhi wrote:
| When I was a kid I remember my brother and I asking my dad a ton
| of questions about SG. We viewed them as this amazing awesome
| company that made cool looking towers and the fastest computers
| in the world.
| matthewmcg wrote:
| Much of this history is also described in Michael Lewis's book
| _The New New Thing_ (2000), which profiles Clark and his various
| ventures. It's really a snapshot of pre-dot-com-crash Silicon
| Valley.
| CalChris wrote:
| My takeaway from that book was that Clark invented the dot com.
| mobilio wrote:
| This was explained here:
| https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
|
| https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
|
| https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
|
| https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
|
| https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
|
| https://vizworld.com/2009/05/what-led-to-the-fall-of-sgi-epi...
| dekhn wrote:
| I was in love with SGI when I was an undergraduate just over the
| hill at UC Santa Cruz in the early to mid 90s. Everything about
| the machines appealed to me: their industrially designed but
| wonderfully colorful cases, the sexy desktop OS ("This is UNIX. I
| know this!"), and the way IrisGL rendered molecular graphics.
|
| Driving to a Phish show at Shoreline, we passed the low-slung
| office buildings of SGI which seemed like the sexiest place to
| work. When I graduated, I thought I was "too dumb in CS" to get a
| job in Mountain View and went to grad school in biophysics
| instead.
|
| By the time I was a few years into grad school, I worked in a
| computer graphics lab outfitted with Reality Monsters and Octanes
| and other high end SGIs (when you maxxed out an SGI's graphics
| and RAM, they were really fast). I was porting molecular graphics
| code to Linux using Mesa (much to the derision of the SGI fans in
| the lab). When we got a FireGL2 card it had a linux driver and
| could do reasonable molecular graphics in real time and the SGI
| folks looked real scared (especially because the SGI Visual
| Workstation had just come out and was a very expensive turkey).
|
| Less than a decade after that I was working in those very
| buildings for Google. Google took over SGI's old HQ (Jeff Dean
| told me there was a period where Google and SGI overlapped in the
| GooglePlex and the SGI folks looked very sad as they paid for
| their lunches and teh googlers got free food). There was still
| plenty of SGI signage strewn about. And now Google has gone dumb
| and also built their own HQ next door (note the correlation
| between large SV companies building overly fancy HQs and then
| going out of business).
|
| Such is the cycle of sexy tech.
| dalke wrote:
| We've talked before about our respective molecular graphics
| background.
|
| I started with Unix on a Personal IRIS as an undergrad working
| in a physics lab which used it for image capture and
| analysis. I was the nominal sys admin, with one semester of
| Minix under my belt and just enough to be dangerous. (I once
| removed /bin/cc because I thought it was possible to undelete,
| like on DOS. I had to ask around the meteorology department for
| a restore tape.)
|
| The summer before grad school I got a job at the local
| supercomputing center to work on a parallelization of CHARMm,
| using PVM. I developed it on that PI, and on a NeXT. That's
| also when I learned about people at my future grad school
| working on VR for molecular visualization, in a 1992 CACM
| article. So when I started looking for an advisor, that's the
| lab I chose, and I became the junior co-author and eventual
| lead developer of VMD.
|
| With a Crimson as my desktop machine, a lab full of SGIs and
| NeXTs, and the CAVE VR setup elsewhere in the building. Heady
| times.
|
| I visited SGI in 1995 or so, on holiday, thinking that would be
| a great place to work. They even had an Inventor plugin for
| molecular visualization, so I thought it would be a good lead.
| I emailed and got an invitation to visit, where the host kindly
| told me that they were not going to do more in molecular
| visualization because they wanted to provide the hardware
| everyone uses, and not compete in that software space.
|
| In the early 1990s SGIs dominated molecular modeling (replacing
| Evans & Sutherland), so naturally the related tools, like
| molecular dynamics codes, also ran on them. But we started
| migrating to distributed computing, where it didn't make sense
| to have 16 expensive SGIs, leaving them more as the head node,
| which, as you pointed out, was soon able to run just fine on a
| Linux machine.
| latchkey wrote:
| Back in 1993, I was in college and working for the extended
| education department, running all their computer
| infrastructure.
|
| One day, someone wheeled this approx. 3x3 foot sized box to my
| door and asked me if I wanted it. It was an SGI Onyx with a giant
| monitor sitting on top, with a keyboard and mouse.
|
| I plugged it in and it sounded like an airplane taking off. It
| immediately heated up my entire tiny office. It was the 4th Unix
| I had ever played with (Ultrix, NeXT and A/UX were previous
| ones). It had some cool games on it, but beyond that, at the
| time, I had no use for it because A/UX on my Quadra 950 was so
| much more fun to play with.
|
| I don't even think I ever opened it up to look at it. I don't
| know what I was thinking. lol.
|
| After realizing it did not have much going for it, I ended up
| just turning it on when the office was cold and using it as a
| foot rest.
|
| Oh yea, found a video...
| https://www.youtube.com/watch?v=Bo3lUw9GUJA
| nonrandomstring wrote:
| I agree that these machines and their OS were too proprietary and
| over-engineered to weather the PC revolution. But when I think
| back to my days using Suns and SG (Indigo) the memory feels like
| driving a Rolls Royce or Daimler with leather seats and walnut
| panels.
| Uhhrrr wrote:
| The article doesn't mention the reason for the fall: less good
| but cheaper competitors. First Sun, then Windows NT and Linux.
| cf100clunk wrote:
| > The article doesn't mention the reason for the fall: less
| good but cheaper competitors
|
| The article has this: ''As Bob Bishop took the reigns of SGI,
| things looked dark. AMD announced their 64 bit architecture in
| October, PC graphics had made massive strides while remaining
| significantly less expensive than SGI's offerings, NT was
| proving to be a solid and less expensive competitor to UNIX,
| Linux was eating away at traditional UNIX market segments, and
| Itanium still hadn't launched.''
|
| I can agree with almost all of that statement but I object to
| the ''NT was proving to be a solid and less expensive
| competitor to UNIX'' part as mostly false in any mixed OS
| environment over which I'd ever been admin.
| BirAdam wrote:
| Well, do remember the cost of a UNIX license at the time
| (unless you were using BSD). If you didn't have thousands of
| dollars on hand, NT was a good choice.
| Uhhrrr wrote:
| But that's 1999, and they were already losing money in 1997,
| and the article doesn't say why. Sun was why.
| pipeline_peak wrote:
| Look harder
| browningstreet wrote:
| I had an Indigo2 on my desk in college. I moved back and forth
| between that and a NeXT cube that a colleague had in their lab.
| The NeXT was nice but SLOW. The Indigo2 wasn't especially fast
| but it was nice and could do visual things that just weren't
| available as readily on our alternatives. We had SunOS and
| Solaris systems that were mostly used for network and engineering
| projects, and I was engaged in some visualization work. When the
| O2 was announced I was quite sure it would be the solution to our
| speed issues. Around the same time, another colleague was the
| first to install a beta of Win95, and it did seem awfully pretty.
| ThinkBeat wrote:
| I remember when we got the first Indys at the uni. It was like
| magic. People nearly got into physical fights to use one of them.
| People came in at night to use them as well.
|
| I wonder if the uni is so locked down now that students can no
| longer sit in the lab all night.
|
| Being a bit pragmatic about getting my actual thesis done, I
| discovered that there were, all of a sudden, a lot more resources
| available on one of the (older) Sun servers.
|
| It saved me days if not weeks.
| vondur wrote:
| It's pretty simple why these Unix vendors all died. Linux and
| Intel chips. Sure you could get a really nice Sun system at the
| time with all of the redundancy tech built in, which cost around
| $50k. Or you could go and get 4 or 5 Linux servers from Dell
| running Red Hat, which by the early 2000s were faster too.
| KineticLensman wrote:
| They had me at 'Skywriter Reality Engine'. I'd used vaxen at Uni
| in the 80s, then got into industry using Symbolics Lisp machines,
| then had several dark years programming on DOS boxes. The return
| to a truly innovative workstation blew my mind
| sneed_chucker wrote:
| I remember to thank SGI every time I format an XFS filesystem.
| betaby wrote:
| Yes, XFS has been in the Linux kernel since 2001! Time flies.
| martinpw wrote:
| Whenever this topic comes up there are always comments saying
| that SGI was taken by surprise by cheap hardware and if only they
| had seen it coming they could have prepared for it and managed
| it.
|
| I was there around 97 (?) and remember everyone in the company
| being asked to read the book "The Innovator's Dilemma", which
| described exactly this situation - a high end company being
| overtaken by worse but cheaper competitors that improve year by
| year until they take the entire market. The point being that the
| company was extremely aware of what was happening. It was not
| taken by surprise. But in spite of that, it was still unable to
| respond.
| szundi wrote:
| Thanks for this comment, very much appreciated.
| ghaff wrote:
| Having worked a long time for a minicomputer company--which
| actually survived longer than most, mostly because of some
| storage innovations along with some high-end Unix initiatives--
| it's really hard. You can't really kick a huge existing
| business to the curb. Or otherwise say we're going to largely
| start over.
|
| Kodak was not actually in a position to be big in digital. And,
| of course, the digital camera manufacturers mostly got eclipsed
| by smartphones anyway a decade or so later.
| loloquwowndueo wrote:
| Data General?
| ghaff wrote:
| Yes. CLARiiON eventually enabled a sale to EMC (which
| arguably saved EMC for a time) and the Unix business
| (especially NUMA servers) were sufficient revenue producers
| for a while to keep the lights on. ThinLiiNe (or whatever
| the capitalization was) never went anywhere but neither did
| a lot of things in the dot.com era.
| loloquwowndueo wrote:
| I knew it :) thanks for confirming! And for sharing.
| ghaff wrote:
| I was the PM for a bunch of the minicomputers from the
| mid-80s on. Then I was PM for the initial Unix AViiONs
| and later the NUMA servers including being one of the
| main liaisons with CLARiiON.
| aurizon wrote:
| On the contrary, Kodak was well placed to do well by
| anticipating 'Moore's Law' as pertinent to sensor pixel
| density and sensitivity versus film. Film resolution was
| towards the end of intense development in pixel terms - not
| much further to go. They had pioneering patents and ongoing
| R&D would have enabled a long period of dominance during the
| transition and to this day!! The board and scientists were
| asleep on a mountain of cash, and they sold their future for
| a few crumbs left for shareholders after bankruptcy.
| Blackberry did much the same with fewer excuses. I met with
| some board members of Kodak in the 80's and they were like
| old English gentlemen - long on pomp and procedure, but they
| wore blinders and a vision bypass - TRIH.
| ghaff wrote:
| Kodak was essentially a chemical company at one point. They
| even spun off an actual chemical company. Kodak could
| probably have played a better hand, even if they did do
| things like PhotoCD that were probably before their time.
| But they could have been Apple or maybe Instagram? That's a
| stretch.
|
| I'm not a particular Kodak apologist but suggesting that a
| company should have been able to anticipate and correct for
| their business collapsing by 90% in a decade or so seems to
| need a lot of particulars.
| xcv123 wrote:
| > But they could have been Apple? That's a stretch.
|
| They could have been a Sony. The iPhone camera sensor is
| made by Sony.
| ghaff wrote:
| And Sony has certainly had rough patches too. And that's
| for a company coming from an electronics manufacturer
| angle.
|
| Kodak could have spun off a consumer electronics or
| semiconductor manufacturing company. But it's not clear
| why that is actually a better model than someone else
| just spinning up similar entities.
|
| I don't need all the chemical engineers and a lot of
| other people connected with the old business anyway. And
| I'm sure not turning them into semiconductor experts.
|
| So you're one of the 10% of employees in HR who snuck
| through to the other side. Is that really a big deal?
| ianburrell wrote:
| Kodak did fine in the transition to digital. They made some
| popular compact cameras and tried to make DSLRs. They were
| wiped out by compact cameras being killed by smartphones.
| The survivors are the old camera makers like Canon and
| Nikon that have ecosystems. The other big survivor is Sony,
| which bought a camera company and makes most of camera
| sensors.
|
| Fuji is interesting: they weren't that successful with early
| digital cameras, but now have some interesting mirrorless
| ones. They still make film.
| maire wrote:
| Kodak was well aware of what was going to happen. Company
| culture killed digital photography.
|
| I was at Apple when we worked with engineers from Kodak who
| were working to change various format standards to allow
| digital photos. This was in the late 1980s or early 1990s.
| ghaff wrote:
| But, from the perspective of today, Kodak would have had to
| basically eclipse Apple.
|
| Even displacing the big Japanese camera manufacturers, who
| by then had dominated high-end photography, would have
| required reversing decades of a shift away from high-end
| cameras like the Retina line.
|
| I don't doubt there was company DNA against digital
| photography but it's not like non-smartphone photography,
| especially beyond relatively niche pro/prosumer level, has
| had such a good run recently either.
| nradov wrote:
| There is still a lot of business opportunity in supplying
| image sensors and lenses to smartphones.
| chiefgeek wrote:
| But it is nowhere near as profitable as the 35mm film
| system was.
| foobarian wrote:
| I think it's a near impossible situation - the status quo is
| literally something that should not exist given the new market
| realities. Pivoting is pretty much asking a company to commit
| seppuku - asking the layers of leadership to basically replace
| themselves and quit in many cases. Which is pretty much what
| happens anyway.
| ghaff wrote:
| And, at some point, what does it matter if the leadership and
| most of the employees turn over, typically involuntarily?
|
| Is there any significance really to Foot Locker basically
| being a reorganized Woolworth's as opposed to being a brand-
| new company?
|
| If you're big enough and have some product lines that still
| bring in a lot of money and didn't totally collapse like IBM
| you can sometimes pull it off. But it's hard.
| bunderbunder wrote:
| And, just like every Unix workstation vendor of the 1990s,
| they got hit with a perfect storm. They had their hardware
| being disrupted by x86 really coming into its own as a viable
| option for higher-end computing at the exact same time that
| Linux was becoming a serious option for the operating system.
|
| "Literally something that should not exist" is the perfect
| way of putting it. In 1990, lots of people needed boutique
| workstation vendors. In 2000, nobody did.
| bunabhucan wrote:
| It's worse than that. Instead of "nobody" it was
| conservative, slow-moving, vendor-locked clients that could
| convince you to keep selling at those prices ("look at the
| margins!") instead of focusing on alternatives. I remember
| $5,000+ PCs being considered "cheap" workstations when
| those clients finally migrated.
| rbanffy wrote:
| Even Apple, which became the unlikely last of the Unix
| workstation vendors, was disrupted by Intel (and moved from
| PowerPC to x86 for a while). Ironically, Apple is now the
| very last Unix workstation vendor in existence.
| AlbertCory wrote:
| > Pivoting is pretty much asking a company to commit seppuku
|
| This is conventional wisdom (and thus, usually correct).
|
| However, it's always interesting to look at counterexamples:
| Beretta, for example (in business for 500 years).
|
| https://www.albertcory.io/lets-do-have-hindsight
|
| or the IBM PC, which cannibalized IBM's business, at least in
| IBM's mind. Thus, they screwed it up and let Wintel make the
| real billions. So it worked, until they took your advice and
| decided that had to stop.
| neuralRiot wrote:
| Probably the lack of vision is not just failing to turn in
| the direction of new "products" but failing to acquire,
| digest, and eliminate those businesses that start to grow
| before they're too big. See Microchip, for example: how many
| relatively small semiconductor and technology manufacturers
| they have already eaten.
| yndoendo wrote:
| When statements like "X was better than Y" come up, I always
| think of "All models are wrong, but some are useful", from
| statistician George E. P. Box, and rephrase it as "All models
| are flawed, but some are useful", so that "model" can mean
| almost anything: a smartphone, TV, computer, car, programming
| language, programming design pattern, social media platform,
| and so on.
|
| On price point, SGI's technology was a financially flawed
| model for the growing market, even if it was more useful than
| the flawed performance of the low-cost alternatives.
|
| Did anyone at SGI try to simply buy the low tech products, play
| with them a bit, and see about slowly integrating their tech to
| make that low-tech product just a little better than the
| competition and cost-effective for the market?
| AnimalMuppet wrote:
| Someone at SGI wrote a paper/web page/blog post titled "Pecked
| To Death By Ducks", claiming that x86 chips could never compete
| with SGI, and claiming to refute all the arguments that they
| could.
|
| Then Intel introduced dual core (or maybe just two chips in one
| housing sharing a bus), and that generated a lot of buzz. So he
| wrote a follow-up titled "Pecked To Death By Ducks With Two
| Bills".
|
| I don't recall the timing, though, how it related to the timing
| of asking everyone to read The Innovator's Dilemma. But at
| least some of the time, there was a pretty deep denial (or at
| least a pretty deep effort to keep the customers in denial).
| MichaelZuo wrote:
| That's really funny for some reason.
| HeyLaughingBoy wrote:
| IIRC, "Pecked to Death by Ducks" is the title of either a
| short (nonfiction) story or a book by Gerald Durrell, one
| of my favorite childhood authors.
| jfk13 wrote:
| I don't recall that one, and I thought I knew Gerald
| Durrell's work pretty thoroughly. There is a book of that
| title by Tim Cahill, though; maybe that's what you're
| remembering?
| HeyLaughingBoy wrote:
| Yesss! Thank you. I'm embarrassed because I used to have
| that one, too :)
|
| On the upside, I've never known anyone else who had even
| heard of Durrell.
| rbanffy wrote:
| A bit like Seymour Cray's plowing a field with 1024 chickens
| instead of two oxen.
| Animats wrote:
| They sort of tried. Around then they had a Windows NT machine
| that cost around US$12,000. But it was too late. The first
| serious graphics cards for PCs were appearing, from Matrox and
| others, with prices of a few thousand dollars.
|
| (I tried some early NT graphics cards on a Pentium Pro machine.
| This was before gamer GPUs; these were pro cards from tiny
| operations. Fujitsu tried unsuccessfully to get into that
| business, with a small business unit in Silicon Valley. At one
| point they loaned me a Fujitsu Sapphire graphics card
| prototype. When I went back to their office to return it, the
| office had closed.)
|
| Also, there was a bad real estate deal. SGI owned a lot of land
| where Google HQ is now. They sold it to Goldman Sachs in a sale
| and lease-back transaction, selling at the bottom of the
| market. That land, the area north of US 101 in Mountain View
| had, and has, a special property tax break. It's the "Shoreline
| Regional Park Community", set up in 1969. The area used to be a
| dump. Those hills near Google HQ are piles of trash. So there
| was a tax deal to get companies to locate there. That made the
| land especially valuable.
| msisk6 wrote:
| SGI tried its hand at the PC video card business as early as
| 1990. I was at Autodesk at the time and got one of these to
| beta test on a DOS 486 running AutoCAD. It was an impressive
| product. But huge; it took up two full-length ISA slots. And
| the display drivers were a bit buggy.
| sillywalk wrote:
| Here's a brochure for the IrisVision boards - uses 2 ISA or
| Microchannel slots.
|
| Prices start at $3,495
|
| https://www.1000bit.it/js/web/viewer.html?file=%2Fad%2Fbro%
| 2...
| Y_Y wrote:
| Sounds just like Nvidia in 2024
| rbanffy wrote:
| I wish they had ported IRIX to x86. You can make more money by
| making stuff for Windows, but it won't protect you from
| market erosion.
| mrandish wrote:
| You highlight one of the most interesting (and perhaps least
| understood) things about the key Innovator's Dilemma insight.
| Even if the senior management have read the Innovator's Dilemma
| books, know they are being catastrophically disrupted, and are
| desperately trying to respond - it's _still_ incredibly
| difficult to actually do.
|
| Not only are virtually all organizational processes and
| incentives fundamentally aligned against effectively
| responding, the best practices, patterns and skill sets of most
| managers at virtually every level are also counter to what they
| must do to effectively respond. Having been a serial tech
| startup founder for a couple decades, I then sold one of my
| startups to a valley tech giant and ended up on the senior
| leadership team there for a decade. I'd read Innovator's
| Dilemma in the 90s, and I've now seen it play out from both
| sides, so I've thought about it _a lot_. My key takeaway is
| that an incumbent's lack of effective response to disruption
| isn't necessarily due to a lack of awareness, conviction or
| errors in execution. Sure, there are many examples where that's
| the case but the perverse thing about I.D. is that it can be
| nearly impossible for the incumbent to effectively respond -
| even if they recognize the challenge early, commit fully to
| responding and then do everything within their power perfectly.
|
| I've even spent time sort of "theory crafting" how a big
| incumbent could try to "harden" themselves in advance against
| potential disruption. The fundamental challenge is that you end
| up having to devote resources and create structures which
| actually make the big incumbent less good at being a big
| incumbent far in advance of the disruptive threat appearing.
| It's hard enough to start hardcore, destructive chemo treatment
| when you actually _have_ cancer. Starting chemo while you're
| still perfectly healthy and there's literally no evidence of
| the threat seems crazy. It looks like management incompetence
| and could arguably be illegal in a publicly traded company
| ("best efforts to maximize/preserve shareholder value" etc).
| throwaway4good wrote:
| Everyone has been reading that book since the late 90s.
|
| I remember a talk by Clayton Christensen talking specifically
| about Intel and how they set up the Celeron division to
| compete with themselves (based on his advice).
|
| A key property of tech, in economics lingo, is that it is a
| "natural monopoly" - all fixed cost and no variable cost.
|
| This creates winner-takes-all games. In this case Intel, SGI,
| and others all knew the rules, and it just ended up with Intel
| taking the prize and it all becoming Wintel for a decade or so
| - basically until the smartphone allowed enough capital to be
| accrued to challenge the old monopoly.
| bsder wrote:
| "Maximally efficient is minimally robust."
|
| A company that optimizes for efficiency _will_ get stomped
| flat when the environment changes.
|
| The problem is that there are no incentives in business to
| optimize for robustness.
| roughly wrote:
| Well, that's partially because of the converse: a company
| that optimizes for robustness will get stomped flat before
| the environment changes to require robustness. Short term
| games are bad in the long term, but often good enough in
| the short term to win before the long term arrives.
| rbanffy wrote:
| I think SGI failed to understand that there was a point where
| desktop PCs would be good enough to replace dedicated
| workstations. Continuing to make hardware that's much better
| than the best PCs wasn't going to save them after PCs crossed
| the good-enough line - whatever they had, would be relegated
| to increasingly rarefied niches - the same way IBM now only
| makes POWER and mainframes - there is no point of making PCs,
| or even POWER workstations anymore for them, as the margin
| would be too narrow.
|
| SGI could double down on their servers and supercomputers,
| which they did for a while, but without entry-level options,
| their product lines become the domain of legacy clients who
| are too afraid (or too smart) to port to cheaper platforms.
| And being legacy in a highly dynamic segment like HPC is a
| recipe for disaster. IBM survived because their IBMi (the
| descendant of the AS/400) and mainframe lines are very well
| defended by systems that are too risky to move, tied to
| hardware that's not that much more expensive than a similarly
| capable cluster of generic and less capable machines. As the
| market was being disrupted from under them, they retreated up
| and still defend their hill very effectively.
|
| The other move they could have made was to shift downwards,
| towards the PC, and pull the rug from under their workstation
| line. By the time Microsoft acquired Softimage and had it
| ported to NT, it was already too late for SGI to even try
| that move, as NT was solidified as a viable competitor in the
| visual computing segment, running on good-enough machines
| much, much cheaper than anything SGI had.
| mrandish wrote:
| I think your analysis of the shifting technology landscape
| is largely on target. However, I'm not convinced that the
| true root of SGI's failure was the technology. Clearly
| their tech did need to evolve significantly for them to
| remain competitive but that's a transition which many
| companies successfully make. Even though SGI chose not to
| evolve the tech soon enough, fast enough nor far enough, I
| suspect they still would have failed to survive that time
| period due to an even more fundamental root cause: their
| entire corporate structure wasn't suited to the new
| competitive environment. While the "desktop transition" was
| most clearly seen in technology, I think the worst part for
| SGI was that desktop shifted the fundamental economics to
| higher volumes at lower costs.
|
| SGI had invested in building significant strengths and
| competency in its sales and distribution structure. This
| was one of their key competitive moats. Unfortunately, not
| only did the shift in economics make this strength
| irrelevant, it turned it into a fundamental weakness. All
| that workstation-centric sales, distribution, service and
| support infrastructure dramatically weighed down their
| payroll and opex. This was fine as long as they could count
| on the higher margins of their existing business. While
| it's easy to say they should "just layoff all those people
| and relaunch as a desktop company" that can't be done in
| one quarter or even one year. It requires fundamentally
| different structures, processes, systems and skill sets.
| Hiring, training and integrating all that while paying for
| massive layoffs and shutting down offices, warehouses etc
| takes time and costs a lot of money. Plus, once their
| existing workstation customers saw them shutting down the
| SGI they were customers of to become a different company
| entirely, sales revenue would have taken an overnight
| nosedive. In making such a dramatic move SGI would have
| effectively dumped much of the current quarterly revenue
| _and_ the value of one of their core strengths - all at the
| same moment. Thus turning them into one of their emerging
| startup competitors with all of their disadvantages (no big
| ongoing revenue streams, no big cash pile) and none of
| their strengths (nimble, lower-paid staff and more patient
| venture investors).
| VelesDude wrote:
| It was clear they were trying to do more consumer grade things,
| just look at the N64. Couldn't get more mainstream than that.
| Seeing how the graphics market ended up, it looks obvious from
| here but in the mid 90's it was still the wild west and
| everybody was throwing mud at the wall seeing what would stick.
|
| I have never really said that they were "taken by surprise",
| but a part of it felt like (from the outside) that management
| had been a little blinded by their past success and the profit
| margins from their workstations combined with no clear path
| forwards for the whole industry. Nvidia could have very easily
| been just a curiosity of the past but they managed to strike it
| lucky standing on the shoulders of others.
|
| If SGI had always been a company that could provide graphics
| workstations that worked with x86/Windows PCs early, for example
| - maybe they would have fared better. Would have gone with the
| flow of technology at the time rather than fighting uphill no
| matter the potential technical brilliance. But being saddled to
| their MIPS processors and custom OS meant that once people
| left, they almost never came back. One can have the best tech
| and still fail.
| the_mitsuhiko wrote:
| > It was clear they were trying to do more consumer grade
| things, just look at the N64.
|
| Yes, but the team that did that also left SGI, then worked
| directly with Nintendo for the GameCube and were acquired by
| ATI. I'm not sure how SGI managed to not support that effort
| within itself.
| cuno wrote:
| Nintendo wasn't loyal to the company, it was loyal to the
| team, so when they just decided to leave and form ArtX they
| took the customer with them... SGI was happy with the
| Nintendo contract. They earned $1 in additional royalties
| for every single N64 cartridge sold worldwide. Losing the
| team was a big blow.
| rbanffy wrote:
| > provide graphics workstations the worked with x86/Windows
| PC's
|
| Intergraph started making PCs with high-end graphics at one
| point, when they abandoned CLIX and gave up on their
| (Fairchild's, really) Clipper processor. It didn't work for
| them either. SGI did their own "Visual Workstation" that ran
| Windows and had a Pentium, but that too was a huge
| disappointment.
| appstorelottery wrote:
| I was making crazy money in the dot-com boom and bought a SGI
| 540 in 1999 (with an SGI monitor).
|
| With money to burn, SGI was a childhood brand, legends in 3D.
| Such wonderful memories. 15k on a desktop setup - it was loose
| change, though it shows how clueless I was back then. Still, I
| felt like I'd "arrived".
|
| SGI with Windows NT - lol - I wrote my first OpenGL game in
| Visual Basic... I've always been somewhat of an outlier ;-) God
| help me.
|
| The point? My personal experience says something about the
| strength of the SGI brand - even in the face of what was
| happening at the time (3DFX and so on - my previous company was
| one of the few 3DFX api devs - illustrating how clueless I
| was...)... it all happened so quickly... I'm not surprised SGI
| couldn't respond - or more importantly understand the strength
| of Microsoft/OpenGL/DirectX in the boiling pot of 3DFX / Nvidia
| and the rest... From memory it took three years and SGI was
| done - shared memory architecture? No longer worth the cost.
| :-(
|
| Looking back, I was such a kid - a complete fool.
|
| Greybeard advice: bet on the monopoly. Be smart. Brands like
| SGI are nothing in the face of install base. Think about how
| crazy it was to spend 15k on a desktop SGI back then...
| nostalgia is insanity, vanity.
| lizknope wrote:
| Yeah but it still sounds really cool!
|
| From 1991 when I first saw SunOS I wanted a SPARCstation. I
| started college in 1993 and the school was full of DEC Ultrix
| machines, Suns, HP PA-RISC, and a handful of IBM RS/6000 and
| SGIs.
|
| I just thought DOS/Windows PCs were such garbage. Single
| user, no preemptive, multitasking, no memory protection. Then
| Linux came out and it changed everything. I bought a PC just
| to run Linux. My dream of a Unix RISC workstation faded away.
|
| My roommate in 1996 bought a DEC Alpha. Not the cheaper
| Multia but an Alpha that could run OSF/1 Digital Unix. He
| actually ran NetBSD on it.
|
| In 1997 I took the computer graphics class and we used SGIs.
| There was just one lab of them reserved for that class and
| grad students. I was so excited and it was really cool but I
| didn't think I could ever afford one. It's still really cool
| though that you had one.
| hintymad wrote:
| I remember Clayton Christensen mentioned that Andy Grove
| invited him to Intel to talk about how to deal with the
| dilemma, and interrupted Christensen while he was talking and
| said something like "I know the problem, and I need you to tell
| me the solution". Similarly, Peter Drucker repeatedly mentioned
| one of the biggest challenges in business is "killing the cash
| cow". Along that line, Netflix's Reed Hasting is really
| amazing. He somehow managed to kill the DVD business and used
| it to milk the streaming business, when almost everyone in the
| industry and some of his lieutenants in Netflix didn't believe
| him.
| froonly wrote:
| For a while you could view Netflix online _and_ rent DVDs
| from them.
| RandallBrown wrote:
| Oh dang, I thought you still could. Looks like they shut
| down the DVD rentals about 6 months ago.
| specialist wrote:
| Yes and:
|
| All these years later, while the innovator's dilemma thesis
| describes the what, there's still little treatment of the why
| and how.
|
| I keep wanting someone to account for the roles of investment
| and finance.
|
| Amazon's innovation was lower cost of capital. They convinced
| investors to wait for returns. And they got a massive tax
| holiday. (How could they not succeed?)
|
| Ditto Tesla, with its savvy moves like govt loans,
| prepurchases, tax incentives, and selling direct.
|
| That cheap capital was necessary, but not sufficient. Both
| still had to create products customers wanted.
|
| I keep coming back to Apple. How'd Apple avoid the trap?
| Despite their terrible position. My guess is better financial
| strategy (if that's the right phrase). Apple focused on
| margins (and monopsony) instead of market share. And then
| leveraged their war chest to implement monopsony.
| foobiekr wrote:
| I was at one of SGI's competitors and we had teams doing cheap
| HW - ATi and other cards, like IBM's GPUs at the time - and yet
| the company as a whole was like "ALL GOOD LET'S KEEP BUILDING
| MASSIVE CUSTOM GRAPHICS SYSTEMS!"
|
| They were as dead as SGI in the same timeframe.
| jiggawatts wrote:
| In the late 90s I was in the last year of high school. Silicon
| Graphics came to do a demo of their hardware for students that
| were interested in taking a computer science course at
| university in the following year.
|
| The graphics demos looked like trash, basically just untextured
| and badly shaded plain colored objects rotating on the screen.
| For reference I was playing Quake III around the time which had
| detailed textures and dynamic lighting.
|
| I asked the SGI presenter what one of his Indigo workstations
| cost. He said $40,000, _not including the graphics card!_
| That's extra.
|
| I laughed in his face and walked out.
| dekhn wrote:
| In the late 90s, SGI demos were much more impressive than
| what you describe. It was used by technical folks to do real
| stuff, with stringent criteria.
|
| More importantly, the things that made Quake III so great
| were state-of-the-art for gaming. But those things couldn't
| render lines quickly and well (a mainstay of CAD at the
| time), or render at very high resolution (which IIRC was
| 1280x1024 in that era).
|
| Here's what Carmack said about the SGIs a few years before:
| """SGI Infinite reality: ($100000+) Fill rate from hell.
| Polygons from hell. If you don't trip up on state changes,
| nothing will come within shouting distance of this system.
| You would expect that.""" SGI was also key for map builds
| before PCs were capable.
|
| But yes, 1999-2000 was just around the cusp of when SGI went
| from "amazing" to "meh".
| rbanffy wrote:
| The curve that maps fucking around to finding out is not
| linear. By the time you start finding out, it's very hard
| to stop finding out much more than you would like to.
| blackoil wrote:
| I believe having a talented dictatorial leader at the top may be
| the only solution. Like Steve Jobs, Bill Gates or Jeff Bezos. Once
| they believe in a path, they have methods to get it done. The
| Internet Tidal Wave memo is a good example of it. Zuckerberg is
| able to invest 100s of billions on a future he believes in.
|
| Obviously the observation has a confirmation bias.
| Keyframe wrote:
| Steve Jobs on Silicon Graphics:
| https://www.youtube.com/watch?v=iQKm7ifJpVE
| Apocryphon wrote:
| Low-key iconic machines, Hollywood kept putting them in movies.
|
| http://www.sgistuff.net/funstuff/index.html
|
| https://www.starringthecomputer.com/computers.html#SGI
| pjmlp wrote:
| A bit of trivia: SGI used to host the C++ STL documentation,
| based on the HP libraries, pre-adoption into the standard.
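| For anyone who never saw those pages: they documented the
| container / iterator / algorithm split that carried over into
| standard C++. A tiny sketch, using only components that did make
| it into the standard (nothing SGI-specific):
|
|     #include <algorithm>
|     #include <iostream>
|     #include <numeric>
|     #include <vector>
|
|     int main() {
|       // the container holds the data
|       std::vector<int> v = { 5, 3, 9, 1, 4 };
|       // algorithms operate on iterator ranges
|       std::sort(v.begin(), v.end());
|       int sum = std::accumulate(v.begin(), v.end(), 0);
|       for (int x : v) std::cout << x << ' ';
|       std::cout << "\nsum = " << sum << '\n';
|       return 0;
|     }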
| timthorn wrote:
| I miss SGI for many reasons, but their industrial design (and
| that of pre-HPE Cray) is one of the big ones. The 19" rack form
| factor is the shipping container of the computing world -
| practical, standardised, sensible... and dull.
|
| The variety in enclosures matched the novelty in architectures of
| the period. Exciting times to be part of.
| vrinsd wrote:
| I believe nVidia was started with a lot of SGI's core
| technology; not "I have a good idea and I can't do it here" but
| more like
| "let me just take this stuff I doubt anyone will notice". I think
| SGI sued but didn't really pursue the matter because they didn't
| really see nVidia as a threat. I think Jensen was pivotal in this
| "technology transfer".
|
| Regarding computing cycles, boom/bust, I recently re-read The
| Soul of a New Machine and was struck by how much the world has
| NOT changed.
| Sure we're not talking about micro/mini-computers and writing
| micro-coded assembly but the whole "the market is pivoting and we
| need to ride this wave" and "work like a dog to meet some almost
| unobtainum goal" seems to still underpin being an engineer in
| "tech" today.
| miohtama wrote:
| The Nvidia lawsuit is discussed in the article.
| _DeadFred_ wrote:
| I mean there's more to it. NVidia literally just took SGI's
| IP. The only more blatant start was Cisco, where they straight
| up stole a university computer.
| meekaaku wrote:
| where can i read about this cisco thing?
| vrinsd wrote:
| https://www.tcracs.org/tcrwp/1origin-of-cisco/
| takinola wrote:
| Cisco was started by a husband/wife team who were the heads
| of IT for the Stanford Electrical Engineering School and
| Business School respectively. Anecdotally, they first
| developed the technology trying to connect the networks for
| both schools.
| alecco wrote:
| (1999) https://www.eetimes.com/sgi-graphics-team-moves-to-
| nvidia/
| markus_zhang wrote:
| I love "Soul of New Machine" too. It was a blast to read. It
| even made me ponder the possibility to start over at 40+ and do
| something hardware-wise (or very low-level software). Of course
| I then found myself drown by 2 mortgages and dropped the
| thought.
| formerly_proven wrote:
| From my reading SGI was already dead and falling apart by that
| time. If you look at 3D, SGI had two graphics architectures in
| the 90s: RealityEngine from 1992 and InfiniteReality from 1996.
| They never managed to release a follow-up to IR. Similarly
| everything that came after about 1996-97 was a refresh of a
| prior product with only marginal changes. And then they went
| bankrupt in the early 2000s. So SGI had really only a very
| brief productive period that was over by the second half of the
| 1990s.
|
| SGI also never had a presence in business critical applications
| which gave some of the other vendors more momentum (HP-UX/PA-
| RISC, VMS/Alpha, Solaris/SPARC).
| vrinsd wrote:
| Well,
|
| Most Hollywood effects were all done on SGI systems before
| the slow migration to Linux. Renderman, Maya, were all SGI
| first-party programs.
|
| Also SGI made huge advances in NUMA and machines with dozens
| of CPUs/processors before most other companies ventured into
| this space.
|
| But not business critical like IBM CICS or Java.
|
| 1. https://en.wikipedia.org/wiki/NUMAlink
|
| 2. https://www.cs.ucr.edu/~bhuyan/CS213/2004/numalink.pdf
|
| 3. https://cseweb.ucsd.edu/classes/fa12/cse260-b/Lectures/Lec
| 17...
| sllabres wrote:
| The large Origin servers and the nice indigo workstations
| at trade fairs with their cool real time visualizations
| come to mind. Also applications like Softimage, the
| 4Dwm desktop ...
|
| Later the large Altix NUMA systems with core counts in
| unprecedented sizes (and problems booting due to lock
| contention ;)
|
| And of course their donation of the XFS filesystem to the
| linux world!
| cuno wrote:
| I worked at SGI on the next generation (code named Bali) in
| 1998 (whole year as an intern) and 1999 (part time while
| finishing my degree, flying back and forth from Australia).
| Bali was revolutionary. The goal was realtime Renderman and
| it really would have. I had an absolute blast. I ended up
| designing the highspeed data paths (shader operations) for
| world's first floating point frame buffer (FP16 though we
| called it S10E5) with the logic on embedded DRAM for maximum
| floating point throughput. It was light years ahead of its
| time. But the plug got pulled just as we were taping out.
| Most of the team ended up at Nvidia or ArtX/ATI. The GPU
| industry was a small world of engineers back then. We'd have
| house parties with GPU engineers across all the company names
| you'd expect, and with beer flowing sometimes maybe a few
| secrets could eh spill. We had an immersive room to give
| visual demos and Stephen Hawking came in once pitching for a
| discount.
|
| For team building, we launched potato cannons into NASA Moffett
| Field, blew up or melted Sun machines for fun with thermite
| and explosives. Lots of amazing people and fond memories for
| a kid getting started.
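| A note on S10E5: the name spells out the packing - 1 sign bit,
| 10 mantissa bits, 5 exponent bits - which is the same layout
| later standardized as IEEE half precision. Purely as an
| illustration of the bit layout (my sketch, not SGI's actual
| hardware path), a simplified float-to-S10E5 pack might look
| like this; it truncates rather than rounds and skips
| NaN/denormal handling:
|
|     #include <cstdint>
|     #include <cstring>
|     #include <cstdio>
|
|     static uint16_t to_s10e5(float f) {
|       uint32_t bits;
|       std::memcpy(&bits, &f, sizeof bits);
|       uint16_t sign = (bits >> 16) & 0x8000u;
|       // rebias the exponent from 127 (float) to 15 (half)
|       int32_t exp = (int32_t)((bits >> 23) & 0xFF) - 127 + 15;
|       uint16_t man = (bits >> 13) & 0x03FFu;  // top 10 bits
|       if (exp <= 0)  return sign;             // flush to signed 0
|       if (exp >= 31) return sign | 0x7C00u;   // overflow -> inf
|       return sign | (uint16_t)(exp << 10) | man;
|     }
|
|     int main() {
|       const float samples[] = { 0.0f, 1.0f, 0.5f, -2.75f };
|       for (float s : samples)
|         std::printf("%10.3f -> 0x%04X\n", s, to_s10e5(s));
|       return 0;
|     }
|
| For example, 1.0f packs to 0x3C00 and -2.75f to 0xC180.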
| sgt wrote:
| The article went very quickly from rowdy teen throwing smoke bombs
| to... boom, he has a PhD in Computer Science.
|
| Really interesting article that goes into depth regarding the SGI
| products. I didn't know Clark basically invented the GPU.
|
| Speaking of exotic hardware, I'm actually sitting next to an SGI
| O2 (currently powered off). A beautiful machine!
| cellularmitosis wrote:
| There's a kid on youtube (username 'dodoid') who put together a
| series of videos on SGI. Neat to see someone young get super
| enthusiastic about computing history.
| fnordpiglet wrote:
| I was at SGI and left with Clark to Netscape. That was a fun time
| in my career. That was also a time that it became clear SGI was
| about to explode as commodity GPUs were being developed by former
| SGI engineers and the core SGI graphics teams were attritting
| hard. Fun to read this story again and learn some of the things
| that happened later.
| alecco wrote:
| Interesting insider comment from a previous thread:
|
| https://news.ycombinator.com/item?id=30920824
| jibbit wrote:
| I'll never forget my first time coming into a high end Flame
| suite. It was so exciting.. I don't think I could have been more
| excited if you'd told me I was stepping into the world's first
| time machine.
| Keyframe wrote:
| It still has that cachet with, well, a bit older generation. I
| have an Indy and Indigo2 (purple - the max impact one) and when
| someone visits and sees the machines it's like you sprawled a
| vintage Ferrari in (our) eyes. I don't think there's anything
| comparable today when we have it all.
|
| I demonstrated IRIX to younger colleagues and it was - ok, so
| it's alright I guess, like anything else we have today? Yep.. but
| the world back then was NOT like that.
|
| I had an Octane as well, heavy and loud beast. All in storage now
| waiting for move. In late 90s I worked a lot on SGI machines
| (vfx).
| cf100clunk wrote:
| I'd never met Rick Belluzzo, but having known SGI insiders of his
| era I gathered that he'd driven the company strongly towards NT
| on x86 at the cost/risk of losing their corporate and personal
| UNIX competencies, causing some internal outrage. When his time
| was up at SGI he quickly popped up at Microsoft running MSN in
| what I saw as a sinecure meant as a pat on the back for a job
| well done. Am I right on this?
| shrubble wrote:
| That is a widely held view among Unix nerds like me, at least.
| jra_samba wrote:
| My recollection of that time (I was at SGI when Belluzzo was
| there).
|
| https://www.linkedin.com/posts/jeremyallison_wither-
| google-f...
| canucker2016 wrote:
| There's the "SGI Irix bloat" internal email that got posted to
| Usenet.
|
| Took me a while to find a copy on the net,
| https://www.seriss.com/people/erco/sgi-irix-bloat-document.t...
|
| Here's a formatted-for-HTML version:
| http://www.art.net/%7Ehopkins/Don/unix-haters/tirix/embarras...
| icedchai wrote:
| Meanwhile, today, we have chat applications taking up almost a
| gig of memory...
| Arathorn wrote:
| There's a surprising amount of good info here about the very first
| IRISes. I ended up with 3 of the very first IRIS 1400
| workstations mentioned in the post which NASA Ames bought; at
| least one of which is still in working order. They were my first
| unix workstations (running a Unisoft-based sysv/bsd hybrid pre-
| IRIX variant), and that's where I first learnt UNIX, C, IRIS GL, vi
| and lots of other good stuff. Irritatingly they shipped with an
| XNS rather than TCP/IP network stack (although I did get hold of
| a beta IP stack, I never got it to work). I did get XNS working
| on a LAN using their EXOS 101 ethernet cards though.
|
| In case anyone's interested, their graphics card (GE1, the
| world's first ever hardware 3D graphics card?) looks like:
|
| https://matrix-client.matrix.org/_matrix/media/v3/download/m...
|
| ...and the PM2 68k processor card mentioned in the post looks
| like:
|
| https://matrix-client.matrix.org/_matrix/media/v3/download/m...
|
| ...and one of the machines itself looks like:
|
| https://matrix-client.matrix.org/_matrix/media/v3/download/m...
|
| Suffice it to say that I have a _very_ soft spot for these
| machines :)
| danans wrote:
| > On the 10th of July in 2003, SGI vacated and leased their
| headquarters to Google.
|
| I was around to witness the tail end of that office space
| transition (on the incoming side at Google). It was surreal to be
| sitting in the physical carcass of a company I had long
| fantasized about (in part due to their marketing via Hollywood).
|
| In retrospect it was ironic because a company that was based on
| selling very expensive high performance compute (SGI) was being
| physically replaced by a company selling (albeit indirectly) very
| cheap high performance compute.
| ben7799 wrote:
| I was kind of exactly the right age to see all this happen,
| all while I was in college.
|
| Fall 95 enter freshman year and we had Indys and IBM RS6000s as
| the main workstations on campus. Really great setup where you
| could sit at any workstation and all your stuff just worked and
| your whole environment seamlessly migrated. The only thing you
| had to do was if you were compiling your own stuff you'd have to
| recompile it for the machine you sat down at.
|
| SGI brought a demo truck to campus in the spring of my Freshman
| year (Spring 96) and blew us all away. They were there for
| interviews, obviously I was a freshman but we all went to check
| it out.
|
| Summer 96 I get an internship and for kicks they gave me an Indy
| with a 21" CRT (huge at the time) and the silly video camera that
| was like 10+ years ahead of its time.
|
| Fall 96 we got labs full of O2s.
|
| Fall 1997 I bought a 3DFX card. MS/Intel somehow made a donation
| to the school and got them to start phasing out the Unix
| workstations. The Windows NT setup was terrible; they never had
| the printing and seamless movement of files down till after I
| graduated. Video games in the Fall of 1997 on the 3DFX were
| basically as impressive as the demos on the $100k refrigerator
| sized machine SGI showed in 1995.
|
| Probably fall 1998 I remember my Dad got a computer with an
| Nvidia Riva 128.
|
| Spring 99 I graduated and that fall I rebuilt my PC with a
| Geforce 256.
|
| I'm not sure when I last saw an SGI, but I did briefly use one of
| their NT machines IIRC.
|
| Last time I had a Sun machine at work was probably 2004. I
| remember maybe 2007-2008 at work deciding for the first time we
| were going to support Linux, then by 2010-11 we had dropped
| support for Sun.
|
| Most of the commercial Unix workstations had tons of Unix
| annoyances I never found Linux to have. Irix was maybe the best.
| HP-UX was super annoying I remember. I didn't use DEC Unix and
| Tru64 much. Closed source PC Unix like SCO I remember being
| horrible.
| fuzztester wrote:
| This reminds me of the book The Soul of a New Machine.
| temporarely wrote:
| We had a few SGI boxes in arch school (for CAD). I learned C++ on
| those boxes (the GL API, which I thought was beautiful, btw). But
| the main attractions were the two demos (this is '91-2): the
| flight simulator was awesome and there was this deconstructing
| cube thing that was pretty wild as well. Compared to what was on
| PCs in those days, those SGI machines were truly dazzling.
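| For anyone curious what that felt like: OpenGL kept much of IRIS
| GL's procedural, immediate-mode flavor, so a tiny old-style
| example today still reads a lot like code from that era. A
| sketch (assuming classic GLUT is available for the window; this
| is plain OpenGL, not IRIS GL itself):
|
|     #include <GL/glut.h>
|
|     // draw one colour-shaded triangle, immediate-mode style
|     static void display(void) {
|       glClear(GL_COLOR_BUFFER_BIT);
|       glBegin(GL_TRIANGLES);
|       glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.8f, -0.8f);
|       glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.8f, -0.8f);
|       glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.8f);
|       glEnd();
|       glutSwapBuffers();
|     }
|
|     int main(int argc, char** argv) {
|       glutInit(&argc, argv);
|       glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
|       glutCreateWindow("triangle");
|       glutDisplayFunc(display);
|       glutMainLoop();
|       return 0;
|     }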
___________________________________________________________________
(page generated 2024-04-05 23:00 UTC)