[HN Gopher] The computers used to do 3D animation for Final Fant...
       ___________________________________________________________________
        
       The computers used to do 3D animation for Final Fantasy VII in 1996
        
       Author : marcobambini
       Score  : 442 points
       Date   : 2022-04-07 16:55 UTC (6 hours ago)
        
 (HTM) web link (lunduke.substack.com)
 (TXT) w3m dump (lunduke.substack.com)
        
       | Melatonic wrote:
       | Now I want to know what they were using to animate the 3D assets
       | that ran on the Sega Genesis (1988) and the add-on Sega 32X
       | (released 1994)!
        
       | ei8ths wrote:
        | one thing i don't miss: those monitors.
        
       | jscheel wrote:
       | I studied 3d animation in college from 2001-2004. Our lab was
       | outfitted with tons of SGI Octane workstations. By the end, we
        | were getting better performance out of the one lone Mac there,
       | though. Was such an awesome animation lab. I kinda miss those
       | days.
        
       | lispm wrote:
       | Such a Lisp Machine is usually one machine with two screens.
       | Typically it would be a XL1200 (or earlier an XL400). It would
       | have a black&white console and a color screen. The color screen
       | would be driven by a color graphics card, possibly a FrameThrower
       | - which is an accelerated graphics card.
       | 
       | The graphics editor seen is just the S-Paint part of S-Graphics -
       | it could use a FrameThrower, but also other graphics cards. There
       | was also S-Paint on the MacIvory running in a Macintosh.
       | S-Graphics also ran on earlier Lisp Machines from Symbolics, like
       | a 3670 from 1984.
       | 
       | A bunch of TV studios, video production companies, animation
       | studios and game developers were customers.
       | 
        | https://www.youtube.com/watch?v=Cwer_xKrmI4 from 6:09 shows
        | the use of such a Paint system on a Symbolics.
       | 
       | The software it runs is S-Graphics, which was later ported to
       | SGIs and Windows machines as N-World, by Nichimen (then using
       | Allegro CL from Franz Inc.).
        
         | eddieh wrote:
         | I had no idea these machines existed. When tasked with writing
         | a ray tracer way back in college, the first thing I did was
         | create a scene description format based on S-expressions.
         | Yesterday I was nostalgically looking at backups of the
         | assignment and found my first scene file:
          |     (camera 0 0 -20)
          | 
          |     ; red sphere
          |     (sphere 0 1 50 5 (material (color 255 127 127) 0.8 0.8))
          | 
          |     ; green sphere
          |     (sphere -20 1 200 9 (material (color 127 255 127) 0.8 0.8))
          | 
          |     (plane 0 1 0 20 (material (color 127 127 255) 0.8 0.8))
         | 
         | I went so far as to make the scenes scriptable with Guile. I
         | had another scene that procedurally generated spheres
         | positioned about a helix, but that seems to be lost to the bit
         | gods.
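          | 
          | For flavor, here's a rough sketch of that helix script in
          | Guile - reconstructing the idea rather than the lost file,
          | so the radius, pitch, and material numbers are made-up
          | stand-ins:
          | 
          |     (use-modules (ice-9 format))
          | 
          |     ;; Emit a camera plus n spheres spaced along a helix,
          |     ;; in the same scene format as above. All constants
          |     ;; are illustrative guesses, not the originals.
          |     (define material "(material (color 255 127 127) 0.8 0.8)")
          | 
          |     (define (helix-scene n)
          |       (format #t "(camera 0 0 -20)~%")
          |       (do ((i 0 (+ i 1))) ((= i n))
          |         (let* ((t (* i 0.5))            ; angle along the helix
          |                (x (* 8.0 (cos t)))      ; helix radius
          |                (y (- (* 1.5 t) 10.0))   ; vertical pitch
          |                (z (+ 100.0 (* 8.0 (sin t)))))
          |           (format #t "(sphere ~,2f ~,2f ~,2f 3 ~a)~%"
          |                   x y z material))))
          | 
          |     (helix-scene 24)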
         | 
         | To me, there's something very natural about using S-expressions
         | to create graphics. I wish there was a video (with a high
         | enough resolution) that shows the Lisp interactions--especially
         | in the subdivision modeler.
        
           | lispm wrote:
           | https://www.youtube.com/watch?v=gV5obrYaogU&t=5s
        
         | miltondts wrote:
          | The more I look at software from the 80s-90s, or even the
          | 60s, the more it seems it has barely moved in terms of
          | capabilities.
        
       | Sohcahtoa82 wrote:
       | > Originally released in 1993, the Onyx from SiliconGraphics was
       | an absolute powerhouse. The machines were powered by between one
       | and four MIPS processors (originally the R4400's) -- ranging from
       | 100 MHz to 250 MHz.
       | 
       | 250 MHz in 1993 is insanity, considering that was the 33 MHz 486
       | era.
       | 
       | > The RAM on these machines were not industry standard [...] and
       | could handle up to 8 GB of RAM. 16 GB in the rackmount version
       | (yeah, there was a massive rackmount version).
       | 
       | 8 gig of RAM at a time when home users didn't even have 1 GB hard
       | drives. 16 GB of RAM at a time when a home user's desktop could
       | read memory at < 100 MB/s.
       | 
        | Having those specs then would be like running a 37 GHz CPU
        | with 16 TB of RAM now (the same ~7.5x clock and ~1000x RAM
        | multipliers).
        
         | qbasic_forever wrote:
         | Their price reflected their capabilities. I think they were
         | something like $20k on the low end. A nice new car in 1993 was
         | maybe $10k. Lots of homes even in good cities sold for $20k.
         | 
         | edit: Wow these were actually $100-250k! Back in 1993 that was
         | an immense amount of money. I bet you could have bought a nice
         | San Francisco row house in the mission or other hot area for
         | $100k back then.
        
           | jweir wrote:
           | In 1993 I bought a used SGI Personal Iris for 10k.
           | 
            | 32MB RAM, 1GB hard drive, 19" monitor. Monitor and system
            | weighed 80lbs if I remember correctly.
           | 
           | Alias was an additional 16k.
           | 
           | It sounded like a jet engine.
           | 
           | I loved it.
        
             | tomatowurst wrote:
              | omg. i have magazines from this era. what were you
              | using it for? tell us more!
        
             | holoduke wrote:
              | I wonder where all those machines ended up. Would love
              | to get one and play with it.
        
               | [deleted]
        
               | smm11 wrote:
               | I gave one away several years back to a high school art
               | department. The other one is in my garage under two
               | inches of dust.
        
               | tomatowurst wrote:
                | well you should consider auctioning that, it will
                | fetch a lot of eager attention
        
               | notreallyserio wrote:
               | Including the dust, if it is from the same era!
        
               | tomatowurst wrote:
               | put that in a non-descript zip bag and include it in the
               | shipping box!
        
           | eternityforest wrote:
           | If only housing had the same kind of price drop
        
             | ajmurmann wrote:
              | If regulations for building houses were comparable to
              | those for building computers...
              | 
              | Just think: you'd have to ask the most opinionated
              | people in your town every time you wanted to get a new
              | computer, and potentially ask people in your
              | neighborhood when you wanted to make minor changes to
              | your existing computer...
        
               | Gollapalli wrote:
               | Microsoft is the new HOA, and Google the new building
               | commission.
        
             | imtringued wrote:
             | Houses got bigger faster than they got cheaper. That's on
             | top of land going up in value.
        
               | eternityforest wrote:
               | Yeah that's the problem, you can't just make more land,
               | so there's no incentive to do anything other than make
               | any area you own into an expensive area.
        
           | brailsafe wrote:
           | must be nice to have made a 30x return on your house in 30
           | years
        
             | tomatowurst wrote:
              | how about a 50x return? A home that cost $100,000 CAD
              | is now well past $5,000,000 where I live. If someone
              | hodl'd their property in Vancouver even longer than
              | that, they would be seeing a 100x return.
        
             | namecheapTA wrote:
             | In the central valley of California, a house cost $130k in
             | 1993 and today sells for $400k. In those 30 years you
             | probably painted it a few times, changed the roof just
             | recently, changed the kitchen, and so forth. Plus 30 years
             | of property tax. So you're probably in for $200k on that
             | $130k house. And that's not even counting interest on
             | mortgages that most buyers had in 1993.
             | 
             | Doubling your money in 30 years isn't great at all. And
             | this is within about 50 miles of tech jobs, although the
             | drive will take 90 minutes in the morning and 60 minutes at
             | night.
        
               | peregren wrote:
               | You also get to live in the house which is a pretty good
               | return.
        
               | vikingerik wrote:
               | Houses do have utility. Doubling your money _in an asset
               | that provided living space for 30 years instead of paying
               | rent_ is pretty great.
        
               | namecheapTA wrote:
                | Ok, yes, but say this was a rental home and the
                | renters basically paid the mortgage, so you didn't get
                | any utility out of it yourself. 30 years later you
                | made $200k, even if you put basically nothing down. As
                | a return it's a lot percentage-wise. Overall though,
                | it's not life-changing for most people. It only gets
                | to crazy money in the most desirable of areas. And
                | that desire circle is basically described as driving
                | range from the highest-paying jobs.
        
               | Mikeb85 wrote:
               | But they don't have so much utility that they should
               | offer such returns on top of depreciation. The market is
               | broken.
        
               | iso1210 wrote:
                | It's not the house that appreciates much in value,
                | it's the land the house sits on.
        
               | ChuckNorris89 wrote:
               | _> In the central valley of California, a house cost
               | $130k in 1993 and today sells for $400k_
               | 
               | Holy shiz, is California real estate really that cheap?
               | 
                | 400K is a very basic house in the outskirts of a city
                | in Austria (Europe), and tech jobs here pay 1/4 of
                | what you can make in California. I feel we're being
                | scammed over here with housing and wages.
        
               | johnnyanmac wrote:
                | If you're willing to basically live in the boonies,
                | yes. But you're not finding housing in or around the
                | cities for less than a million in California. The
                | median house in my area (after a skim on Zillow) is
                | 800k and I'm a good hour north of downtown Los
                | Angeles. Move another 30 minutes north (pretty much
                | literal desert) and housing is more around those
                | 400-500k numbers.
               | 
               | But who knows? With WFH being more accepted I can see
               | some less city oriented folk moving out to those areas
               | and gentrifying it. It may be desert, but it still has
               | everything you'd want out of a neighborhood outside of
               | entertainment.
        
               | sneak wrote:
               | The central valley isn't really "on the outskirts of the
               | city" in the American sense. It's a lot of farmland.
        
               | [deleted]
        
               | [deleted]
        
               | namecheapTA wrote:
                | If houses are so expensive vs labor, you could always
                | buy some land and have one built? The central valley
                | city I was talking about is Tracy, CA. Without
                | traffic, it takes 1 hour of driving to get to the tech
                | companies. Monday to Friday at rush hour it would take
                | 2 hours. 90 minutes offpeak, at 9am instead of 7am.
                | Virtually zero tech workers are willing to make that
                | drive.
               | 
               | Also, quality of life in Tracy is pretty poor. Property
               | crime is pretty big. Virtually nothing for children to
               | do. Young adults go to the same 5-6 average restaurants
               | and that's it. I'd much prefer to live in a town near a
               | nice city in Austria.
        
               | aeyes wrote:
                | Labor is expensive though. It's just that your income
                | in relation to labor cost is relatively low due to
                | high taxes, health insurance, pension fund and so on.
                | Including your employer's payments, you go home with
                | maybe 40% of what the company pays for your work.
               | 
               | Building even a simple new home will set you back at
               | least 300k.
        
               | namecheapTA wrote:
               | I guess quit your job for a year and build your own home!
               | Maybe get into the home building business altogether.
        
               | Melatonic wrote:
               | You do not want to live in the central valley - that is
               | why it is so cheap
        
             | qbasic_forever wrote:
             | The real winners are the boomer generation. When they
             | graduated college in the mid 60's wages were enormous
             | relative to the cost of property. Property in all the now
             | hotly competitive areas was dirt cheap too. If you bought
             | in SF back then you've probably made 100x or more on the
             | property value.
        
           | stergios wrote:
           | I think the house prices were higher. I sold a house in
           | Mountain View (Monta Loma neighborhood) in Jan 1995 for
           | $255k. And that was a bit of a distressed sale as I was the
           | executor for an estate.
        
           | johnnyanmac wrote:
           | >I bet you could have bought a nice San Francisco row house
           | in the mission or other hot area for $100k back then.
           | 
           | To be honest, the idea of even getting a down payment of 100K
           | for a house like that in SF is insanity. Crazy how prices
           | skyrocketed in 30 years (I'm guessing 2008 didn't help much).
        
           | Razengan wrote:
           | I loved QBasic
        
         | bluedino wrote:
         | > 250 MHz in 1993 is insanity, considering that was the 33 MHz
         | 486 era.
         | 
          | To be fair I don't think the 200MHz chips came out until
          | 1995, when you could also get a Pentium Pro at similar
          | speeds.
        
         | jiggawatts wrote:
         | > a 37 Ghz CPU with 16 TB of RAM now.
         | 
          | Something common in the hypervisor admin space is expressing
          | compute capacity by multiplying the core count by the clock
          | speed.
         | 
         | So for example a 64 core EPYC at 2.5GHz is written down in
         | documentation as a 160 GHz processor.
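          | 
          | As a toy sketch of that bookkeeping (illustrative only; it
          | says nothing about real per-core performance):
          | 
          |     ;; Aggregate "capacity" = count * clock, summed over
          |     ;; groups of identical CPUs: ((count ghz) ...)
          |     (define (aggregate-ghz groups)
          |       (apply + (map (lambda (g) (* (car g) (cadr g)))
          |                     groups)))
          | 
          |     (aggregate-ghz '((64 2.5)))   ; => 160.0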
         | 
         | So these Onyx desktops are directly comparable to artists using
         | a high-end Mac Pro or Threadripper workstation.
        
           | zozbot234 wrote:
           | OTOH, these machines were already quad processors (at the
           | high end). And I don't know if you can feasibly have 16TB of
           | RAM in a single compute node. In that case you'd need a rack-
           | scale multi-node system like the gear the Oxide folks are
           | pushing for "hyperscaler" workloads. (My guess is that such a
           | thing _could_ be made useful to a single user - along the
            | lines of Alan Kay's quote referenced in a sibling comment -
           | but it would likely need to be something that involves
           | chewing through humongous amounts of data, to really make use
           | of that scale. Not sure if the art-creation use case has any
           | real need for that nowadays. Some sort of compute-heavy data
           | analytics for decision support and the like would be a lot
           | closer to the mark.)
        
           | tomxor wrote:
           | If you want to use that metric then the Onyx would be:
           | 
           | 4x 250MHz R4400s + 12x 40MHz Intel i860s = 1480 MHz when
           | maxed out.
           | 
            | But clock speed is a very poor metric for comparing older
            | CPUs. These CPUs were much smaller; the number of
            | instructions they could cram into each clock cycle and the
            | complexity of those instructions were no doubt far lower,
            | and the variety of instructions and ability to specialise
            | far lower (I'm not even considering the whole RISC and
            | MIPS thing, just the fact that they were working with much
            | smaller dies and far fewer, larger gates)... Instructions
            | per second might be a better comparison, but then you
            | still lack that qualitative difference in the variety of
            | instructions, as I said.
        
         | p_l wrote:
          | Another CPU with a similar clock in that era was the Alpha,
          | whose first widely available parts ran at 150MHz, quickly
          | updated to 200MHz
        
         | slightwinder wrote:
         | > 250 MHz in 1993 is insanity, considering that was the 33 MHz
         | 486 era.
         | 
          | More precisely, it was the start of the Pentium era. The
          | first Pentiums at 60/66 MHz were released in March 1993. But
          | interestingly enough, the R4400 was already 64-bit.
        
           | theodric wrote:
           | My Indigo 2 R4400SC couldn't run 64-bit IRIX. For that I had
           | to swap the motherboard and CPU for an R10000.
        
         | sandos wrote:
          | I once worked on a military simulator, and it ran on very
          | similar hardware; it might even have been the Onyx. Actually
          | yes, I do believe that's it now that I've googled it! I
          | worked on this in 2011-12, and I can tell you compiling stuff
          | was not fast! It was kind of funny thinking that this had
          | once been a "supercomputer"... now it was slower than even
          | old desktops!
        
         | downrightmike wrote:
        | I just wish they would find a better port to remaster. The
        | current one is a low-resolution mess and looks far worse on
        | PS4 than the original did on PS1, because it was based on a
        | bad PC port from back in the day and those were the only files
        | they could find.
        
           | louhike wrote:
            | Unfortunately they lost all the files of the original
            | game, so they can only:
            | 
            | - Use the PC port
            | - Emulate the PS1 game
            | - Rebuild it (which will take a lot of time and resources)
        
             | colordrops wrote:
             | > Unfortunately they lost all the files of the original
             | game
             | 
             | I'm always flabbergasted when I hear of something like
             | this. Such as NASA losing many videos of the moon landing.
             | How does that happen?!
             | 
             | Could the existing copies of the game out there be reverse
             | engineered?
        
               | johnnyanmac wrote:
               | > How does that happen?!
               | 
               | to be frank, carelessness.
               | 
                | in the sympathetic sense, this was decades ago, wear
                | and tear happens, and corporate orgs are messy. When
                | Squaresoft merged with Enix, some stuff, important
                | stuff even, was inevitably going to get lost in the
                | move, even if they were perfectly careful to keep
                | things archived.
               | 
                | But from their own mouths, it was just a different
                | time when archiving wasn't important:
               | https://www.rockpapershotgun.com/square-enix-digital-
               | preserv...
               | 
               | > The concept of games preservation is a relatively
               | recent one, and as Matsuda admits: "It's very hard to
               | find them sometimes, because back in the day you just
               | made them and put them out there and you were done - you
               | didn't think of how you were going to sell them down the
               | road. Sometimes customers ask, 'Why haven't you released
               | that [game] yet?' And the truth of the matter is it's
               | because we don't know where it has gone."
               | 
                | In the 80's it was just a small wing of developers
                | trying to put out a toy. They weren't thinking of the
                | original Final Fantasy as the Sistine Chapel of gaming.
                | They thought as much about preserving it as I did about
                | preserving that game jam game I made almost a decade
                | ago. If it hadn't been as easy for me to preserve it as
                | throwing it on Dropbox, I probably woulda lost my
                | source code too.
               | 
               | >Could the existing copies of the game out there be
               | reverse engineered?
               | 
               | It's physically possible. But the effort to extract
               | assets, scene data, and source code and re-configure that
               | into a clean port is much more gargantuan than using the
               | source from "a bad port" and working from there.
               | 
                | Even if they did manage to do that, there's another
                | thing about video game code that's only now starting to
                | be less true: it's an utter mess of spaghetti code.
                | Games weren't made to be maintained for years by a
                | revolving door of developers like a website, and few
                | people will ever see the code. They get something
                | working and leave it there as long as it's not
                | bothering anyone. Coding standards were very loose.
               | 
               | I'm sure many small intricacies people come to appreciate
               | were merely products of rushed development at the 11th
               | hour, with non-important bugs that just stayed in the
               | final product. Even if they preserved the code, those
               | imperfections fans appreciate would likely be patched out
               | anyways (See: Backwards long jump in Super Mario 64 being
               | removed in subsequent releases).
        
       | shaunxcode wrote:
       | followed the link to see lisp machines - was not disappointed!
        
       | depingus wrote:
       | Wow this brought back memories! In the late 90's, I got to work
       | on SGI machines at college learning 3D animation in Maya v1! The
       | school had 4 labs with about 30 SGI O2's in each; all networked
       | with no security. I could send messages from my workstation to
       | the teacher's open terminal session.
       | 
       | No one there knew (or cared to learn) IRIX. When they converted
       | the biggest lab to Windows NT4 everyone abandoned the SGI
       | machines. Which worked out great for me, because it was much more
       | peaceful in the SGI labs compared to the NT4 lab. Some of those
       | StarCraft matches could get kinda rowdy!
        
       | jgrahamc wrote:
       | This sort of archaeology is great fun. I spent a huge amount of
       | time figuring out what was happening on the screens in the first
       | Westworld film (1973):
       | https://www.youtube.com/watch?v=UzvbAm0y8YQ
        
         | cellularmitosis wrote:
         | I recently put together a timeline of tech-ish things, to help
         | put events like this into perspective:
         | https://gist.github.com/cellularmitosis/9a1b96ed3109690a2840...
         | 
         | Good for a few surprises: "wow, python is older than win 3.1!"
        
       | npunt wrote:
       | Oh man the memories of this time. Around 97-98 was when
       | workstations were on the way out and workstation cards were on
       | the way in, but regardless these SGI boxes were just so
       | lustworthy - the style, the performance, the _otherness_ and
       | clear superiority in all dimensions to my lowly hacked together
       | PC.
       | 
       | I was just a teen getting into 3D animation & game design in
       | 1998, and since I couldn't ask my parents to mortgage the house
       | to buy one, I wound up picking up a workstation card instead - a
       | footlong Dynamic Pictures Oxygen 402 with 32mb ram and four
       | 3dlabs chips - for a much more reasonable $750 used. I think
        | about a year and a half earlier these went for $4k new; that was
        | the pace of 3D innovation at the time. It suited me really well
       | to learn Softimage 3D on until I got a job at Pandemic Studios as
       | an artist/designer. Even this beast of a workstation card
       | couldn't run Quake without errors though, there was still a
       | separation of functionality between consumer 3D accelerators like
       | 3dfx and the $1k+ workstation ones.
        
       | throwmeariver1 wrote:
        | In another life I worked on an SGI Onyx for print prepress of
        | rotogravure cylinders. Now I am working in VFX, and when I
        | read up on the history of it the Onyx usually pops up; even
        | though I did something completely different than the VFX
        | artists at the time, I get nostalgic.
        
       | theonething wrote:
       | The SGI Indy "pizza box" was also a VFX and 3D animation classic
       | in that era.
        
         | bluedino wrote:
         | The Indy didn't have any 3D hardware in it.
        
           | theonething wrote:
           | No, it didn't, but the R5000 CPU had an enhanced instruction
           | set that could do 3D rendering in software pretty well at the
           | time.
           | 
           | We used them at the VFX software shop I was interning at
           | during those times.
        
           | jeffbee wrote:
           | Really wondering how the Indy has this hagiographic
           | reputation. It was, as far as I could tell at the time, the
           | slowest and generally worst workstation you could buy. People
           | bought them because they were waiting for unix technical
           | software to be ported to Windows NT on x86 and didn't want to
           | spend $100k per seat on RISC workstations they knew were
           | already obsolete.
        
             | fit2rule wrote:
        
       | midnightclubbed wrote:
       | I worked at Rareware in that same era, similarly ridiculous
       | amounts of SGI hardware in the building. As I recall each artist
        | had an SGI Indigo2, and later the SGI O2 became the standard
        | artist workstation. I believe our lead artist used an SGI
        | Onyx. Programmers had Indys with the internal N64 development
        | boards.
       | 
        | There were at least 2 rack-mounted SGI machines used for
        | large-scale rendering jobs (i.e. promotional images, magazine
        | covers etc). They may have been SGI Challenges (I know one
        | certainly was) and were kept off-limits to most staff; at the
        | time they were rumored to cost $250k each.
        
         | Melatonic wrote:
         | Get us another Banjo Kazooie already!
        
         | rootsudo wrote:
          | That's really cool. :) I wonder if you could share any
          | stories of working at Rareware during that time period?
        
           | tomatowurst wrote:
            | ohhh i fantasize about being a 3d artist / producer in
            | the 90s. i collect whatever I can find from this era. I
            | will make use of my Net Yaroze and release a PS1 game one
            | day!!!
        
         | BolexNOLA wrote:
         | A lot of what I read about Rare talks about insane
         | hours/relatively high turnover in staff. Did you find that to
         | be the case?
        
         | samstave wrote:
         | I went to Animation school and learned on Indigo's and O2s for
         | Maya and Softimage... ~1994 -> 1996
         | 
         | Years later when we were collapsing ILM into the new Presidio
         | Campus, the amount of SGI full-rack sized machines being thrown
         | into the trash was insane.
         | 
         | I believe they turned at least one into a keg-erator...
         | 
            | I could have had an opportunity to get one of the
            | cabinets, but I didn't have a place to put it. Wish I had
            | figured out a place to keep one.
         | 
         | The SGI cases were a thing of beauty.
        
           | mixtur2021 wrote:
           | Do you mean Power Animator perhaps and not Maya? I believe
           | Maya, Power Animator's successor, came later around ~1998.
        
           | ______-_-______ wrote:
           | SGI was a key part of some of the most iconic early games
           | (even Nintendo used them!) There isn't much of an emulation
           | scene, probably because of how specialized they were, the
           | weird architecture, and the diversity of workstations. And
           | lots of the old machines were just tossed after they became
           | obsolete. It's a huge shame from a preservation point of
           | view.
        
         | ramesh31 wrote:
         | >I worked at Rareware in that same era, similarly ridiculous
         | amounts of SGI hardware in the building.
         | 
         | I've always been fascinated with Rareware. For such a tiny
         | studio, the level of quality in the games they put out during
         | that era is completely unparalleled, and many can justifiably
         | still be held up as the greatest ever made.
         | 
         | What was the secret sauce? What was it like working there? How
         | was the culture?
        
           | oh_sigh wrote:
            | From Martin Hollis' Wikipedia page (project head for
            | GoldenEye 007):
           | 
           | > Hollis remarked that he worked non-stop on the game,
           | "[averaging] an 80 hour week over the 2 and a half years of
           | the project", and that the team he recruited was very
           | talented and dedicated even though most of it was composed of
           | people who had never worked on video games.
           | 
           | I guess the answer is insane levels of talent + dedication,
           | and don't worry too much about domain expertise. Probably my
           | favorite factoid out of there is that Goldeneye multiplayer
           | was an afterthought, and basically one dude hacked it
           | together in a couple of weeks at the end of the development
           | cycle.
        
           | valley_guy_12 wrote:
           | As an outsider, I believe Rareware's secret sauce was a
           | combination of a can-do, down-to-the-metal, fast-feedback-
           | loop game development style that came from the pre-PC British
           | bedroom game coders, a management team that understood how to
           | manage game development and releases, and Nintendo's coaching
           | on mascot development and general game polishing.
           | 
           | Rareware's talents were big advantages in the early 3D game
           | console era. But by the PS2 / Xbox era, their special skills
           | didn't help as much.
           | 
           | Today I'd say that Epic's Fortnite is the spiritual successor
           | of the old Rareware.
        
             | kmeisthax wrote:
             | To continue on the similarities, Epic and Rareware also
             | had/have problems with worker burnout. Pretty much every
             | N64 Rareware classic drove at least a few people out of the
             | company. By the time Nintendo and Microsoft got into a
             | bidding war over the company there wasn't much talent left
             | in it[0]. Fortnite is the same way: the fast pace of
             | content churn means people are working constant overtime,
             | and the perpetual nature of the game means there's no
             | release that you're crunching _for_.
             | 
             | I would disagree that this was good management, though.
             | Burning out your talent is how and why game studios fall
             | apart over time. Had they retained talent and kept crunch
             | time low they probably would have continued churning out
             | hits on the GameCube and Wii instead of stinkers on the
             | Xbox. In fact, Nintendo probably understands this[1] - for
             | example, when Retro Studios imploded they bought them out
             | and immediately banned overtime work at the studio.
             | 
              | [0] Microsoft _didn't_ understand this, and this is why
             | they wound up overpaying for Rare.
             | 
              | [1] Or at least did in the Iwata era. No clue if
              | Kimishima or Furukawa have the same convictions, but
              | given that Nintendo hired them both internally I imagine
              | they do.
        
               | johnnyanmac wrote:
               | >I would disagree that this was good management, though.
               | 
               | well, "effective" management. Not necessarily good. Seems
               | like a story that pretty much all large gen 5 (and many
               | gen 6) studios share. It was this new cutting edge field
               | right before/after the dotcom bubble requiring (at the
               | time) very niche talent and passion. Perfect formula for
               | burn and churn.
               | 
               | This was likely one of the many thousand cuts the
               | industry faced when moving to the HD era in gen 7. You
               | couldn't just brute force a bunch of assets to work at
               | the expected HD fidelity without stepping back and
               | actually understanding what the machine is doing. You
               | couldn't just have two artists doing everything for asset
               | production; you needed an organized pipeline of
               | specialists. You absolutely needed a
               | producer/manager/director to make sure pieces are fitting
               | together. Huge wakeup call for game developers on
               | software/business practices most other parts of the
               | industry had to employ for years.
        
             | AdmiralAsshat wrote:
             | > Rareware's talents were big advantages in the early 3D
             | game console era. But by the PS2 / Xbox era, their special
             | skills didn't help as much.
             | 
              | StarFox Adventures on the GameCube was probably their
              | last "holy crap" game from a technical perspective.
              | There wasn't anything else at the time, on any console,
              | that did realistic-looking fur as well as that game:
             | 
             | https://i.pinimg.com/originals/69/d1/9b/69d19b00eb25ffde0f1
             | d...
        
       | paulpauper wrote:
       | >The RAM on these machines were not industry standard -- they
       | were proprietary, 200 pin SGI RAM modules available in 16MB,
       | 64MB, or 256MB variants. The memory board (known as MC3), had
       | slots for 32 memory modules -- and could handle up to 8 GB of
       | RAM. 16 GB in the rackmount version (yeah, there was a massive
       | rackmount version).
       | 
       | >Think about that for just a moment. This was the mid-1990s.
       | 
        | Let's pose the more theoretical question of what the most
        | powerful computer is that could be built if enough resources
        | were summoned to make it. We're talking a single rackmount.
        | What about a rackmount the size of a city?
        
         | vmception wrote:
          | at this point it's more about which metric of "powerful"
          | you're comparing to. for example, nobody wants a liquid-
          | nitrogen-cooled 8 gigahertz CPU any more, and measuring
          | "flops" isn't that useful either. the old Silicon Graphics
          | machines didn't have a single GPU in them and would not be
          | able to render shaders or hold/manipulate rasterized
          | textures of basically any resolution; they just threw a lot
          | of energy at a problem via a now-antiquated and useless
          | approach to that problem.
          | 
          | and finally, if you make something with a bunch of custom
          | chips with a bunch of pins with high bandwidth, then you
          | don't have another metric to compare it to something else.
         | 
         | I'm open to the thought exercise but I can predict which
         | directions the conversation would go, I'm just content with the
         | variety of metrics on CPUmark these days.
        
           | [deleted]
        
         | fuzzy2 wrote:
         | Not possible because it could not be a single computer node.
         | Electrical signals simply are too slow.
        
         | tinus_hn wrote:
         | SGI was building these enormous supercomputers with the NUMA
          | architecture, kind of like a cluster of super fast units
          | with fast interconnects and an OS and support software so
          | you could actually use it. This is one of the less
          | photogenic setups:
         | 
         | https://en.wikipedia.org/wiki/Altix#/media/File%3AUs-nasa-co...
         | 
         | If you can make your problem fit the architecture you can work
         | on enormous tasks. It wasn't bad stuff at all but probably
         | impossible to make money on. Commodity hardware improved so
         | fast they couldn't keep up.
        
         | shadowofneptune wrote:
         | I would imagine it'd be similar in architecture to existing
         | supercomputers: many identical compute units connected over a
          | network. Scaling a single rackmount computer design up to
          | the size of a city would not be practical; a network would
          | have to be included at some point to keep scaling
          | adequately.
        
       | lostcolony wrote:
        | A bit random, but the guy on the left must have been nodding
        | or something, and the camera being used had a slow shutter
        | speed. I did a double take on seeing two mouths.
        
       | mepian wrote:
       | I'm glad the middle screen is explained, Symbolics always gets
       | overshadowed by SGI. If you want to see it in action, watch this:
       | https://www.youtube.com/watch?v=gV5obrYaogU
        
       | opentokix wrote:
        | Windows NT4 and 3DS MAX R1 were released in 1996.
        
       | mirchiseth wrote:
       | Oh this post brought back so many fun memories of college days
       | playing Flight Simulator on SGI boxes in the computer lab. Having
        | used IRIX on an SGI Indy, Windows 3.1 on PCs seemed like a toy.
        
       | efficax wrote:
       | There used to be such a wonderful diversity of architectures,
       | operating systems and platforms in comparison to today's boring
       | landscape of really only 3 end user platforms and basically 2
        | viable server environments. Alas, I miss the old days
        
       | justinator wrote:
        | Hard question to ask, but what was the workflow used in
        | development? Two programmers sitting at many monitors powered
        | by x different workstations. What are they all doing, say,
        | here?
        
         | qbasic_forever wrote:
         | They were probably posing for a photo there. Pair programming
         | wasn't coined as a term or popularized by Kent Beck until the
         | early 2000s and 'Extreme Programming'.
        
           | p_l wrote:
            | Extreme Programming seemed to be in vogue among the early
            | adopters who led to the "agile manifesto" in the late
            | 1990s.
            | 
            | That said, it looks like a somewhat typical case of
            | discussing a bit of design, with the main artist sitting
            | at the workstation while the other person came over from
            | elsewhere.
        
       | aasasd wrote:
       | Forget the machines: I'm vaguely impressed by the controllers
       | with lots of weird buttons, with analog knobs, and I think with
       | some LCD screens--casually sitting before the monitors. These
       | days every home video editor can buy such things--but were people
       | doing much video editing in '96? Wonder how many of them were
       | sold in a year.
        
       | habibur wrote:
        | Also there was a Final Fantasy film produced at that time that
        | took 1000 workstations, 200 people and 4 years to render, at a
        | cost of $100m+. Though it made only $80m.
        
         | efsavage wrote:
         | And despite all that, by the time it was released, it was
         | really not very impressive graphically, which was all it had
         | going for it since it was a terrible film.
        
           | Pulcinella wrote:
           | Yeah the faces were alright for the time, but everything else
           | wasn't great. The environments in particular were pretty bad.
           | The opening scenes of the movie have the characters flying
           | through a burnt out wasteland and it really, effectively
           | looks like a 16x16 texture has been draped over several
           | square kilometers of mountains. Texture filtered, so no giant
           | chunky pixels, but it still looked awful. Absolutely no
           | detail. And this was one of the first things you saw in the
           | movie!
           | 
           | Also it was just dreadfully boring, which is basically the
           | worst thing a piece of entertainment can be, even worse than
           | the visuals.
        
             | latortuga wrote:
              | Amen! As an FF fan in the 90s, I was so incredibly hyped
              | about this movie. I remember it getting major press in
              | gaming magazines and going to it with my cousin. I also
              | fondly remember it as the only movie I've ever walked
              | out of.
        
             | guenthert wrote:
              | At the time of release, they were quite proud of the
              | animation of hair, I remember.
              | 
              | I actually thought the film was alright, if you're into
              | such spiritual things. But I can see that it didn't
              | appeal to the video gaming crowd.
        
               | Pulcinella wrote:
                | I could see what they were going for. A lot of the
                | pieces of good or functional ideas and hooks are
                | there: the central mystery of why there are all these
                | ghostly creatures everywhere, trying to solve this
                | mystery before it's too late, the uneasy alliance
                | between the scientists and the military (where both
                | are trying to solve the problem, both have different
                | ideas about how to do it, but both need resources from
                | the other), trying to convince others of things you
                | know are true but are difficult to understand or sound
                | crazy, trying to have functional and healthy
                | relationships in a crumbling world, etc. There is a
                | lot there that could work, and a lot of media has
                | similar themes and plot points; it's just that the
                | film doesn't do a very good job of it. It's not even
                | really worth a watch to see "it's so bad it's good" or
                | "let's see and laugh at Square's hundred million
                | dollar mistake", because as I said it's just kind of
                | mediocre and boring.
        
         | neogodless wrote:
          | Released in 2001 with a budget of $137m and $85m in revenue.
         | 
         | https://finalfantasy.fandom.com/wiki/Final_Fantasy:_The_Spir...
         | 
         | > The movie was created using a 3D CG tool called MAYA as well
         | as original tools created in Honolulu.
         | 
         | > By the time the final shots were rendered some of the earlier
         | ones had to be redone because they did not match anymore. Also,
         | the software used to create them had become more advanced (and
         | hence more detail was possible).
        
       | bitwize wrote:
       | In the video game I'm writing, I made the enemy computers blue
       | and purple in color -- as a tribute to SGI in the era when it
       | seemed RISC architecture really was gonna change everything.
        
         | tenebrisalietum wrote:
          | RISC did change everything - the primary computing device
          | of most users is an ARM-based phone, dependent on x86
          | servers in the cloud for much, though.
          | 
          | Of course it's arguable how RISCy ARM really is, but x86 is
          | the only CISC left in non-embedded computing anymore, and
          | really A) it's a hybrid with SIMD and a lot of recent
          | instructions, and B) the ISA is essentially internally
          | virtualized atop a microarchitecture which operates vastly
          | differently from the ISA - all the wacky stuff done for
          | performance is tucked away there until it rears its head
          | with Spectre-like issues and such.
          | 
          | Further, couldn't it be said that RISC changed Intel? One
          | wonders what Intel would have done had RISC not been making
          | gains against it in the 90's.
        
           | p_l wrote:
            | Microcoded CISC has been the state-of-the-art method of
            | implementing CISC CPUs since... the 1960s?
            | 
            | The one x86 that actually had a RISC core inside was the
            | AMD K5, which essentially used a 29050 core with an x86
            | frontend slapped on it (in great simplification). The
            | Am29k architecture is still used and produced by Honeywell
            | for their avionics systems.
        
       | mywittyname wrote:
       | I'm astonished by the amount of money Squaresoft was investing in
       | game development at the time. Obviously, it paid off big time for
       | them, but I can't imagine they realized the game would be as
       | successful as it was. If I'm honest, their follow-ups make it
       | seem like they never understood why the game was a success.
        
         | endorphine wrote:
         | > If I'm honest, their follow-ups make it seem like they never
         | understood why the game was a success.
         | 
          | This sounds like pretty simplistic reasoning to me. Do you
          | really believe this?
          | 
          | The fact that your follow-up game (or
          | movie/book/album/painting) wasn't as successful as the
          | previous one doesn't mean you don't understand why the
          | latter was a success. Understanding success and replicating
          | it are two different things.
         | 
         | Btw, I like FFVIII more than FFVII.
        
         | toto444 wrote:
          | Would you say you know why it was a success? This game has
          | had a massive impact on me and I am spending a lot of time
          | trying to understand why (along with FF6 and Chrono
          | Trigger). I have identified 3 things: the music plays a
          | massive role, the way emotions are conveyed by the posture
          | of the characters matters as well, and finally the
          | storytelling mixes story and battles in a way that can
          | hardly be recreated in another medium.
          | 
          | Typing this, I realise that does not explain why the
          | follow-ups were not as good.
        
           | Nition wrote:
           | A lot of us had never really played an RPG before, especially
           | if we never had an SNES. Most games had a generic poorly-
           | written story that might be a page in the manual instead of
           | even being in the game itself.
           | 
           | Then suddenly a game comes along on the popular console, and
           | everyone's playing it and talking about it, and it puts you
           | in this 3D living world, and a real story is happening with
            | real characters, with real conversations - there are even
            | _swear words_ in it! It's treating me like an adult! - and
           | yes the music is beautiful too, and the whole thing seems
           | impossibly huge, the world seems to go on forever and now I
           | can fly? And now I can go under the ocean? There was nothing
           | else like it, not on a mainstream console anyway.
        
           | mywittyname wrote:
           | > Would you say you know why it was a success?
           | 
           | Oh, no. But I can say that the game is still objectively
           | incredible. I recently did a full play through of the Steam
           | version with the 7th Heaven upgrades, and enjoyed every
           | minute of it. And I don't think it's all nostalgia either.
           | 
            | And yes, the music is incredible, as evidenced by the fact
            | that the Shin-Ra Orchestra still tours.
        
           | rkk3 wrote:
           | Chrono Trigger!
        
       | amelius wrote:
       | What happened to SGI?
        
         | vondur wrote:
          | Linux PCs with better bandwidth than proprietary Unix
          | systems, and video cards from Nvidia/ATI.
        
         | philipkglass wrote:
         | They made most of their money on expensive hardware. Their
         | workstation market was killed by Windows NT and OS X (later
         | Linux too) running on mass-market CPUs once graphics
         | accelerator boards became good enough. Their server market was
         | killed by Windows and Linux running on mass-market CPUs.
         | 
         | https://en.wikipedia.org/wiki/Silicon_Graphics#Decline
        
         | Sohcahtoa82 wrote:
         | SGI's days were numbered as soon as 3D accelerator cards for
         | PCs became a thing.
        
         | l1k wrote:
         | Filed chapter 11 twice.
         | 
         | Bad management made the wrong bet, thought Itanium and Windows
         | would take over the world.
         | 
         | But what really broke all UNIX workstation manufacturers' backs
         | was the unwillingness to cannibalize their products with
         | affordable machines. SGI workstations were not affordable to
         | students, so they got x86 machines instead and installed Linux.
         | Google was built with x86-based Linux boxes because that's what
         | the founders were using and could afford. UNIX workstation
         | manufacturers lost an entire generation of young engineers that
         | way. Apple eventually offered what they should have: Sleek,
         | affordable machines with a rock-solid UNIX underneath a
         | polished UI.
        
           | cartoonfoxes wrote:
            | I think it was 2001? that Industrial Light & Magic (ILM)
            | replaced their SGI workstations with Linux boxes running
            | RedHat 7.5 and powered by an Nvidia Quadro2 GPU.
        
             | samstave wrote:
              | That, and in ~2004 ILM, along with the rest of LucasFilm
              | and Games, moved to the Lucas Presidio... during which a
              | lot of SGI machines were scrapped.
             | 
             | Source: I was the designer of the datacenter and network
             | cabling infra for the Presidio.
        
           | zozbot234 wrote:
           | This is why some people are so excited about RISC-V, BTW -
           | they're re-enacting the exact same market play as x86 did
           | back then. Starting out from low-end hardware only good for
           | single-purpose use (we call that "embedded" these days) and
           | scaling up to something that can run a proper OS, with MMU
           | and virtual memory support. And doing it while beating
           | everyone else on price, as well as potentially on
           | performance.
        
         | cartoonfoxes wrote:
          | Bankrupt (2006), acquired by Rackable Systems (2009), then
          | acquired by HP Enterprise (2016).
        
           | p_l wrote:
           | Rackable did horrible things with the brand.
           | 
            | I believe HPE essentially continued only the UltraViolet
            | series (the Xeon-based continuation of the Altix series).
        
       | harel wrote:
       | I received a demo of a $250k SGI when I was about 14 (1990). It
       | powered a military F16 flight simulator and the experience was
       | nothing short of mind blowing. Those machines were tightly packed
       | magic.
        
         | samstave wrote:
          | When I was in Civil Air Patrol, we went to Fallon Naval Air
          | Station in Nevada, when they were still doing Top Gun there.
          | 
          | The flight review theatre was AMAZING: it had a huge screen,
          | and the graphics were 3D wireframe - but they had the entire
          | valley modeled, and with a huge trackball they could review
          | the flight scenes in 3D -- this was ~1988/89.
          | 
          | It was amazing... I am not sure if it was backed by SGI,
          | but based on your comment, I believe it would have been.
         | 
         | ---
         | 
          | I bought one of the early OpenGL-capable graphics cards from
          | Evans and Sutherland in ~1997 to run Softimage on Windows NT
          | with a dual PII 266 based machine...
          | 
          | The card had 32MB of graphics RAM. It cost me $1,699 -- and
          | it was a full-length AT board.
         | 
         | I was trying to get an O2 -- but it was way out of my price
         | range.
        
         | usefulcat wrote:
         | My first job out of college was implementing the image
         | generator for the simulator for the landing signal officer
         | (LSO) on the USS Nimitz.
         | 
         | It ran on an 8 CPU SGI Onyx that was about the size of a
         | refrigerator. The view was from the position of the LSO, at the
         | aft end of the carrier deck. In the actual installation, the
         | images were projected onto curved screens giving a 270 degree
         | FOV. I do wish I could have seen the final product!
        
         | esaym wrote:
         | You were 14 and got $250k worth of hardware to demo? You got
         | some explaining to do... (please)
        
           | p_l wrote:
           | "Demo" can mean a lot of things. Getting hands on a VIP pass
           | at an airshow meant I received, as an 8 year old, a rather
           | comprehensive demo... _of JAS-39 Gripen multirole fighter
           | jet_. Just the seat I sat for half an hour cost $250k.
           | 
           | Sometimes you can get yourself into really interesting places
           | :)
        
           | harel wrote:
            | Sorry, I've posted this before so I always feel like I'm
            | blabbering if I repeat it - but here goes. I was 14; my dad
            | had a print business and printed all the cockpit panels
            | for a private company developing an F16 simulator for the
            | Israeli air force. He took me to their offices one
            | weekend. The setup was a full 180 degree screen
            | projection, a realistic 1:1 F16 cockpit with all the
            | panels and buttons etc, and an SGI running the show. They
            | gave me the spinning Beetle car demo, and then sat me down
            | to fly. That day left a hard imprint (including the price
            | tag on the SGI, which they were proud to mention). I was
            | an Amiga kid, and to top it all off, the other room had
            | what seemed like hundreds of Amigas, which were used to
            | build 3D models for the simulator.
        
           | neogodless wrote:
           | They said they received a demo. That is to say they were
           | among people who got a few minutes to witness the $250k worth
           | of hardware in action.
        
           | qbasic_forever wrote:
           | I remember that flight simulator demo, it was something
           | you'd find at events and trade shows or even super fancy
           | arcades. This was back in the first VR 'boom' and the tail
           | end of the era of arcades. Some companies used SGI and
           | similar powerful workstations to build simulator game
           | pods, like for MechWarrior and spacecraft racing games.
           | People would pay for a 5-10 minute session in one.
        
       | ChuckNorris89 wrote:
       | Ah yes, good ol' SGI, it always brings a smile to my face reading
       | these old war stories.
       | 
       | I wish we could get some insight on the development of the first
       | successful 3D game on PC, Quake by id Software, as there's a
       | famous picture of John Carmack sitting in front of some SGI
       | workstation with a monitor with a resolution of 1920x1080 (in
       | 1995!)
       | 
       | Also, SGI powered most VFX studios of that era, so many great
       | movies went through those machines before ending up on the big
       | screen.
       | 
       | It's insane how quickly 3dfx, Nvidia and Intel x86 consumer
       | hardware made SGI workstations overpriced and completely obsolete
       | within the span of just a few years. The '90s were a blast.
       | 
       | But still, I'm sad to see SGI go, as their funky shaped and
       | brightly colored workstations and monitors had the best
       | industrial design[1] in an era of depressing beige, grey or black
       | square boxes.
       | 
       | [1]
       | https://preview.redd.it/tt3ziuwt98o31.jpg?auto=webp&s=e5cc61...
        
         | MisterTea wrote:
         | > But still, I'm sad to see SGI go, as their funky shaped and
         | brightly colored workstations and monitors had the best
         | industrial design[1] in an era of depressing beige, grey or
         | black square boxes.
         | 
         | This is what I miss most about the 90's and proprietary
         | computer vendors: the exotic fun looking cases they had vs
         | boring beige Mac and PC cases.
         | 
         | I have an SGI 230 which was a last-ditch effort to stay
         | relevant by offering a regular x86 machine in an ATX SGI case.
         | Unfortunately SGI had ditched the snazzy cube logo by then so
         | it only has the lame Fisher-Price sgi logo stenciled on it. It
         | now houses a 12 core Threadripper running Void musl.
         | 
         | Before that workstation, the 320 and 540 used Intel P3/Xeon
         | chips on proprietary motherboards, a proprietary 3D
         | GPU/chipset, and even proprietary RAM modules. They only ran
         | NT4 and were a miserable failure of a machine.
        
           | aftbit wrote:
           | Why Void? And why Void musl?
        
         | nebula8804 wrote:
         | Love those designs as the colors are now a direct
         | representation of the 90s.
        
         | markus_zhang wrote:
         | Just curious, what are the state-of-the-art graphics
         | workstations nowadays? Mac Pro?
        
           | walrus01 wrote:
           | really, really beefy dual socket xeon probably, or amd epyc,
           | with 2TB+ RAM
        
             | markus_zhang wrote:
             | Got it. I Googled around and was surprised to find
             | vendors who basically build workstations from
             | off-the-shelf components. Not sure whether they are
             | mainstream though.
        
             | Certified wrote:
             | Surprisingly, because most CAD and computer graphics
             | programs are still largely not optimized for multi-core
             | processors, the fastest workstations typically use
             | whatever processor tops the single-core compute
             | performance category, not the multicore one. They are
             | then paired with as much RAM as the chipset supports.
             | 
             | Also, when you are talking about pro graphics cards like
             | the Radeon Pro and RTX A (formerly called Quadro) lines,
             | it only pays to upgrade to the next-gen graphics card
             | once your software vendor has had a year or two to
             | integrate with the new hardware gen's capabilities. The
             | pro gfx card market (at least where it pertains to
             | OpenGL performance) is one area that will actually
             | punish you for being too early an adopter, which is
             | disappointing when cards go for several thousands of
             | dollars new.
             | 
             | The whole area of CG software has been stagnating for 5
             | years while OpenGL driver improvements have fallen out
             | of favor for more bare-metal processing approaches like
             | Vulkan that are only now getting to feature parody and
             | developer adoption. Hopefully the next few years bring a
             | positive trend in CG price to performance again as these
             | new architectures actually start shipping in CG software
             | products. As someone who works daily in CAD, the
             | performance stagnation over the last 5-10 years has been
             | depressing to say the least.
        
               | markus_zhang wrote:
               | Thanks. This is totally new to me. I guess it's the
               | same picture for professional-level designers or game
               | designers who basically work in a CAD-like
               | environment? For example, the people who design levels
               | and scripts for games such as Skyrim.
        
               | blevin wrote:
               | Feature parody is such a good turn of phrase.
        
         | pengaru wrote:
         | > It's insane how quickly Nvidia and Intel x86 consumer
         | hardware made SGI workstations overpriced and completely
         | obsolete within the span of just a few years. The '90s were a
         | blast.
         | 
         | Let's at least give _some_ credit to 3dfx Interactive for PCs
         | dethroning the 3D giants. There was a time practically everyone
         | playing Quake had a Voodoo card.
        
           | Melatonic wrote:
           | Voodoo 2 ftw
        
           | [deleted]
        
           | ChuckNorris89 wrote:
           | Pretty much.
           | 
           | With the advent of semi fabs like TSMC, STM, etc. making
           | their processes more accessible to smaller fabless
           | companies, 3dfx, PowerVR, ATI, Nvidia and other startups in
           | the 3D space back then realized they could replace all
           | those expensive discrete RISC chips SGI was using for its
           | massive 'reality engine' PCBs with a cheaper and more
           | efficient custom ASIC designed from the ground up that did
           | nearly the same things SGI's reality engine was doing
           | (triangle rendering and texture mapping were enough for a
           | PC 3D accelerator back then), but at 1/100th of the price,
           | and sell it to consumers.
           | 
           | Fast forward, and we all know how the story played out and
           | where the industry is today.
           | 
           |  _> There was a time practically everyone playing Quake had a
           | Voodoo card._
           | 
           | Yeah they were the most desirable piece of tech back then,
           | plus, the marketing and advertising 3dfx had at the time was
           | wild as hell.[1]
           | 
           | Even the box art on their GPU boxes was the most memorable
           | of any HW of that era. Anyone remember those eyes glaring
           | down on you from the store shelves?[2] I feel like this is
           | now a lost art.
           | 
           | [1] https://www.youtube.com/watch?v=1NWUqIhB04I
           | 
           | [2] https://pbs.twimg.com/media/EkiRZhLW0AQzR7g.jpg
        
             | justsomehnguy wrote:
             | > realized they could replace those expensive discrete
             | RISC chips SGI was using for their 'reality engines', and
             | instead design a custom ASIC from the ground up that does
             | nearly the same things SGI's workstations were doing
             | 
             | This one. It is amazing seeing how monstrosities of
             | full-length, full-height cards were replaced by,
             | essentially, single-chip solutions. I would recommend
             | something akin to a computer museum to see exactly how it
             | came to be.
             | 
             | I'm on mobile ATM; if someone is interested - leave a
             | reply, I will provide some links to the most interesting
             | ones.
        
               | ChuckNorris89 wrote:
               |  _> It is amazing seeing how monstrosities of full-
               | length, full-height cards were replaced by, essentially,
               | single-chip solutions._
               | 
               | Yeah, I'm surprised SGI didn't see the tides turning as
               | the ground shifted beneath them and the rest of the
               | industry leapfrogged them.
               | 
               | But that's what made SV great, young visionary companies
               | could come out of nowhere and eat the lunch of old
               | dinosaurs who lacked the vision.
        
               | jandrese wrote:
               | I did a stint as a co-op for SGI back in the late 90s.
               | What was clear at the time was that there was a flight of
               | the smart people out to the early PC graphics card
               | industry causing serious brain drain in the company. This
               | was also the time the company was making astoundingly
               | overpriced PCs that made the bad bet on RAMBUS.
               | 
               | The reality is the company suffered from the same
               | fundamental market forces that killed off most of the
               | Workstation market in the 90s. No niche company could
               | spend what Intel was spending on R&D every year so their
               | performance advantage was continually eroding, while the
               | price points for the hardware were not. Trying to
               | transition to being a PC manufacturer wasn't totally
               | crazy, but it would mean competing in a highly price-
               | competitive market, which SGI was absolutely not
               | equipped to do.
               | 
               | I had the impression that the smart people in the
               | graphics department saw that management was never going
               | to go along with their "let's build far cheaper and
               | better versions of our existing products on a PCI card
               | that you can stuff in a cheap off-the-shelf PC" plan
               | that would massively undercut the core business. So
               | they quit the company and started nVidia.
        
               | linspace wrote:
               | > that would massively undercut the core business
               | 
               | This happens a lot. The reality is that someone will do
               | it for you. Sometimes they even grow bigger than you.
        
               | npunt wrote:
               | Even Intel flirted with RAMBUS and paid for it. When I
               | was at Pandemic Studios in 99-02, we'd get lots of
               | prototype hardware from Intel and they sent us PIII's
               | with the RAMBUS-exclusive i820 chipset. The things were
               | impossible to get working stably and the RDRAM was
               | ludicrously expensive. Total dead end from the get-go,
               | and slower than AMD's stuff.
               | 
               | Intel was really on a dumb path starting in the late 90s,
               | with RAMBUS, Itanium, the P4 debacle, and missing the
               | emerging mobile market, and didn't right themselves until
               | 2006 with the Core series. But they were big enough to be
               | able to make a few mistakes unlike SGI.
        
               | ChuckNorris89 wrote:
               | _> there was a flight of the smart people out to the
               | early PC graphics card industry causing serious brain
               | drain in the company_
               | 
               | Yeah, I imagine you could count on the fingers of your
               | hands the number of people who could design 3D
               | acceleration hardware back then, so it must have been a
               | pretty exclusive club in the Bay Area at the time,
               | where everyone in this field knew each other, I can
               | only assume.
        
             | Sohcahtoa82 wrote:
             | > 3dfx came with their Glide API
             | 
             | As a teenager/young adult in the late 90s/early 00s, I was
             | so glad to see 3dfx fail. I hated how popular the Glide API
             | was since you could only run it on a 3dfx card. I had asked
             | for a 3dfx Voodoo for Christmas one year, and my dad got me
             | a Rendition Verite 2200. It supposedly had better
             | performance than a Voodoo while having a lower price, but
             | it couldn't run Glide, so couldn't play half the games I
             | wanted a Voodoo for.
             | 
              | I didn't want to feel ungrateful to my dad, so my
              | frustration was targeted at 3dfx for making a
              | proprietary API when OpenGL and Direct3D existed.
             | 
             | I eventually got a Voodoo Banshee, but by that time Glide
             | was falling out of favor.
        
               | ChuckNorris89 wrote:
               |  _> my frustration was targeted at 3dfx for making a
               | proprietary API when OpenGL and Direct3D existed_
               | 
               | Do you happen to know a fruity HW company that today
               | runs its own proprietary graphics API when the open
               | Vulkan or OpenGL exist? /s
               | 
               | All jokes aside, back then it made sense for every 3D
               | HW company to bake its own API. It wasn't just
               | gatekeeping/rent-seeking: the consumer 3D graphics
               | acceleration business was brand new and there was no
               | standardization, so nobody knew where the future was
               | heading, and they wanted full control over it as they
               | built it. Plus, they were shipping hardware before
               | Microsoft had come up with DirectX, so they needed some
               | API until then, and I assume they were afraid to touch
               | OpenGL, the API of their biggest competitor.
        
               | Sohcahtoa82 wrote:
               | > Do you happen to know a fruity HW company that today
               | runs its own proprietary graphics API when the open
               | Vulkan or OpenGL exist? /s
               | 
               | Yeah, but how many AAA games use it exclusively? Every
               | AAA game I know of is either using Vulkan, OpenGL, or
               | DirectX directly, or they're using a game engine like
               | Unity or Unreal and abstracting away the graphics API.
        
               | Nextgrid wrote:
               | To be honest, I can't think of anything that's actually
               | exclusive to the fruity company's graphics API. I mean,
               | they have no game market to begin with.
        
               | MaxBarraclough wrote:
               | Is Metal any more proprietary than Direct3D?
        
               | zozbot234 wrote:
               | Direct3D has alternate implementations as part of Proton.
               | Is there anything like that for Metal?
        
               | pjmlp wrote:
               | https://apps.apple.com/us/genre/ios-games/id6014
        
               | ______-_-______ wrote:
               | I'm sure at least 99% of those use an OpenGL-to-Metal
               | translation layer
        
           | agumonkey wrote:
           | > There was a time practically everyone playing Quake had a
           | Voodoo card.
           | 
           | those who had a voodoo + those who wanted to have one = 100%
        
         | samstave wrote:
         | Fun fact, the O2 had an 'optional' expansion port that was an
         | additional ~$1,000 or so... but the thing is, ALL the O2s had
         | this port - it was that if you paid for it, they popped the
         | plastic cover off the case to reveal the port...
        
           | bri3d wrote:
           | I'm not so sure about this; I've owned a lot of O2s.
           | 
           | The only blocking plates on the rear cover of an O2 usually
           | cover the spot where the Flat Panel Adapter or Dual Monitor
           | board goes (and it's not installed by default), or the spot
           | where a PCI card would go - which, well, is a PCI card.
        
           | p_l wrote:
           | Which expansion port are you talking about?
        
             | samstave wrote:
             | I can't recall what the port was... a serial port? But
             | it was on all O2s, and it was just a matter of knocking
             | out the plastic cover to get access to it.
        
               | spitfire wrote:
               | One of the video ports. I think the digital video port.
               | 
               | The base model supported audio and "moose cam" (a
               | webcam, in 1996!). If you paid extra you got video.
        
               | bri3d wrote:
               | Oh, this was the SDI hack, but it needed a separate port
               | expander - it was more involved than just removing a
               | plate.
               | 
               | You could buy the base AV1 "analog" video I/O card and
               | then plug an SDI expansion breakout board (I think the
               | Miranda VIVO was the most popular) into the webcam port,
               | instead of buying the much more expensive AV2 "digital"
               | video I/O card.
        
               | p_l wrote:
               | There were two video options you could buy, in analog
               | and digital versions, plus there were special parts to
               | provide alternative options for display outs (by
               | default it had an SGI-style 13W3 only).
        
         | jl6 wrote:
         | I recall CRT monitors were capable of some quite high
         | resolutions (often with a compromise to refresh rate), and it
         | took a while for LCD panels to overtake them.
        
           | bityard wrote:
           | CRTs, being analog, are theoretically capable of _any_
           | resolution. But in practice, most monitors were limited to
           | between a few and a few dozen common modes. Toward the end
           | of the CRT monitor's reign, most monitors could display
           | higher resolutions than was really practical for their
           | size, as the physical limit is the size of the "dots" that
           | make up the phosphor layer. (Which is probably not at all
           | the right terminology, because I'm not a CRT geek.)
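           | 
           | Back of the envelope, with assumed but typical numbers
           | (say a ~0.26mm aperture-grille pitch and a 19" tube with
           | ~365mm of viewable width):
           | 
           |     # both inputs are assumptions for illustration only
           |     viewable_width_mm = 365  # ~19" 4:3 tube, viewable area
           |     dot_pitch_mm = 0.26      # typical late-90s monitor
           |     print(round(viewable_width_mm / dot_pitch_mm))
           |     # => 1404 usable phosphor columns, give or take
           | 
           | So past roughly 1400 horizontal pixels, extra resolution
           | on such a tube stops adding real detail.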
           | 
           | The refresh rate compromise at higher resolutions was due to
           | the limitations of the graphics card, NOT the monitor.
           | 
           | LCDs took a little while to catch up for a few reasons:
           | 
           | 1) expense! It was hard and expensive to manufacture a
           | display containing millions of transistors with an acceptable
           | (read: profitable) failure rate.
           | 
           | 2) colors! LCDs had a reputation for extremely poor color
           | quality in the beginning. Blacks were medium gray at best and
           | primary colors all looked washed out. Today's LCDs still have
           | a hard time getting to "true black."
           | 
           | 3) ghosting! Early LCDs had poor response times. Move your
           | mouse cursor and watch it leave a trail across your screen.
           | Fine for word processing and spreadsheets. Terrible for
           | games.
        
             | dylan604 wrote:
             | >as the physical limit is the size of the "dots" that make
             | up the phosphor layer.
             | 
             | Another physical limit was the weight of the glass. The
             | larger the display, the thicker the glass got in order
             | to "lens" the beam correctly so the edges/corners were
             | straight. We had a Sony reference CRT for our film
             | transfer/color correction suite that was a 32" HD
             | monitor. MSRP was >$30k for it. (Sony's reference
             | monitors were roughly $1k per inch in pricing.) The
             | thing was stupid heavy, requiring a minimum of 2 people
             | if their names were Arnie; otherwise it'd take at least
             | 3, maybe 4, typical post house employees. All of the
             | weight was in the front.
        
             | agumonkey wrote:
             | the CRT had no frequency limit?
        
               | stjohnswarts wrote:
               | Of course they did; the scan rate of the electron guns
               | would be limited at some point. What that limit is I
               | don't know, but it was certainly finite.
        
               | agumonkey wrote:
               | I wonder which component would limit first... the gun
               | or the flyback driver.
        
               | wtallis wrote:
               | CRTs had limits on how quickly the beam could scan across
               | a line (and return to the other side of the screen for
               | the next line), which imposed a tradeoff between the
               | number of lines per frame and the number of frames per
               | second. Within a line, the number of pixels per line was
               | often limited either by the speed of the graphics card's
               | DAC or the analog bandwidth of the VGA cable. But
               | sometimes it wasn't, and you could take a monitor
               | originally intended for something like 1280x1024 and get
               | it to display 1920x1080 with acceptable sharpness, after
               | adjusting the picture height to compensate for the
               | changed aspect ratio.
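               | 
               | Rough numbers, if anyone is curious (the 1.4x blanking
               | factor is an assumed, typical VESA-era overhead, not
               | exact):
               | 
               |     # pixel rate a mode demands of the DAC and cable
               |     def pixel_clock_mhz(w, h, hz, blanking=1.4):
               |         return w * h * hz * blanking / 1e6
               | 
               |     print(pixel_clock_mhz(1280, 1024, 85))  # ~156 MHz
               |     print(pixel_clock_mhz(1920, 1080, 60))  # ~174 MHz
               | 
               | Similar pixel clocks, but the 1080p mode needs a much
               | lower line rate, which is why the swap described above
               | could work.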
        
           | zbuf wrote:
           | It certainly took a long time (mid 2010s) for panels in
           | their various guises to truly overtake CRTs for film work.
           | The reason given at my workplace was that the calibration
           | control of CRTs took time to be superseded.
           | 
           | The Sony 21" CRTs (and 24" in widscreen) had a very good run
           | -- spanning several decades. They were certainly capable of
           | higher resolutions, but the 1600x1200 was pretty much a
           | standard (or 1920x1200 on the 24" widescreen) for that entire
           | run. I can't think of any other desktop workstation
           | performance 'metric' that was stationary for so long.
        
             | aceazzameen wrote:
             | I miss my Sony Trinitron that I gave away in 2015. I
             | remember not having an "HD TV" when I first bought a PS3,
             | so I hooked it up to my Trinitron with component cables and
             | it displayed beautifully. The 1080p TV I bought at a later
             | date felt like a downgrade, despite the larger size.
        
               | sbierwagen wrote:
               | 1600x1200 is 1.92mp, while 1920x1080 is 2.07mp.
               | Technically more pixels, but juuust barely. For quite a
               | few years my setup was a 1080p panel next to an aged
               | ColorSync 20" and the difference was mostly the falloff
               | in the corners of the CRT. (And the incredible weight of
               | the CRT, of course. 78 pounds!)
        
           | paulpauper wrote:
           | I read that CRTs are still better in some respects
        
             | ridgered4 wrote:
             | Input latency, flexibility of native resolution (they
             | don't really have one). It's been a while since I've
             | looked at them side by side, but I think they're still
             | better on black levels. I suspect color reproduction has
             | caught up though.
             | 
             | It's not surprising they fell out of favor though. Once
             | the ghosting stopped being absolutely horrendous, the
             | LCD was just superior for office work: it uses less
             | power, saves tons of desk space (and due to less weight,
             | doesn't need a strong desk), and no flickering means a
             | cheapo LCD is probably easier on the eyes than a cheapo
             | CRT.
             | 
             | I bet the CRT would still be used more often if the
             | supply chain to make them hadn't fallen apart when the
             | demand disappeared.
        
             | dylan604 wrote:
             | For giving you cancer!
             | 
             | When Sony first brought out their OLED reference monitors
             | to the market, they had a demo at NAB to demonstrate the
             | various screen types. All of the monitors were the
             | equivalent reference version of that series: CRT, LCD,
             | OLED.
             | 
             | It was an interesting demo as they had the same feed going
             | to each monitor. When showing how each monitor could
             | display black, the OLED looked like it was off, the LCD was
             | just a faint shade of gray, while the CRT was much much
             | more noticeably not black from its glowing screen. I
             | started to think to myself how they might be pushing the
             | brightness on the CRT to make it look bad against the
             | others. Right as I was thinking that to myself, the
             | narrator said something to address this thought and then
             | displayed bars. All 3 monitors were correctly adjusted.
             | A gamed CRT with its brightness pushed up would have
             | been obvious at this point to anyone who knows how the
             | test pattern is meant to look.
             | 
             | The only thing I'd suggest a CRT looks better at is true
             | interlaced content.
        
         | bluedino wrote:
         | It was an Intergraph monitor, connected to an Intergraph
         | workstation. Carmack didn't use SGIs for development, but
         | they did use their servers for level processing.
        
         | [deleted]
        
         | walrus01 wrote:
         | As I recall, the SGI monitors were actually the pinnacle of
         | Sony Trinitron CRT tech, rebadged, and connected using a 13W3
         | analog video link. Similar to the very high end Sun monitors
         | at the time. As neither SGI nor Sun actually made CRTs, they
         | went with the state of the art from the world's top CRT
         | maker. Might have been some Mitsubishi Diamondtron in there
         | too.
         | 
         | In 1995 I think there were 16:9 aspect ratio Japanese model
         | TVs but I am not sure about _monitors_, might have been more
         | like a 4:3 1600x1200 display.
        
           | Melatonic wrote:
           | You could also "overclock" them to run at higher refresh
           | rates (frame rates, hz) depending on the resolution. The top
           | CRT's were blowing away LCD's for many, many years. Good
           | color, higher refresh, good dynamic range, etc etc. Those 21"
           | CRT's were massive but I held onto my second hand one for as
           | long as I could. I remember getting the VGA adapter for the
           | Dreamcast and it looked damn good on that CRT!
        
             | zozbot234 wrote:
             | Driving CRTs at bad frequencies was a common way of letting
             | all the magic smoke out, back in the day.
        
           | rjzzleep wrote:
           | I still remember buying a dirt cheap SGI monitor back in the
           | day. But it took a while before I figured out how to mod the
           | cable because of sync on green.
        
             | dylan604 wrote:
             | Yeah, component cables with 5 BNCs RGB+Hsync+Vsync. Good
             | times!!
        
               | jeffreygoesto wrote:
               | Fixed Frequency, crafting an X mode line on a cheap 14",
               | no text readable during boot, fingers crossed if X came
               | up ok, if not, swap monitors, boot with "init=/bin/bash"
               | and goto 1...
        
           | ChuckNorris89 wrote:
           | _> In 1995 I think there were 16:9 aspect ratio japanese
           | model TVs but I am not sure about monitors, might have been
           | more like a 4:3 1600x1200 display_
           | 
           | Nope, it was definitely 16:9, but I was mistaken, it was an
           | Intergraph[1], not SGI like I originally thought, but still
           | connected to an SGI workstation.
           | 
           | Just look at this beast[1]. Also, the monitor is in the photo
           | as well :)
           | 
           | [1] https://www.reddit.com/r/crtgaming/comments/gxvm99/legend
           | ary...
        
             | walrus01 wrote:
             | I can't even imagine what that might have cost, it was
             | probably sold for high-end CAD and similar...
        
           | dylan604 wrote:
           | There's a semi-easy way to spot a Trinitron if you know
           | where/how to look for the tell. There were 2 horizontal
           | lines, shadows from a bit of internal wiring, that could
           | be seen when certain images/patterns like solid colors
           | were displayed.
        
             | JohnBooty wrote:
             | There's an even easier way on older Trinitrons - the bulge
             | of the glass is different.
             | 
             | Notice how the left and right sides of the glass are
             | straight vertical lines, and only the top and bottom edges
             | are curved: https://spectrum.ieee.org/the-consumer-
             | electronics-hall-of-f...
             | 
             | Versus a standard CRT, where all four sides are curved:
             | https://fineartamerica.com/featured/vintage-tv-scott-
             | chimber...
             | 
             | Of course, the final-gen Trinitrons (circa 2000 onward)
             | had truly flat glass, so there was no tell-tale curvature
             | to look for.
             | 
             | And once Sony's patents started expiring, there were
             | competitors like Mitsubishi's Diamondtron displays with
             | glass shaped like Trinitrons. I'm not sure if they had the
             | two horizontal lines like Trinitrons.
        
             | syncsynchalt wrote:
             | I had a Sun workstation with a beast of a 19" Trinitron CRT
             | in the late '90s in my apartment (working remotely for a
             | California startup) and remember it fondly. The lines
             | you're referring to are called "damping wires" in this
             | article: https://en.wikipedia.org/wiki/Aperture_grille
        
             | valley_guy_12 wrote:
             | I remember that shadow! FWIW it was just one line in
             | smaller Trinitrons, and it was located 1/3rd of the way
             | from the top or bottom edge rather than in the middle.
             | 
             | Apple mounted the Trinitron tube upside down compared to
             | other vendors, so that the faint horizontal line would be
             | in the bottom third of the screen rather than the top
             | third.
        
           | midnightclubbed wrote:
           | The SGI monitors were 4:3 but I don't recall the resolution.
           | I do recall that they were beasts, weight of a small elephant
           | and a not dissimilar size - you needed to pull your desk away
           | from the wall to get any kind of distance to the screen.
           | 
           | Over time their timings would drift and you could never get
           | the entire screen completely sharp.
        
             | angst_ridden wrote:
             | I just finally gave away a 17" Trinitron SGI monitor when
             | cleaning my office.
             | 
             | That thing was a tank! I bought it at a CG house bankruptcy
             | sale in the 90s for $2k, which was less than half the going
             | price at the time.
             | 
             | But you're right, it weighed a ton. It did need periodic
             | degaussing. And when a communication company half a mile
             | away put in some satellite uplinks, I could see when there
             | was heavy communication traffic by a slight color shift on
             | one side.
        
             | cfn wrote:
             | And there was a button you could press to "de-magnetize"
             | the screen which made this awesome noise!
        
               | AyyWS wrote:
               | Like this?
               | 
               | https://www.youtube.com/watch?v=rSAnvHq_aFE
        
               | throwanem wrote:
               | Degaussing buttons were common on high-end CRT displays.
               | They were always that fun!
               | 
               | In the mid-2000s, I worked at a place where we had
               | several "decommissioned" Indys living a second life as
               | Apache servers for some of our hosting clients - not
               | uncommon in those days. We had one of the big 19" 4:3
               | CRTs on a KVM, too, and its weight put a noticeable if
               | graceful curve in the MDF desktop on which it stood.
        
               | lostcolony wrote:
               | Degaussing. Most (all?) CRT monitors could do that.
               | https://www.youtube.com/watch?v=PjO2vVaxIWM
        
         | agumonkey wrote:
         | I'm still stumped by Nvidia's rise and how gaming propelled
         | them into HPC/supercomputing. Who knew playing video games
         | would amplify research that much?
        
           | Sohcahtoa82 wrote:
           | Simple.
           | 
           | Rendering 3D graphics for games and the supercomputing used
           | by AI/ML/research both need the same thing: Embarrassingly
           | parallel math calculations with little or no branching in the
           | code.
           | 
           | For example, a feed-forward neural network is just a whole
           | lot of multiplication and addition. Transforming a 3D vertex
           | in space to a 2D screen coordinate is matrix multiplication,
           | which is just a whole lot of multiplication and addition. If
           | you've already designed silicon to perform those operations
           | in a single clock cycle, making it do HPC/SC rather than
           | gaming isn't that big of a switch.
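            | 
            | A minimal sketch of that per-vertex work (the projection
            | matrix values here are made up for illustration, not from
            | any real pipeline):
            | 
            |     import numpy as np
            | 
            |     # 4x4 projection matrix times a homogeneous point
            |     proj = np.array([[1.0, 0.0,  0.0,  0.0],
            |                      [0.0, 1.3,  0.0,  0.0],
            |                      [0.0, 0.0, -1.0, -0.2],
            |                      [0.0, 0.0, -1.0,  0.0]])
            |     v = np.array([1.0, 2.0, -5.0, 1.0])  # view-space pt
            |     clip = proj @ v            # multiply-accumulate
            |     ndc = clip[:3] / clip[3]   # perspective divide
            | 
            | A GPU repeats that multiply-add dance for millions of
            | vertices per frame; swap proj for a weight matrix and you
            | have a neural-network layer.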
        
             | agumonkey wrote:
             | Aight, but then why did no other company manage to
             | compete on the numerical core array? Still funny, as if
             | the wealthy gaming market funded Nvidia's venture into
             | serious computing enough to choke out the potential
             | competition.
        
               | Sohcahtoa82 wrote:
               | Silicon is expensive to design, and even more expensive
               | to build. The startup costs are crazy high. By the time
               | people realized it could be a thing, nVidia and AMD
               | already owned the market.
        
               | jandrese wrote:
               | You need huge money to develop the tech, but the HPC
               | industry is tiny. People are always impressed at these
               | million dollar machines, but only a handful are built
               | every year. There is a lot more money in gaming selling
               | millions of $200-$1000 cards every year.
        
             | zozbot234 wrote:
             | Neural networks have non-linearities. There were hacks for
             | doing some sort of general purpose computation using the
             | GPU fixed rendering pipeline, but they're not enough for
             | neural networks especially as understood today. You need
             | general shaders, which were a relatively late development.
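             | 
             | A toy illustration of that limitation: purely linear
             | stages collapse into one, so a pipeline restricted to
             | them can't get "deeper".
             | 
             |     import numpy as np
             | 
             |     W1 = np.random.rand(4, 4)
             |     W2 = np.random.rand(4, 4)
             |     x = np.random.rand(4)
             |     # two linear layers are just one linear layer
             |     assert np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)
             |     # a nonlinearity in between breaks the collapse,
             |     # which is what general shaders made cheap
             |     y = W2 @ np.maximum(0.0, W1 @ x)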
        
           | dylan604 wrote:
           | Every single teenager whose parental units asked where
           | gaming skills would come into use later in life as an
           | adult. At least they were hoping for something like that.
        
             | agumonkey wrote:
             | I bet zero 90s teenagers said "mooom, how do you expect
             | me to cure cancer if you stop me from buying nv chips ??
             | how ?"
        
               | dylan604 wrote:
               | I bet I can stack boxes in a truck better than any pre-
               | Tetris playing person before me!
        
           | ChuckNorris89 wrote:
            |  _> I'm still stumped by Nvidia's rise and how gaming
            | propelled them into HPC/supercomputing. Who knew playing
            | video games would amplify research that much?_
           | 
            | Jensen Huang did. When people think of tech visionaries
            | they think of Jobs or Musk, but Huang is just as great.
            | He's been bang on the money about the future of this
            | industry since he founded Nvidia, which is how they
            | managed to not just consistently stay ahead of their
            | competition (ATI) or put them out of business (3dfx), but
            | leapfrog them (AMD) by branching out into several fields
            | (AI/ML, PhysX, computer vision, self-driving, compute
            | etc.). He saw early on that GPUs should push into general
            | compute and not just be for video games, and he executed
            | well on that.
           | 
            | There are interviews on YouTube with Huang at Stanford
            | IIRC, where he discusses his vision of the GPU industry
            | from the early days of Nvidia. Check them out; the guy's
            | not your typical CEO suit focused on the share price,
            | he's basically a tech genius.
           | 
            | So, to answer your other question about why only Nvidia
            | managed to win compute and not the other GPU companies,
            | it's simple. Huang had the vision for the entire
            | ecosystem: from GPU chips, to drivers, to APIs and SW
            | libraries, to partnerships and cooperation with the
            | people and the companies who would use them. Building
            | great GPUs for compute is not enough if you're just gonna
            | throw them on the market without the ecosystem and
            | support behind them, and expect them to be a success.
            | That's what Nvidia gets and the rest (AMD/Intel) don't.
            | So while ATI/AMD had tunnel vision and were focused only
            | on building gaming chips, Huang was busy building a
            | complete GPU-compute ecosystem for their gaming chips
            | with the rest of the industry.
        
             | Keyframe wrote:
              | Not to take anything away from what you've said,
              | because Huang really is all that and more. Find his
              | interview with Morris Chang to get some more insight.
              | However, Nvidia, ATI and the other players were for the
              | most part seeded by ex-SGI crew. SGI had an
              | instrumental role in the companies that ate it.
        
             | pjmlp wrote:
              | The same applies to the Khronos APIs; they behave just
              | like that, throwing the APIs out there and hoping for
              | the best.
              | 
              | No wonder they are always a shadow of what the
              | proprietary ones bring in the box.
        
             | Melatonic wrote:
             | The guy really is a visionary - although I gotta say he
             | really needs to diversify his wardrobe. How long are we
             | gonna see him in the same exact look with the black leather
             | jacket?
        
             | rchiang wrote:
              | To be fair, Nvidia hired a lot of people who at some
              | point worked at SGI and 3dfx, so there was already a
              | lot of HPC/server/supercomputing talent working there.
              | There are articles (e.g.
              | https://www.extremetech.com/gaming/239078-ten-years-ago-
              | toda...) showing the transition from the fixed
              | vertex/pixel shaders of the GeForce 7 series to the
              | generalized stream processors of the GeForce 8 series
              | going forward.
             | 
             | And it's easy to see that Huang and Nvidia put their money
             | where their mouth was. The first GTC was 2009. That's 3
             | years before the famous AlexNet paper that's often credited
             | with kicking off the current AI on GPUs trend.
        
         | [deleted]
        
         | randomifcpfan wrote:
         | You're in luck in that the development of Quake is pretty well
         | documented. Both through a book and Carmack's plan files
         | 
         | https://github.com/oliverbenns/john-carmack-plan
         | 
         | https://en.m.wikipedia.org/wiki/Masters_of_Doom
        
         | zeagle wrote:
         | This makes me really nostalgic for being introduced to SGI
         | dogfight multiplayer on the local LAN I had occasional access
         | to when visiting a family member's employer. It really felt
         | revolutionary for the day compared to the Pentium 1 or
         | whatever I had access to at home!
        
       | xedarius wrote:
       | I remember being on a stand next to SGI at E3 in 1997. They had a
       | giant black truck in the arena like the one that Knight Rider
       | drove into. They were selling these machines that looked way more
       | powerful and expensive than anything the games industry could
       | afford. People at the show were mainly debating when and if Intel
       | could release a 1GHz processor. Strange what you remember.
        
       | georgewsinger wrote:
       | It was a common practice in the 90s for creative engineers to use
       | extremely expensive "supercomputer" workstations to pay for
       | productivity gains & live on the bleeding edge (e.g. Silicon
       | Graphics workstations, NeXT workstations, and so forth).
       | Question: What is the equivalent way to do this today? That is,
       | is there a way to pay a lot of money to use a computer which is
       | 5-10 years ahead of its time?
       | 
       | Ok so I'm pretty biased here, but I think the answer lies in VR
       | computing. There's no doubt VR computers are more expensive than
       | their PC/laptop counterparts, but they allow you to adopt a
       | bleeding edge technology which is essentially 5+ years ahead of
       | its time in terms of where it is on the "commodity computing"
       | frontier.
       | 
       | A good quote from Alan Kay I find pretty inspirational on this
       | front: https://youtu.be/id1WShzzMCQ?t=3345 Here he basically
       | advocates for spending hundreds of thousands of dollars on a
       | computing machine, in order to compute on what will be a
       | commodity product 10-15 years into the future. VR computers
       | aren't this extreme on the cost curve, but I think there is
       | something to this point of view which I find really
       | inspirational.[1]
       | 
       | [1] Caveat: I'm one of the founders of SimulaVR
       | (https://simulavr.com), so admittedly am very biased here. But I
       | do think VR provides a way to convert money into "better compute"
       | in a way that hasn't been available since the 70s-90s super
       | workstation era.
        
         | Keyframe wrote:
         | You could shell out for Nvidia's DGX A100. Though it's not
         | as far ahead of current off-the-shelf stuff as those
         | machines were back then.
        
         | arapacana wrote:
         | I respect the smoothness and honesty of your plug.
        
         | wongarsu wrote:
         | I think today's equivalent is more like running a desktop
         | with a 64-core CPU and 128 GB of RAM (or more), with as many
         | screens as you want (I don't find more than two useful, but
         | preferences vary).
         | 
         | I could see VR computing as the mobile version of that. Laptops
         | can be pretty powerful, but their screens are limiting. VR
         | gives you the screen real estate without requiring you to set
         | up monitors where you are.
        
       | smm11 wrote:
       | Commodity hardware and good video cards crushed everything.
        
       | atum47 wrote:
       | I was strolling in the mall when I noticed a really cool movie on
       | a TV inside a random store; it was FFVII. Damn, I was like, this
       | is the best 3D I have ever seen. By the way, I had never played
       | the game at that point, none of the franchise. I kinda got
       | familiar with the lore just so I could watch the movie.
       | Excellent job.
        
       | [deleted]
        
       | trollski wrote:
        
       | mikehotel wrote:
       | In case this article piques your interest to try out a modern
       | version of IRIX: https://docs.maxxinteractive.com/
        
       ___________________________________________________________________
       (page generated 2022-04-07 23:00 UTC)