[HN Gopher] OpenAI's hunger for computing power
       ___________________________________________________________________
        
       OpenAI's hunger for computing power
        
       Author : doener
       Score  : 99 points
       Date   : 2025-10-04 22:14 UTC (1 day ago)
        
 (HTM) web link (www.wsj.com)
 (TXT) w3m dump (www.wsj.com)
        
       | neonate wrote:
       | https://archive.md/OnsLK
        
       | vineyardmike wrote:
       | What is their angle with this?
       | 
       | Surely SamA doesn't actually think that they'll more than 20x
       | their compute in a few years? I'm sure the researchers there
       | would love to do more research, with more compute, faster, but
       | 20+x growth is not a practical expectation.
       | 
        | Is the goal here to create a mad rush to build data centers,
        | which would decrease their costs through greater supply? Do they
        | just want governments to step in and help somehow? Is it part of
        | marketing/hype? Is this trying to project confidence to investors
        | about future revenue expectations?
        
         | refulgentis wrote:
         | > Surely SamA doesn't actually think that they'll more than 20x
         | their compute in a few years?
         | 
          | He does, or at least he believes that if it's plausible they
          | should attempt it.
          | 
          | We live in odd times. It sort of reminds me of Feb 2020: all
          | you really needed to know was the Rt, and the rest was just
          | math. Who knows if it'll matter or pencil out in a decade, but
          | it's completely reasonable at these growth rates and with the
          | iron laws that are known to keep scaling.
        
         | p1esk wrote:
         | _Surely SamA doesn't actually think that they'll more than 20x
         | their compute in a few years?_
         | 
          | If their goal is to train, say, a 100T model on the whole
          | YouTube dataset, they will need 20,000x more compute. And that
          | would be my goal if I were him.
        
           | bix6 wrote:
           | Why 20000x more compute? I thought they were at approx 1T
           | with current compute?
           | 
           | Edit: looked it up. 10k+ times more for training compute.
           | Sheesh. Get the Dyson sphere ready lol.
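
            A rough back-of-envelope sketch of where a multiplier in that
            range can come from, using the common C ~ 6 * N * D
            training-FLOPs rule of thumb. The parameter and token counts
            below are illustrative assumptions, not figures from the
            article or the thread:

                def train_flops(params, tokens):
                    # Approximate training compute: C ~ 6 * N * D FLOPs
                    return 6 * params * tokens

                current = train_flops(1e12, 15e12)   # ~1T params, ~15T text tokens
                future = train_flops(100e12, 3e15)   # 100T params, video-scale corpus
                print(f"{future / current:,.0f}x")   # -> 20,000x

            Change the assumed sizes and the answer moves by orders of
            magnitude; the multiplier is driven almost entirely by how
            large a model and corpus you posit.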
        
             | esafak wrote:
             | https://www.youtube.com/watch?v=Dy6Dw9rOAFQ
        
               | indolering wrote:
               | Now I'm more on the side of him being delusional.
        
               | zeroq wrote:
               | "At this point I think I know more about manufacturing
               | than anyone currently alive on Earth."
               | 
                | It's like that dumbass at your work who thinks that,
                | solely because he landed a job in his early 20s that
                | pays him more than his parents ever made combined, he
                | can school everyone on every topic imaginable, from
                | nutrition to religion.
                | 
                | He and Elon make way more than that dumbass, so the egos
                | get inflated even more.
                | 
                | I don't especially like Tucker Carlson, but I think the
                | more screen time we give these people with an open mic,
                | the better: everyone gets first-hand experience of how
                | detached from reality they are.
        
               | johnnienaked wrote:
               | Absolutely right, and it's ubiquitous across
               | organizations too.
               | 
               | I've never met an executive I respect. They're all
               | absolute experts at _appearing_ competent.
        
               | aswegs8 wrote:
               | I mean, they're selected for it so that's not surprising
        
               | johnnienaked wrote:
               | I guess the surprising part is that appearing competent
               | is more important to shareholders than being competent
        
               | estimator7292 wrote:
               | Actually checking if someone _is_ competent requires
               | actual work, though. Work is for _lesser_ people.
               | Shareholders just _know_ if a person is or is not
                | competent, that's why they have so much money, right?
        
               | blitzar wrote:
               | It is better to remain silent and be thought a fool than
               | to open your mouth and remove all doubt
        
               | IT4MD wrote:
               | Thanks for the example.
        
               | b00ty4breakfast wrote:
               | I can never tell if these guys have come to genuinely
               | love the smell of their own farts or if they're just
                | constantly in _sales mode_. Like maybe all those hours in
                | meetings with investors and shareholders or whatever have
                | gotten them stuck, like your mom used to warn you about
                | when you'd make faces at your little brother.
        
               | llbbdd wrote:
               | When your job is to constantly be making the pitch for
               | your company, and you live in a world where every
               | conversation you have can be news before the end of the
               | day, the mask can never come off.
        
               | dmbche wrote:
                | If they know it won't bring in revenue, they can't get
                | out of "sales mode," because when the runway runs out
                | they get left out. It's like musical chairs with one
                | chair left: you want to keep the game going if you don't
                | think you can get the chair. And you're filthy rich as
                | long as the game keeps going.
        
             | kadushka wrote:
              | Mainly because the global video corpus is >100k times
              | larger than the global text corpus, so you will need to
              | train much larger models for much longer (than current
              | LLMs).
        
           | Drblessing wrote:
           | That would be awesome.
        
           | zingerlio wrote:
            | The AI bubble bursts when he stumbles in raising that money.
        
         | indolering wrote:
          | In the early days of Bitcoin, we would argue about security
          | models and laugh about Bitcoin mining taking some significant
          | percentage of the global power supply. It's been hovering
          | around 1% for a while now despite the supply falling off.
          | 
          | I wouldn't put bets on what the outer limits for AI are going
          | to be. However, it's a huge productivity boost across a broad
          | range of workflows. Models are still making large gains as
          | they become more sophisticated.
         | 
         | If I had Sam Altman's access to other people's capital, I would
         | be making large bets that it will keep growing.
        
         | skywhopper wrote:
         | He needs 11 figures of cash injected as soon as possible. The
         | people who can give it want a big return. Given the current
         | losses, the only way to make the math right is to lie
         | outrageously about what's possible.
        
           | karmakurtisaani wrote:
           | He's been kicking this can for years now. Looking forward to
           | the day he's forced to stop.
        
         | pram wrote:
          | Perceptually, it helps to take scrutiny off the current spend?
          | It isn't a bubble if you can just scoff at $100 billion and
          | say, "that's pocket change, this will actually require 10
          | quadrillion dollars!!"
        
         | ants_everywhere wrote:
          | If they want to survive they need to outrun Google and have a
          | differentiated service. As of now it's not clear that OpenAI
          | will have a reason to exist in the near future alongside
          | Anthropic and Google.
          | 
          | They're likely betting on either training a model so big it
          | can't be ignored, or focusing more on B2B, which means lots of
          | compute to resell.
        
           | CuriouslyC wrote:
            | If their plan were to go toe to toe with Google as a
            | foundation model/inference provider, they would 100% be
            | getting ground to dust; that's not a winnable fight. There's
            | a reason they've pivoted to product and retained Jony Ive.
            | 
            | Anthropic gets a TON of hate on social media, their models
            | are fragile, their infra is poorly managed, and they're 100%
            | going to end up in Jeff's pocket. OpenAI is a survivor.
        
         | arthurofbabylon wrote:
         | The phrase you are looking for is "commodifying the periphery."
         | As adjacent bottlenecks open up, the bottleneck you control
         | becomes more valuable.
        
         | tim333 wrote:
         | My guess:
         | 
          | Altman figures AI will be a big deal and constrained by
          | available compute.
         | 
         | If he locks in all the available compute and related finance
         | before the competition then he's locked in as the #1 AI
         | company.
         | 
         | I'm not sure 20x or 5x or 40x matters, nor revenue
         | expectations, so much as being ahead of the competition.
        
         | jgalt212 wrote:
         | > What is their angle with this?
         | 
         | My pet theory: Sam makes more money when OpenAI spends than
         | when OpenAI earns.
        
         | Szpadel wrote:
          | I believe it is because, with RL, you are no longer limited by
          | training data, since you generate it on the fly during
          | learning, so benchmark-driven learning could scale with
          | compute.
          | 
          | They also seem to assume that everyone will use AI from them
          | in the future, especially with the new "Pulse" combined with
          | ads. Scaling this will need much more compute.
          | 
          | Is this reasonable? I'm not convinced, but I believe this is
          | their reasoning.
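
          A minimal toy sketch of the "generate training data on the fly"
          idea behind verifier-driven RL. The problem generator, stand-in
          model, and acceptance rate below are invented for illustration,
          not anything OpenAI has described:

              import random

              def make_problem(rng):
                  # Invent a fresh problem instead of reading a corpus.
                  a, b = rng.randint(1, 99), rng.randint(1, 99)
                  return f"{a}+{b}"

              def model_attempt(problem, rng):
                  # Stand-in policy: answers correctly ~70% of the time.
                  a, b = map(int, problem.split("+"))
                  if rng.random() < 0.7:
                      return a + b
                  return a + b + rng.choice([-1, 1])

              def verify(problem, answer):
                  # Programmatic check stands in for a labeled dataset.
                  a, b = map(int, problem.split("+"))
                  return answer == a + b

              rng = random.Random(0)
              accepted = []
              for _ in range(1000):          # more compute -> more attempts
                  problem = make_problem(rng)
                  answer = model_attempt(problem, rng)
                  if verify(problem, answer):
                      accepted.append((problem, answer))  # feeds the update
              print(f"{len(accepted)} verified examples generated")

          The usable training data here is bounded only by how many
          attempts you can afford to generate and verify, which is why
          this style of training scales with compute rather than with the
          size of a fixed corpus.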
        
         | Tubelord wrote:
          | Pascal's wager applied to tech cycles. The fervent adherence
          | to the hype is akin to religious zealotry in many ways.
        
       | lemonlearnings wrote:
       | https://archive.is/OnsLK
        
       | lemonlearnings wrote:
        | Too big to fail is the goal. If the world is powered by OpenAI
        | but it ain't making a profit in 2028, they can just put on their
        | "we're a utility like water" facemask and get bailed out.
        
         | kortilla wrote:
         | That's a fun trope but it's a terrible outcome for
         | shareholders.
        
           | jacquesm wrote:
           | Good.
        
           | avs733 wrote:
           | Which means it will be made into a terrible outcome for
           | everyone.
        
           | throwaway290 wrote:
           | shareholders like a business that can never fail...
        
           | lemonlearnings wrote:
            | Less terrible than being allowed to go bust, though.
        
         | stevenwoo wrote:
          | At least in the USA, I think if consumers realized that their
          | power bills going up every year is tied to these new data
          | centers, there would be more political opposition to new data
          | centers. https://apnews.com/article/electricity-prices-
          | data-centers-a... I don't know if the electricity markets work
          | differently in other countries.
        
           | MountDoom wrote:
           | Taxpayers subsidize data centers in many other ways. These
           | are prestige projects for politicians, so they often get
           | long-term tax breaks and other preferential treatment.
           | 
           | I think it's part vanity, part a misunderstanding about the
           | economic benefits of a datacenter (which are nearly nil, as
           | they employ very few people and produce nothing for the local
           | market), and part just a desire to score brownie points with
           | wealthy corporations, which might mean donations, campaign
           | support, or other perks for the politician who makes it
           | happen.
        
           | noosphr wrote:
            | Power bills are going up because the US, and the West in
            | general, bet on renewables and a low-energy future.
            | 
            | Neither of those things turns out to be a good fit for the
            | new economy. The only thing left for people who derided
            | nuclear for the last 40 years is to hope this is a bubble
            | that sends us back to the 17th century when it pops.
            | Anything else means we have to invest trillions in nuclear
            | right now.
        
             | Drblessing wrote:
             | The amount of money being invested in AI should've been
             | invested into nuclear, both fusion and fission. The AI
             | bubble will burst, but the energy bubble never bursts.
        
             | DavidPiper wrote:
             | Genuine question from a non-American: What is "the new
             | economy"?
        
               | noosphr wrote:
               | Malthusianism for computers.
               | 
               | Moore's law is dead. The only way to increase compute is
               | to increase the power we feed to computers. AI is just
               | the shiniest example. Everything else will follow suit
               | until electricity costs increase enough that it doesn't
               | make sense to throw any more computation at it.
               | 
               | Any country that doesn't have energy to spare will be in
               | the position of countries which didn't have food to
               | support armies before the industrial revolution.
        
               | fooker wrote:
               | Interesting point. I can see this could turn out to be
               | true.
               | 
                | If we needed, for example, 1000 TWh to power AI for a
                | huge drone swarm but could not do it while China could,
                | this would be problematic.
               | 
               | It requires a future where MAD with nuclear weapons is
               | obsolete though, with some futuristic new missile defense
               | tech. I don't see that happening until some currently
               | unknown physics makes it possible.
        
             | jimbob45 wrote:
             | What's your beef with solar? Both parties seem to like it
             | just fine, despite it not covering as much of the total
             | demand as anyone would like.
        
               | noosphr wrote:
               | Both parties like it better because it turns the
               | electricity market into another casino that lets you take
               | billions from the parts of the economy that do things.
               | 
                | I worked as a quant in the electricity market. There
                | wasn't a single dataset I saw where more renewables
                | resulted in lower total costs for consumers.
        
           | omcnoe wrote:
            | The US needs a sufficient energy surplus to power industry.
            | US energy production has been essentially flat for the past
            | 25 years and the country has forgotten how to bring new
            | capacity online. Chinese energy production is up over 6x
            | over the same period. China has more clean-energy generation
            | capacity today than its entire generation capacity a little
            | over a decade ago.
           | 
           | Instead of panicking about data center electricity usage we
           | need to be worrying about getting back to a state where we
           | regularly bring new (clean) generation capacity online.
        
           | fooker wrote:
            | It's correlated with the data centers, not tied to them.
            | That's an important difference.
            | 
            | We could easily build, say, 10 nuclear reactors and halve
            | the utility bills with amortization.
        
           | chermi wrote:
           | That's not the main problem. That's the convenient scapegoat
           | so we don't get mad about the real problem. Power bills have
           | been going up for years. We're just not good at generating
           | and serving sufficient energy. Our grid sucks, our utilities
           | suck and can do whatever they want, and we can't build
            | anything. And the grid problems get worse as we add
            | renewables, because operators have to manage more complex
            | generation profiles. (I'm all for renewables, love solar.)
        
       | nakamoto_damacy wrote:
       | [flagged]
        
         | nebula8804 wrote:
          | It's cheap because you aren't paying the full cost; you are
          | externalizing some of the costs onto others.
         | 
         | [1]:https://www.youtube.com/shorts/jNFemZpMadU
        
       | Drblessing wrote:
        | The AI-powered TikTok competitor is not going to be cheap on
        | compute.
        
       | banandys wrote:
        | I mean, yeah, we all saw the video of him stealing GPUs and
        | getting arrested.
        
       | bgwalter wrote:
       | 40% of global DRAM output:
       | 
       | https://www.tomshardware.com/pc-components/dram/openais-star...
       | 
       | All for creating worthless TikTok videos with Sora 2, while we
       | don't get graphics cards and DRAM and our electricity prices
       | rise.
       | 
       | Trump will get another "win" for "his" Stargate project. The
       | meeting with South Korean President Lee Jae Myung was NOT
        | arranged by Altman; he is the messenger boy:
       | 
       | https://www.reuters.com/business/media-telecom/samsung-sk-hy...
        
         | johnnienaked wrote:
         | And our water runs out, and we pollute and destroy the planet
         | past the point of no return.
         | 
         | AI will fix it though?
        
           | logtempo wrote:
            | Hey, at least you'll be able to add that exterminated tiger
            | species to your postcard from your last adventure trip. And
            | more water to that river, with some greener trees, etc.
            | 
            | All of that without leaving your home, of course.
        
           | panta wrote:
            | Of course. According to Andreessen, if you are not
            | optimistic and you worry about the environment, you are an
            | "enemy" of the bright future ahead (while at the same time
            | he puts Nick Land on the list of "Saints"). These people are
            | deranged psychopaths; why are we leaving them at the wheel?
        
             | johnnienaked wrote:
             | Sounds about right coming from the guy who plagiarized
             | university research to make his billions
        
       | Incipient wrote:
       | Is it just me, or does this extreme demand for compute imply
       | they've realised the core tech is mostly stagnant, and needs
       | compute to possibly scale towards anything AGI-like? (however
       | unlikely that is).
        
         | general1465 wrote:
          | I see it the same way. It is like an automotive manufacturer
          | who keeps using bigger engines with more and more pistons,
          | wondering how much bigger an engine the next model iteration
          | will need to go faster, and how many iterations it will take
          | to finally break the sound barrier. Meanwhile, the product
          | looks like a school-bus box on wheels that is going to rip
          | itself apart long before it ever reaches the sound barrier.
        
         | adventured wrote:
         | They're substantially tied down by demand/usage right now.
         | 
         | How much more compute would they need to allow all of their
         | paying users unlimited access to their best model? And to
         | enable that setup in such a way that it's actually very fast.
         | 
         | The answer: they need resources far beyond what they have now.
         | That's just to solve an existing problem.
         | 
         | Then throw in Sora 4 and whatever else will exist in a few
         | years, and the need to feed that monster. They couldn't come
         | close to allowing Sora 2 unlimited for all of their paying
         | customers - I'd hate to see what that would require.
         | 
          | Then let the AI begin world-building for every user (which is
          | where this is going). It'll be decades before the resource
          | demands actually flatten globally (at least 20-30 years, to
          | get to global population saturation on usage; assuming
          | population growth will begin to rapidly slow and then
          | reverse).
          | 
          | Hint: the solution to Fermi's Paradox is that we go into the
          | box and we never come back out, because it's a lot more
          | interesting for 99.9%+ of humanity than a bunch of repeating
          | rocks in space that take a zillion years to reach. The core
          | purpose of AI will be to world-build for us mentally
          | (relaxation, stimulation, entertainment, social), and in we
          | go: the end. The same thing happens to any advanced beings
          | that get to our level. There's little to nothing of interest
          | out there that we can reach (and no, it doesn't matter if the
          | HN crowd disagrees; what matters for this outcome are the
          | masses), we'll definitely figure that out, and there's
          | infinite stimulation/experience in the machine world by
          | contrast.
        
       | buyucu wrote:
       | Why does OpenAI need so much more compute than everybody else?
       | DeepSeek, Qwen and many others build competitive models that need
       | much less capital.
        
         | credit_guy wrote:
          | Most likely OpenAI has models at least as efficient as DeepSeek
          | or Qwen. Cerebras offers both GPT-OSS-120B and
          | Qwen3-235B-Instruct. Obviously, the second has twice as many
          | parameters as the first, but that's the closest comparison I
          | can find. The Qwen model is twice as large, but roughly twice
          | as slow (1400 tokens/second vs 3000) and 60% more expensive
          | ($1.2 per million tokens vs $0.75). Now, OpenAI is running a
          | proprietary model, and most likely it is much more optimized
          | than the free version they release for public use.
         | 
         | [1] https://inference-docs.cerebras.ai/models/overview
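
          A quick check of the ratios implied by those quoted figures (a
          sketch using only the numbers in the comment above; the listed
          speeds and prices may have changed since):

              qwen_tps, gpt_tps = 1400, 3000       # tokens per second
              qwen_price, gpt_price = 1.20, 0.75   # $ per million tokens

              print(f"speed gap: {gpt_tps / qwen_tps:.2f}x")             # ~2.14x
              print(f"price premium: {qwen_price / gpt_price - 1:.0%}")  # ~60%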
        
           | buyucu wrote:
            | Inference is not the main cost driver; training and research
            | are.
        
             | credit_guy wrote:
             | I'm not sure that's still the case. It used to be the case,
             | but I doubt it continues to be. OpenAI had $6.7 BN costs
             | for the first half of 2025. I doubt they spent $3 BN in
             | training and research. They have 700 million weekly users,
             | and many of these users are really heavy users. Just taking
             | myself: I probably consumed a few million tokens with
             | GPT-5-Codex in the last 3 days alone. I am a heavy user,
             | but I think there are users who burn through hundreds of
             | times more tokens than me.
        
         | yorwba wrote:
         | Chinese companies need to pay much higher prices for the same
         | GPUs, so they would need to charge more to make a profit, but
         | it's difficult to charge more unless they have a much better
         | product. So building massive data centers to gain market share
         | is riskier for them.
         | 
         | That said, Alibaba not releasing the weights for Qwen3-Max and
         | announcing $53 billion in AI infrastructure spending
         | https://www.reuters.com/world/china/alibaba-launches-qwen3-m...
         | suggests that they think they're now at a point where it makes
         | sense to scale up. (The Reuters article mentions data centers
         | in several countries, which I assume also helps work around
         | high GPU prices in China.)
         | 
         | Circling back to OpenAI: I don't think they're spending so much
         | on infrastructure just because they want to train bigger models
          | on more data, but more so because they want to serve those
         | bigger models to more customers using their services more
         | intensively.
        
         | lvl155 wrote:
         | They're trying to lock out competition from accessing compute.
        
       ___________________________________________________________________
       (page generated 2025-10-05 23:01 UTC)