[HN Gopher] The impact of competition and DeepSeek on Nvidia
___________________________________________________________________
The impact of competition and DeepSeek on Nvidia
Author : eigenvalue
Score : 48 points
Date   : 2025-01-25 15:30 UTC (1 day ago)
(HTM) web link (youtubetranscriptoptimizer.com)
(TXT) w3m dump (youtubetranscriptoptimizer.com)
| eigenvalue wrote:
| Yesterday I wrote up all my thoughts on whether NVDA stock is
| finally a decent short (or at least not a good thing to own at
| this point). I'm a huge bull when it comes to the power and
| potential of AI, but there are just too many forces arrayed
| against them to sustain supernormal profits.
|
| Anyway, I hope people here find it interesting to read, and I
| welcome any debate or discussion about my arguments.
| zippyman55 wrote:
| So at some point we will have too many cannon ball polishing
| factories and it will become apparent the cannon ball trajectory
| is not easily improved on.
| j7ake wrote:
| This was an amazing summary of the landscape of ML currently.
|
| I think the title does the article an injustice, or maybe it's
| too long for people to read far enough to appreciate it (e.g. the
| DeepSeek stuff could be an article in itself).
|
| Whatever the case, those with longer attention spans will
| benefit from this read.
|
| Thanks for summarising this!
| eigenvalue wrote:
| Thanks! I was a bit disappointed that no one saw it on HN
| because I think they'd like it a lot.
| j7ake wrote:
| I think they would like it a lot, but I think the title
| doesn't match the content, and it takes too much reading
| before one realises it goes beyond the title.
|
| Keep it up!
| dang wrote:
| We've changed the title to a different one suggested by the
| author.
| diesel4 wrote:
| Link isn't working. Is there another or a cached version?
| eigenvalue wrote:
| Try again! Just rebooted the server since it's going viral now.
| kevinventullo wrote:
| I'm getting a 502
| OutOfHere wrote:
| It seems like a pointless discussion since DeepSeek uses Nvidia
| GPUs after all.
| jjeaff wrote:
| it uses a fraction of the GPUs, though.
| breadwinner wrote:
| As it says in the article, you are talking about a mere
| constant of proportionality, a single multiple. When you're
| dealing with an exponential growth curve, that stuff gets
| washed out so quickly that it doesn't end up mattering all
| that much.
|
| Keep in mind that the goal everyone is driving towards is
| AGI, not simply an incremental improvement over the latest
| model from OpenAI.
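The constant-vs-exponential point above can be made concrete with a back-of-the-envelope calculation. This is a hedged sketch: the doubling time and the 10x efficiency factor below are illustrative assumptions, not figures from the article or the thread.

```python
import math

# Assume compute demand grows exponentially: C(t) = C0 * exp(r * t).
# A one-time constant efficiency gain of factor k (e.g. a DeepSeek-style
# 10x reduction in training cost) only shifts when any given compute
# level is reached, by dt = ln(k) / r. It does not bend the curve.
r = math.log(2)       # assumed growth rate: demand doubles every year
k = 10                # assumed one-time 10x efficiency improvement
dt = math.log(k) / r  # delay in years before the curve catches back up
print(f"A {k}x efficiency gain buys about {dt:.2f} years")
```

Under these assumptions the gain buys roughly three years, after which demand is back where it would have been, which is the sense in which a constant multiple "washes out" against exponential growth.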
| ithkuil wrote:
| Which due to the Jevons Paradox may ultimately cause more
| shovels to be sold
| cma wrote:
| Their loss curve with the RL didn't level off much, though; it
| could be taken a lot further and scaled up to more parameters
| on the big Nvidia mega-clusters out there. And the architecture
| is heavily tuned to Nvidia optimizations.
| arcanus wrote:
| > Amazon gets a lot of flak for totally bungling their internal
| AI model development, squandering massive amounts of internal
| compute resources on models that ultimately are not competitive,
| but the custom silicon is another matter
|
| Juicy. Anyone have a link or context to this? I'd not heard of
| this reception to NOVA and related.
| snowmaker wrote:
| This is an excellent article, basically a patio11 / matt levine
| level breakdown of what's happening with the GPU market.
| eprparadox wrote:
| link seems to be dead... is this article still up somewhere?
___________________________________________________________________
(page generated 2025-01-26 23:01 UTC)