[HN Gopher] How Is LLaMa.cpp Possible?
___________________________________________________________________
How Is LLaMa.cpp Possible?
Author : birriel
Score : 64 points
Date : 2023-08-15 22:18 UTC (41 minutes ago)
(HTM) web link (finbarr.ca)
(TXT) w3m dump (finbarr.ca)
| Havoc wrote:
| What I find more stunning is what this implies going forward.
| If tech advances as it tends to, then a 200bn-parameter model
| fitting into consumer hardware isn't that far away.
|
| It might not be AGI, but, cliched as it is, I think that would
| "change everything". If not at 200bn then at 400bn or whatever;
| it doesn't matter, the direction of travel seems certain.
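The feasibility claim above is at bottom a memory-footprint calculation: can the quantized weights fit in consumer RAM/VRAM? A minimal sketch of that arithmetic (the quantization levels are illustrative assumptions, not figures from the thread; KV cache and activation overhead are ignored):

```python
# Rough weight-only memory footprint of a transformer model at
# various quantization levels. Illustrative back-of-the-envelope
# numbers only; ignores KV cache and activation memory.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """GB (1e9 bytes) needed just to hold the weights."""
    return n_params * bits_per_weight / 8 / 1e9

for params in (7e9, 70e9, 200e9):
    for bits in (16, 8, 4):
        gb = model_size_gb(params, bits)
        print(f"{params / 1e9:>4.0f}B params @ {bits:>2}-bit: {gb:7.1f} GB")
```

At 4-bit quantization a 200bn-parameter model is about 100 GB of weights, so it is out of reach of today's consumer GPUs but within sight of high-RAM desktops, which is roughly the trajectory the comment is pointing at.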
| gct wrote:
| That's basically Ray Kurzweil's argument; he's been saying for
| decades that $1000 worth of compute will match human performance
| around 2029.
| csjh wrote:
| IMO the direction we're going seems more like a few small models
| in a MoE (mixture of experts) that together are equivalent to a
| current 200bn-parameter model.
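The MoE idea above is that a gating network routes each input to only a few small expert networks, so compute per token scales with the number of active experts rather than total parameters. A toy top-k gating sketch (all names, sizes, and the renormalization choice are illustrative assumptions, not any particular model's design):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, gate_weights, experts, top_k=2):
    """Route input x to the top_k experts by gate score and
    return the gate-weighted sum of their outputs."""
    # One gate score per expert: dot product of x with a gate row.
    scores = [sum(xi * wi for xi, wi in zip(x, w)) for w in gate_weights]
    probs = softmax(scores)
    # Keep only the top_k experts and renormalize their weights.
    chosen = sorted(range(len(experts)),
                    key=lambda i: probs[i], reverse=True)[:top_k]
    total = sum(probs[i] for i in chosen)
    out = [0.0] * len(x)
    for i in chosen:
        y = experts[i](x)      # only the chosen experts run
        w = probs[i] / total   # renormalized gate weight
        out = [o + w * yi for o, yi in zip(out, y)]
    return out

# Toy demo: four "experts" that each just scale the input.
experts = [lambda x, s=s: [s * xi for xi in x]
           for s in (1.0, 2.0, 3.0, 4.0)]
gate_weights = [[1, 0], [0, 1], [1, 1], [-1, -1]]
y = moe_forward([0.5, 0.5], gate_weights, experts, top_k=2)
```

Because only `top_k` experts execute per input, a model can hold many experts' worth of parameters while paying the compute cost of a much smaller dense model, which is the trade-off the comment alludes to.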
| TMWNN wrote:
| Bah. We still haven't equaled the rude and hateful AI achieved in
| a microcomputer in 1981. <https://scp-wiki.wikidot.com/scp-079>
| RosanaAnaDana wrote:
| We can keep reaching for that rainbow.
| __loam wrote:
| It will be interesting to see what people can do with local
| models, particularly for open-source programming tools and PCG
| (procedural content generation) models for video games.
___________________________________________________________________
(page generated 2023-08-15 23:00 UTC)