Post ATzkVBch5umDXpbTxQ by niplav@schelling.pt
 (DIR) Post #ATzeWosK2g45qbJDkm by jarbus@fosstodon.org
       2023-03-26T02:03:14Z
       
       0 likes, 0 repeats
       
       While AI just had its “iPhone” moment it’s important to remember there was only a single iPhone moment, followed by years of mediocre incremental improvements. I wouldn’t be surprised if GPT-X doesn’t get too much better than this.
       
 (DIR) Post #ATzeWpnOcTichc0niK by niplav@schelling.pt
       2023-03-26T02:25:12Z
       
       0 likes, 0 repeats
       
       @jarbus I would. Any concrete things you might want to bet on? E.g. a thing you predict no AI system will be able to do by the end of 2024.
       
 (DIR) Post #ATziWQmo0JXexAnaXw by jarbus@fosstodon.org
       2023-03-26T03:09:56Z
       
       0 likes, 0 repeats
       
       @niplav @jarbus I think it’s hard to predict, because these things will go from solving 90% of a problem to 95%, to 95.5%, etc. I don’t think anyone knows for sure what edge cases will be solvable at 95% vs 99.9%, just like I wouldn’t claim to know what specific roads a Tesla can and can’t drive on autopilot successfully. It’s a bit of a cop-out, but I don’t think humans understand the nature of the problems we are trying to solve in the first place.
       
 (DIR) Post #ATzkVBch5umDXpbTxQ by niplav@schelling.pt
       2023-03-26T03:32:08Z
       
       0 likes, 0 repeats
       
@jarbus Nah, it's valid; making precise predictions is super hard if you don't wanna get caught up in technicalities. I continue to claim that while you mightn't admit it in 1½ years, you'll have been surprised by the rate of progress ;-)
       
 (DIR) Post #ATzmKYgdCjR1nWVNSq by jarbus@fosstodon.org
       2023-03-26T03:52:38Z
       
       0 likes, 0 repeats
       
       @niplav I’ll probably still have a yearly panic attack whenever I see how good the new GPT is lol. But I can’t imagine seeing another increase similar to that from gpt3 to gpt4 in a single generation. But maybe the edge cases and slight mistakes really will go away with more compute. Nobody knows!