[HN Gopher] OpenAI created a team to control 'superintelligent' ...
___________________________________________________________________
OpenAI created a team to control 'superintelligent' AI - then let
it wither
Author : doener
Score : 13 points
Date : 2024-05-18 22:31 UTC (28 minutes ago)
(HTM) web link (techcrunch.com)
(TXT) w3m dump (techcrunch.com)
| andy99 wrote:
| Luckily that doesn't exist. It's about equivalent to spinning up
| a Ghostbusters unit and letting it wither. You can argue that
| we'll be in trouble if we encounter ghosts, but on a
| likelihood-weighted basis there are much better uses of time.
| lgas wrote:
| Probability is only half of the expected-value equation. When
| the potential outcome is extreme enough, you should care even
| if the probability is low, as long as it is non-zero.
| manmal wrote:
| Part of me thinks that's exactly why this department has been
| sidelined - having such a department is necessary to create
| hype ("we are creating things so powerful we have to explore
| how to contain them"), but it doesn't need to thrive either.
| throwaway115 wrote:
| >What if someone builds a supermassive teapot in Earth's orbit
| and it breaks free and comes hurtling towards us?
|
| Ok, that seems ridiculous and infeasible, and nothing that has
| happened so far indicates it is a real threat.
|
| >Yes, but it would be so catastrophic to humanity, that we must
| take it seriously, regardless of how improbable it is!
|
| This is the AGI debate, in my view. If we're picking different
| extremely improbable events to get worked up about, why stop at
| AI?
___________________________________________________________________
(page generated 2024-05-18 23:00 UTC)