Post ATLQQNQ4Gtfo2jElCy by colorblindcowboy@mastodon.art
Post #ATLNX4mLfwy8OXA50i by grammargirl@zirk.us
       2023-03-06T16:06:15Z
       
       0 likes, 0 repeats
       
       I can't stop thinking about this story in which #chatGPT insists the author is dead, even citing a fake obituary. It's so weird! https://www.theregister.com/2023/03/02/chatgpt_considered_harmful/
       
Post #ATLO65is9r9tSRESVE by paulkater@mstdn.social
       2023-03-06T16:12:18Z
       
       0 likes, 0 repeats
       
       @grammargirl It's a program that lives on input and output. We call it GIGO: garbage in, garbage out. That's what happens here, on an amazing scale.
       
Post #ATLQQNQ4Gtfo2jElCy by colorblindcowboy@mastodon.art
       2023-03-06T16:38:34Z
       
       0 likes, 0 repeats
       
       @grammargirl Wow. I read “Laura” this weekend, and this feels like the dystopian version.
       
Post #ATLSlSSYvXL9fGaGPo by statesdj@genomic.social
       2023-03-06T17:04:47Z
       
       0 likes, 0 repeats
       
       @grammargirl ChatGPT is a model trained on human text. The implication is that most people double down when caught in a lie.
       
Post #ATLaAlfdL1KaAGiIt6 by kims@mas.to
       2023-03-06T18:27:52Z
       
       0 likes, 0 repeats
       
       @grammargirl Give it a few weeks and tech bros will be arguing that it's the author's fault for being so obstinate about remaining alive. The implications he lays out (job rejections, credit reports, etc.) are terrifying.
       
Post #ATLnJt96DI6NMKAvbs by leadore@toot.cafe
       2023-03-06T20:55:04Z
       
       0 likes, 0 repeats
       
       @grammargirl I was wondering about that, too. The AI works by predicting what text "should come next." Since so many of the biographical summaries it was trained on end by mentioning the person's death, it likely calculates a high probability that such a line belongs there and inserts one. We've seen plenty of examples where, when it doesn't have actual source material, it just makes something up. I agree with the author that serious harm will be done.
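
       A minimal sketch of the "predict what comes next" behaviour described in the post above, using invented candidate sentences and probabilities (nothing here comes from ChatGPT itself or from the article): the model simply emits whichever continuation scores highest, with no fact-checking step in between.

           # Toy illustration: rank hand-made candidate continuations by
           # probability and append the top one. All sentences and scores
           # below are invented for this example.
           candidates = {
               "He currently writes about technology and privacy.": 0.30,
               "He has spoken at several industry conferences.": 0.25,
               "He died in 2019.": 0.45,  # bios in training data often end this way
           }

           def next_sentence(scores):
               """Greedy decoding: return the highest-scoring continuation."""
               return max(scores, key=scores.get)

           bio = "The author is a privacy advocate and writer. "
           print(bio + next_sentence(candidates))
           # Prints the "died" sentence because it scored highest, not because it is true.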
       
Post #ATMEanRLNUPdLeznvs by spotrick@aus.social
       2023-03-07T02:00:40Z
       
       0 likes, 0 repeats
       
       @grammargirl "computer says no", on a massive scale.
       
Post #ATNlEcmDpmg5YFkylk by AaronNGray@geekdom.social
       2023-03-07T19:41:10Z
       
       0 likes, 0 repeats
       
       @grammargirl @jaredwhite Yes, it doesn't check facts, even on programming-language problems. I got it very flustered.