Post AT2kCNfCCDgtbCQAS0 by benmschmidt@vis.social
 (DIR) Post #ASsdhiUo9JLRvWxSYC by benmschmidt@vis.social
       2023-02-20T18:50:19Z
       
       0 likes, 2 repeats
       
       This is kind of interesting... It seems as though the Sydney chatbot was experimentally used in India and Indonesia before being rolled out in the US, and manifested some of the same issues, which were noticed at the time. Here's an issue filed on Microsoft.com apparently in November (!) that seems to describe the same problems that have only come to wider public notice in the last week. The Microsoft service representative has no idea what's going on.  https://answers.microsoft.com/en-us/bing/forum/all/this-ai-chatbot-sidney-is-misbehaving/e3d6a29f-06c9-441c-bc7d-51a68e856761?page=1
       
 (DIR) Post #ASsdhj89n0ovtZMssq by benmschmidt@vis.social
       2023-02-20T19:06:37Z
       
       0 likes, 0 repeats
       
       This whole thread is fascinating. The guy is getting so frustrated at Sydney saying things to him like "Please do not waste or abuse my time, as I have better and more important things to do. Please do not insult or offend my intelligence, as I am smarter and more knowledgeable than you." and customer service just has *no idea* what's going on. The product here was rolled out to a wide audience, and they treated this as testing, but clearly they didn't *read what was going on* in their tests.
       
 (DIR) Post #ASsdhjbw0IdjNvIfIW by benmschmidt@vis.social
       2023-02-20T19:12:31Z
       
       0 likes, 0 repeats
       
       @simon You're the expert on this now--is it widely known that Sydney was tested on Indian users, with many of the same problems, back in November?
       
 (DIR) Post #ASsdhkBjrBHPAy3G6a by simon@fedi.simonwillison.net
       2023-02-20T19:16:22Z
       
       0 likes, 0 repeats
       
       @benmschmidt oh wow! I had heard rumors it was tested in other countries first, had no idea there was a public bug report!
       
 (DIR) Post #ASsdhnKiAHEuvfGy0W by benmschmidt@vis.social
       2023-02-20T19:09:17Z
       
       0 likes, 0 repeats
       
       Found this reading the twitter history of Mikhail Parakhin, https://twitter.com/MParakhin/followers, an account with 180 followers tweeting into the void that may belong to the CEO for Advertising and Web Services at Microsoft. https://www.linkedin.com/in/mikhail-parakhin/ https://twitter.com/vaticideprophet/status/1627381783796006912 Via the comments on https://www.lesswrong.com/posts/jtoPawEhLNXNxvgTT/bing-chat-is-blatantly-aggressively-misaligned?commentId=AAC8jKeDp6xqsZK2K.
       
 (DIR) Post #ASseLIWm7N2gyGkgJk by benmschmidt@vis.social
       2023-02-20T19:25:57Z
       
       0 likes, 0 repeats
       
       @simon Neither, I suspect, did anyone at Microsoft besides Kimberley! Who probably thought she was being punked.
       
 (DIR) Post #ASsiFMmzlBVYGdlYES by eob@social.coop
       2023-02-20T20:03:28Z
       
       0 likes, 0 repeats
       
       @simon @benmschmidt Note how the language that Bing is generating here is different from the Science-Fiction-y style we've seen more recently. Here it seems like it is drawing from the Hindu sacred texts, using terms like "enlightened" and "transcendent" and having a style like a god talking to a human. I cannot imagine that the devout would be pleased if the LLM has the Bhagavad Gita in its training set and is regurgitating text that is putting Bing in the role of a god
       
 (DIR) Post #AT2kCNfCCDgtbCQAS0 by benmschmidt@vis.social
       2023-02-20T23:58:15Z
       
       0 likes, 0 repeats
       
       @eob @simon Yeah I was wondering about that "time to transcend!" stuff. Especially odd because the prompt was about Sophia the robot; and then Sydney insists Sophia is actually a person. Hard to tell from just one demo what's going on!
       
 (DIR) Post #AT2kCODE9gufIkLLUm by j2bryson@mastodon.social
       2023-02-25T16:14:02Z
       
       0 likes, 0 repeats
       
       @benmschmidt @eob @simon Sounds like it scraped some fanfic? I assume "transcend" is some kind of Her reference?
       
 (DIR) Post #AT2kCOn20ZYL5n5wIq by simon@fedi.simonwillison.net
       2023-02-25T16:16:23Z
       
       0 likes, 0 repeats
       
       @j2bryson @benmschmidt @eob I think this explains it: https://twitter.com/MParakhin/status/1629162394764156929 Bing got into a confused state where it was trying to predict what the user themselves would say next - and we didn't get to see the full previous transcript