Subj : Fed up with the Bing AI chatbot's attitude? Now you can change its
To   : All
From : TechnologyDaily
Date : Thu Mar 02 2023 11:30:03

Fed up with the Bing AI chatbot's attitude? Now you can change its personality

Date: Thu, 02 Mar 2023 11:10:10 +0000

Description: With three selections for different personalities possible, there seems to be a clear favorite so far.

FULL STORY
======================================================================

Microsoft's Bing chatbot is now offering a choice of personalities for all users, with the rollout of the Bing Chat Mode selector having been completed.

This news was shared on Twitter by Mikhail Parakhin, head of Microsoft's Advertising and Web Services division, as spotted by MS Power User:

"Now almost everyone - 90% - should be seeing the Bing Chat Mode selector (the tri-toggle). I definitely prefer Creative, but Precise is also interesting - it's much more factual. See which one you like. The 10% who are still in the control group should start seeing it today." - March 1, 2023

As you can see, at the time of the tweet, 90% of Bing chatbot users had the tri-toggle chat selector that lets you switch between three different personalities for the AI (Precise, Balanced, or Creative). The remaining control group (10%) then had the selector rolled out to them across the course of yesterday, so everyone should have it by now.

That's good news for those who want more options when it comes to the chatbot's responses to their queries. Earlier this week, we saw other work on the AI to reduce what are called hallucinations (where the chatbot gives inaccurate info, or plain makes a mistake). There was also tinkering to ensure that instances where Bing simply fails to respond to a query happen less often.
While that's all good, it seems on the latter count there's a fresh stumbling block introduced with the latest version of the chatbot (the one with the personality selector): namely, a "something went wrong" error message when querying the ChatGPT-powered AI. In the above Twitter thread, there are a few complaints along these lines, so hopefully this is something Microsoft is already investigating.

Analysis: Creative for the win? Maybe for now...

Doubtless there will be plenty of experimentation with the chat modes to determine exactly how these three personalities differ. Thus far, the Creative setting seems to be getting the most positive feedback, and this is likely the one many Bing users are plumping for - simply because this is where the AI has the most free rein, and so will seem more human-like, rather than Precise mode, which is more like a straight answer to a search query. (Arguably somewhat defeating the point of having an AI carry out your searches, anyway.) Balanced is a middle road between the two, so that may tempt fans of compromise, naturally.

Initial feedback indicates that in Creative mode Bing gives more detailed answers, not just adding a more personal touch, but seemingly fleshing out replies to a greater depth. That's going to be useful, and likely to lead to this being the more popular choice - especially as this setting is where you're going to get the more interesting, perhaps occasionally eccentric, or even outlandish responses.

Microsoft may need to look at making the Balanced setting a more compelling choice, particularly if it sees that traffic is heavily skewed towards the Creative option. That said, the latter's popularity is likely partly tied to how new the AI is, attracting people who are curious and just want to mess around with the chatbot to see what they can get Bing to say.
Those kinds of users will doubtless get bored of toying with the AI before too long, giving a different picture of personality usage when the dust settles a bit more.

At any rate, tweaking Bing's personalities is something that'll doubtless happen on an ongoing basis, and we may even get more options beyond these initial three eventually. Come on, Microsoft, we all want to see Angry Bing in action, or maybe a Disillusioned chatbot (or how about an Apocalypse Survivor setting?). No?

======================================================================
Link to news story:
https://www.techradar.com/news/fed-up-with-the-bing-ai-chatbots-attitude-now-you-can-change-its-personality

--- Mystic BBS v1.12 A47 (Linux/64)
 * Origin: tqwNet Technology News (1337:1/100)