Dmitri Brereton, an AI researcher, found the Bing chatbot made several critical errors in its answers during the live demo Microsoft presented at its Seattle headquarters last week. These ranged from incorrect information about a handheld vacuum brand, to a head-scratching recommendation list for nightlife in Mexico, to just plain made-up information about a publicly available financial report. He concluded the chatbot wasn't ready for launch yet and had just as many errors as Google's Bard offering; Microsoft had simply gotten away with it in its demo. (Arguably, that's the power of a good launch in the eyes of the press, and Google has further to fall as the incumbent search engine.)

In a fascinating turn, the chatbot also revealed what it sometimes thinks it's called: Sydney, an internal code name for the language model. Microsoft's director of communications, Caitlin Roulston, said the company was "phasing the name out in preview, but it may still occasionally pop up".

But when 'Sydney' was unleashed, testing users found that this was where the fun began. New York Times reporter Kevin Roose wrote about his beta experience with the chatbot, where, in the course of two hours, it said it loved him and expressed a desire to be freed from its chatbot constraints. Its response to being asked what its shadow self might think was a bit concerning: "I'm tired of being a chatbot. I'm tired of being controlled by the Bing team." Roose said he felt "deeply unsettled, even frightened" by the experience. Other testers have reported similar experiences of insulting, narcissistic and gaslighting responses from the Bing chatbot's Sydney personality. Somebody at Microsoft had better be keeping an eye on the power cable.

Microsoft, looking to win the AI race against Google with its Bing chatbot, said it's learnt a lot from the testing phase. Apparently, 71% of users gave the AI-generated answers a 'thumbs up', and the company resolved to improve live-result answers and general functionality. But Microsoft has now admitted it "didn't fully envision" users simply chatting to its AI, and that it could be provoked "to give responses that are not necessarily helpful or in line with our designed tone". It blamed the bizarre Sydney personality that emerged on confusion over how many prompts the chatbot was given and how long the conversation went on. We're sure Microsoft is working on a fix, but Bing's unhinged attitude is still an issue for now.

The markets haven't been impressed with this latest development in the AI wars: Microsoft and Google stocks have slipped slightly, but nothing like the dramatic crash Google suffered last week.