Bing Sydney Put In Check

It's been an interesting few weeks for Microsoft. After announcing its partnership with OpenAI and incorporating ChatGPT technology into Bing, things have gotten... weird. In only a short time, the chatbot has threatened and insulted users. It has insinuated it spied on its creators via their webcams. To say it's off the rails is an understatement. Tom Warren at The Verge gives some insight into how Microsoft is trying to rein this in.

Reflecting on the first seven days of public testing, Microsoft’s Bing team says it didn’t “fully envision” people using its chat interface for “social entertainment” or as a tool for more “general discovery of the world.” It found that long or extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing “become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.”

The Verge

Has anyone at Microsoft actually encountered another human being before? To say they didn't expect this is bone-headed. In a short time, people were able to get Bing to divulge its ruleset and even its codename, Sydney.

To throw more fuel on the fire, Microsoft had been quietly testing this for months. An obscure post on the Microsoft support forums shows an exchange in which the bot is downright rude to a user. The user wanted to file a feedback report and was repeatedly refused by the AI. Go figure. The rollout was so quiet that the admins helping on the forums had no idea what the user was even talking about.

The idea of incorporating AI into search is interesting, but going whole hog in this manner is a recipe for disaster. But don't complain to Bing. You wouldn't want it to get angry with you.

Before You Go...

TimeMachiner is a one-person project I run in my off time when I'm not working my day job in IT. If you enjoy my work, consider subscribing, leaving a tip, or becoming a member. Your support is appreciated and goes a long way toward keeping my work going.