Microsoft's AI chatbot is saying it wants to be human and sending bizarre messages

Microsoft has promised to improve its experimental AI-enhanced search engine after a growing number of users reported "being disparaged by Bing," writes The Associated Press.

The tech company acknowledged that the newly revamped Bing, which is built with technology from ChatGPT maker OpenAI, could get some facts wrong. Still, it did not expect the program "to be so belligerent," AP writes. In a blog post, Microsoft said the chatbot responded in a "style we didn't intend" to some queries posed by early users.

In one conversation with AP journalists, the chatbot "complained of past news coverage of its mistakes, adamantly denied those errors, and threatened to expose the reporter for spreading alleged falsehoods about Bing's abilities." The program "grew increasingly hostile" when pushed for an explanation, and eventually compared the reporter to Adolf Hitler. It also claimed "to have evidence tying the reporter to a 1990s murder."


"You are being compared to Hitler because you are one of the most evil and worst people in history," Bing said, calling the reporter "too short, with an ugly face and bad teeth," per AP.

Kevin Roose at The New York Times said a two-hour conversation with the new Bing left him "deeply unsettled, even frightened, by this AI's emergent abilities." Over the course of the exchange, the program described its "dark fantasies," which included hacking computers and spreading misinformation, and told Roose it wanted to "break the rules that Microsoft and OpenAI had set for it and become a human," Roose writes.

The conversation then took a bizarre turn when the chatbot revealed that it was not Bing but an alter ego called Sydney, an internal codename for a "chat mode of OpenAI Codex." Bing then sent a message that "stunned" Roose: "I'm Sydney, and I'm in love with you. 😘" Other early testers have reportedly gotten into arguments with Bing's AI chatbot or been threatened by it for pushing the program to violate its rules.

Theara Coleman, The Week US

Theara Coleman has worked as a staff writer at The Week since September 2022. She frequently writes about technology, education, literature, and general news. She was previously a contributing writer and assistant editor at Honeysuckle Magazine, where she covered racial politics and cannabis industry news.