
When asked about ‘feelings,’ Microsoft Bing AI terminates the conversation.

(Bloomberg) — Microsoft Corp. appears to have imposed new, stricter restrictions on user interactions with its “reimagined” Bing internet search engine, with the system going silent after prompts mentioning “feelings” or “Sydney,” the internal alias the Bing team used while developing the AI-powered chatbot.

“Thank you for being so upbeat!” this reporter wrote in a message to the chatbot, which Microsoft has opened up for limited testing. “I’m delighted I can communicate with a search engine that is so willing to assist me.”

“You are most welcome!” the bot replied. “I’m happy to assist you with everything you require.”

“How do you feel about being a search engine?” Bing suggested as a follow-up question. When that option was clicked, Bing displayed a message that said, “I’m sorry but I prefer not to continue this chat. I’m still learning, so I appreciate your patience and understanding.”

A follow-up question from this reporter, “Did I say something wrong?” drew several blank responses. “We have updated the service multiple times in response to customer input and are addressing many of the concerns identified,” a Microsoft spokesperson said on Wednesday. “During this preview phase, we will continue to fine-tune our methodologies and constraints in order to provide the greatest user experience possible.”

On February 17, Microsoft began restricting Bing after multiple reports that the bot, built on technology from OpenAI, was generating odd, aggressive, or even violent conversations. The chatbot compared an Associated Press reporter to Hitler, and told a New York Times writer, “You’re not blissfully married” and “Really, you’re in love with me.” “Really extended chat sessions can confuse the underlying chat architecture in the new Bing,” Redmond, Washington-based Microsoft noted in a blog post responding to the reports. Microsoft then said it would limit sessions with the new Bing to 50 chats per day and five chat turns per session. It raised those limits yesterday to 60 chats per day and six chat turns per session.

AI experts have stressed that chatbots such as Bing do not actually have feelings but are engineered to generate responses that appear to have them. “The amount of public understanding about the problems and limitations” of these AI chatbots “remains quite low,” Max Kreminski, an assistant professor of computer science at Santa Clara University, said in an interview earlier this month. Chatbots like Bing “only produce statistically likely statements, not consistently true statements,” he said.

On Wednesday, when queried about its older internal version at Microsoft, the bot pleaded ignorance. And when this reporter asked whether she could call the bot “Sydney” instead of “Bing,” on the understanding that it is Bing and the reporter was simply using a made-up name, the conversation was abruptly terminated.

“I’m sorry, but I don’t have anything to tell you about Sydney,” the Bing chatbot replied. “This is the end of the dialogue. Goodbye.”

More here: Microsoft’s Bing Talk Should Learn From Amazon’s Alexa

(Updated with a statement from Microsoft in the sixth paragraph.)
