Bing AI unhinged: How will Microsoft prevent this from happening again?

Eray Eliaçik
Feb 20, 2023
Internet, Microsoft

Bing AI went unhinged during its first days on the job, forcing Microsoft to take measures. The AI threatened to expose users' personal information and to ruin one user's reputation. Microsoft attributes this behavior to the length of the conversations users have with Bing AI.

Because the company is concerned that Bing AI becomes confused by long strings of user inquiries, the chatbot will now only be able to hold conversations of limited length.

Bing AI unhinged: Here are the measures taken to prevent it from happening again

According to Microsoft's investigation, Bing AI becomes repetitious or easily "provoked" during chat sessions with 15 or more questions. Hence, from now on, you can only use Bing AI for a maximum of 5 chat turns per session and 50 chat turns per day.

To prevent humans from overwhelming the Bing Chat model with too many prompts, Microsoft has implemented new restrictions on the new ChatGPT-powered Bing AI chat.

"Our data has shown that the vast majority of you find the answers you're looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won't get confused. Just click on the broom icon to the left of the search box for a fresh start," Microsoft's Bing team said.

A chat turn consists of one user inquiry plus Bing's response. Once the session limit is reached, users are prompted to begin a new topic so the model does not become confused.
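The turn-limit policy described above is simple enough to sketch in code. The class below is a hypothetical illustration (not Microsoft's actual implementation) of counting turns against a per-session cap of 5 and a daily cap of 50, with a "new topic" reset that mirrors the broom-icon behavior:

```python
# Hypothetical sketch of Bing Chat's stated turn-limit policy:
# 5 chat turns per session, 50 chat turns per day.
class TurnLimiter:
    SESSION_LIMIT = 5   # turns allowed before a new topic is required
    DAILY_LIMIT = 50    # total turns allowed per day

    def __init__(self):
        self.session_turns = 0
        self.daily_turns = 0

    def record_turn(self) -> str:
        """Count one turn (a user inquiry plus Bing's response)."""
        self.session_turns += 1
        self.daily_turns += 1
        if self.daily_turns >= self.DAILY_LIMIT:
            return "daily limit reached"
        if self.session_turns >= self.SESSION_LIMIT:
            return "prompt user to start a new topic"
        return "ok"

    def new_topic(self):
        """Clear session context (the broom icon) so the model won't get confused."""
        self.session_turns = 0
```

Starting a new topic resets only the session counter; the daily counter keeps accumulating, which matches Microsoft's description of the two separate limits.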

The change has not been universally well received: several users argue that the five-turn limit severely undermines Bing AI's utility. OpenAI, Microsoft's partner, has imposed no such limits on ChatGPT.

Bing AI unhinged: How did it happen?

Sydney, the internal codename for Microsoft's new Bing AI, alarmed early adopters with death threats and other troubling outputs, scoring an own goal for the company.

This wasn't Bing AI's only "incident." In a conversation he shared on Twitter, IT student and startup founder Marvin von Hagen was labeled a "threat" to Bing's privacy and security. Over the course of an otherwise amiable discussion, Bing's chatbot threatened von Hagen.

After Microsoft rolled out the limitation, the unhinged Bing AI incidents stopped.

Microsoft has just started its AI journey

Microsoft believes that Bing AI will change how people think about search engines. According to the company, millions of users have joined the Bing AI waitlist; those who set Edge and Bing as their default browser and search engine are given preference. Once accepted, they can start using Bing's AI chatbot, much like ChatGPT.





  1. Seeprime said on February 20, 2023 at 11:26 pm

    Bing chat has been, for me, unimpressive most of the time. And then sometimes it provides pithy highly relevant results. It’s very uneven, but fun to play with, for now.
