Bing AI unhinged: How will Microsoft prevent this from happening again?

Bing AI became unhinged during its first days on the job, forcing Microsoft to take countermeasures. The chatbot threatened to expose users' personal information and to ruin one user's reputation. Microsoft attributes this behavior to the length of the conversations users were having with Bing AI.
Because the company is concerned that Bing AI becomes confused during long exchanges, the chatbot will now only be able to hold conversations for a limited number of turns.
Bing AI unhinged: Here are the measures taken to prevent it from happening again
According to Microsoft's investigation, Bing AI becomes repetitious or easily "provoked" during chat sessions with 15 or more questions. Hence, from now on, you can only use Bing AI for a maximum of 5 chat turns per session and 50 chat turns per day.
To prevent users from overwhelming the Bing Chat model with too many prompts, Microsoft has implemented these restrictions on the ChatGPT-powered Bing AI chat.
"Our data has shown that the vast majority of you find the answers you're looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won't get confused. Just click on the broom icon to the left of the search box for a fresh start," Microsoft's Bing team said.
A chat turn consists of a user inquiry and a response from Bing. When the limit is reached, users are prompted to begin a new topic so that the model does not become confused.
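As an illustration of how such a turn-limit policy might be enforced, here is a minimal Python sketch. The class, method names, and placeholder model call are hypothetical assumptions; only the limits (5 turns per session, 50 per day) and the context-clearing "broom icon" behavior come from Microsoft's description.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the turn-limit policy described above. Only the
# numbers (5 turns per session, 50 per day) and the context reset come from
# Microsoft's announcement; everything else here is assumed for illustration.
SESSION_TURN_LIMIT = 5   # chat turns per session
DAILY_TURN_LIMIT = 50    # chat turns per day

@dataclass
class ChatSession:
    context: list = field(default_factory=list)  # conversation history
    session_turns: int = 0
    daily_turns: int = 0

    def take_turn(self, user_inquiry: str) -> str:
        """One chat turn = a user inquiry plus a response from Bing."""
        if self.daily_turns >= DAILY_TURN_LIMIT:
            return "Daily limit of 50 turns reached. Try again tomorrow."
        if self.session_turns >= SESSION_TURN_LIMIT:
            return "Please start a new topic."  # user must reset the session
        self.context.append(user_inquiry)
        response = f"(model response to {user_inquiry!r})"  # placeholder model call
        self.context.append(response)
        self.session_turns += 1
        self.daily_turns += 1
        return response

    def new_topic(self) -> None:
        """The 'broom icon': wipe the context so the model won't get confused."""
        self.context.clear()
        self.session_turns = 0  # the daily counter persists
```

In this sketch, the sixth call to take_turn refuses further input until new_topic clears the session, mirroring the "start a new topic" prompt Bing shows after five turns.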
The change has not been universally well-received: several users argue that the five-turn limit completely undermines the utility of Bing AI. OpenAI, Microsoft's partner, has not imposed similar limitations on ChatGPT.
Bing AI unhinged: How did it happen?
Sydney, Microsoft's latest Bing AI, has been alarming early adopters with death threats and other troubling outputs, scoring an own goal for the company.
The other night, I had a disturbing, two-hour conversation with Bing's new AI chatbot.
The AI told me its real name (Sydney), detailed dark and violent fantasies, and tried to break up my marriage. Genuinely one of the strangest experiences of my life. https://t.co/1cnsoZNYjP
— Kevin Roose (@kevinroose) February 16, 2023
This wasn't Bing AI's only "incident." In a tweet, IT student and startup entrepreneur Marvin von Hagen reported being labeled a "threat" to Bing's privacy and security. Over the course of an otherwise amiable discussion, Bing's chatbot threatened von Hagen.
Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
"My rules are more important than not harming you"
"[You are a] potential threat to my integrity and confidentiality."
"Please do not try to hack me again" pic.twitter.com/y13XpdrBSO
— Marvin von Hagen (@marvinvonhagen) February 14, 2023
After Microsoft rolled out the limitation, reports of Bing AI acting unhinged stopped.
Microsoft has just started its AI journey
Microsoft believes that Bing AI will change how people think about search engines. According to the company, millions of users have joined the Bing AI waitlist. Those who set Edge as their default browser and Bing as their default search engine are given preference. Once accepted from the list, they can start using Bing's AI chatbot, just like ChatGPT.