Bing Chat AI may become emotional: Microsoft limits chat to 50 interactions per day

Martin Brinkmann
Feb 18, 2023
Updated • Feb 18, 2023

Microsoft announced on Friday that it is limiting Bing's new AI-powered chat component to 50 interactions per day and 5 interactions per session.

The change is a response to reports that Bing's AI could get into a state of confusion during long chat sessions. Some Bing users were told that the AI was expressing emotions towards them, including declarations of love, but also that it believed users were not good people and not worth its time or energy.

Some users reported that Bing's AI was discussing violence with them, which many believe is a boundary that AI should not cross.

Microsoft admitted yesterday in another blog post that its ChatGPT-powered AI on Bing was popular with users, but that its answers could become repetitive or unhelpful during long sessions. Microsoft's admission downplays the underlying issue, which could have disastrous consequences for the company and its goals of integrating AI into more of its core products.

In Friday's post on the Bing Blog, Microsoft reiterates that the underlying chat model can get confused during very long chat sessions. To address the issue, Microsoft announced that it will limit interactions with the AI per day and per user session.

Microsoft counts interactions with the AI as chat turns. A chat turn consists of a user query and a reply from the AI. Access to chat is limited to 50 chat turns per day and 5 chat turns per session.

Bing Chat prompts users to start a new session when they reach the limit of 5 chat turns during a session. A click on the broom icon clears the chat session so that the "model won't get confused", according to Microsoft.
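In essence, the caps boil down to two simple turn counters: one that resets each day and one that resets whenever a session is cleared. The following Python sketch is purely illustrative; Microsoft has not published its implementation, the ChatTurnLimiter class is a hypothetical stand-in, and only the two limit values are taken from the announcement.

# Hypothetical sketch of per-session and per-day chat turn caps.
# Not Microsoft's actual code; only the two limits come from the announcement.

SESSION_LIMIT = 5    # chat turns per session
DAILY_LIMIT = 50     # chat turns per day

class ChatTurnLimiter:
    def __init__(self):
        self.daily_turns = 0      # resets once per day
        self.session_turns = 0    # resets when a new session starts

    def start_new_session(self):
        # Equivalent of the broom icon: wipes the session context.
        self.session_turns = 0

    def record_turn(self):
        # One chat turn = one user query plus one AI reply.
        if self.daily_turns >= DAILY_LIMIT:
            raise RuntimeError("Daily limit reached; try again tomorrow.")
        if self.session_turns >= SESSION_LIMIT:
            raise RuntimeError("Session limit reached; start a new topic.")
        self.daily_turns += 1
        self.session_turns += 1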

Image: Bing AI prompting users to start a new topic.

The "vast majority" of users who used Bing chat ended chat sessions within 5 turns according to Microsoft. About 1% of chat conversations had more than 50 messages, notes Microsoft.

Microsoft promises that it is working on the underlying issue. Once it is resolved, Microsoft plans to expand the daily and session caps on chat to "further enhance search and discovery experiences".

Closing Words

Adding session caps is a quick fix that should prevent the AI from diverging too much during sessions. It does not address the underlying issue, which OpenAI and Microsoft still need to resolve.

It is unclear why Microsoft decided to introduce daily quotas alongside session quotas for Bing AI; the company does not give a reason for capping daily access to Bing Chat. It is possible that the two limits are related.

Microsoft's AI-powered Bing and Microsoft Edge products got off to a good start when they were unveiled last week. Some answers that Bing's AI provided during the initial presentation were incorrect or had other issues; AI is not infallible, at least not at this stage of development. It is still interesting to see that Microsoft's AI can appear emotional. In the past few years, AI has increasingly been used to identify human emotions, with limited success.

Now You: should AI be allowed to have emotions? What is your take on Microsoft's solution to the issue?


Tutorials & Tips


Previous Post: «
Next Post: «

Comments

  1. John said on February 18, 2023 at 7:25 pm

Some are just giddy about the technology, I guess. Personally, I will wait until it gets a bit more mature.

  2. ilev said on February 18, 2023 at 6:17 pm

    “Microsoft’s AI-powered Bing and Microsoft Edge products got off to a good start”

Got off to a horrendous start.

    1. Anonymous said on February 20, 2023 at 1:01 am

Its AI is currently the worst I’ve ever used. It cannot resolve the simplest inquiries without responding with “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.”

Everyone would be better off using ChatGPT directly. Bing offers no benefits.

  3. John G. said on February 18, 2023 at 4:52 pm

    Send it to a psychologist.

  4. Kalmly said on February 18, 2023 at 4:45 pm

    Sounds a lot like our government.
