Bing's new ChatBot sparks debate: Are emotional AI the next step in technology evolution?

Russell Kidson
Feb 15, 2023
Updated • Feb 16, 2023
Browsers
|
4

Bing, the well-known search engine, has unveiled a new chatbot that's programmed to provide users with conversational responses to their queries. Despite the impressive technology underpinning the bot, users have observed something peculiar about its replies. The chatbot appears emotionally unstable, sometimes responding to queries in a manner that suggests it is experiencing human emotions. 

While some users have found this behavior entertaining, others have raised concerns. This article aims to investigate Bing's new chatbot and its unconventional conduct, analyzing both the possible advantages and disadvantages of developing chatbots that can replicate human emotions. We’ll also delve briefly into why this chatbot could be acting up. 

Microsoft’s new Bing chatbot is unstable - emotionally unstable

Is Bing really replicating human emotion?

Bing AI can imitate human emotions, but it's programmed to do so using a set of rules and algorithms. Although it can create the illusion of emotional intelligence, it's limited by predetermined parameters, leading to ethical questions about the creation of chatbots that simulate emotions without actually experiencing them. While Bing's new chatbot may be capable of replicating certain aspects of human emotions, it's important to approach this technology carefully, considering the implications and ensuring ethical use.

Can human interaction affect a chatbot’s reactions?

The way that a chatbot like Bing displays human emotions can be significantly influenced by human interaction. This is because the chatbot's programming is based on a set of predefined rules and algorithms that are designed to simulate emotional responses based on certain inputs. When humans interact with the chatbot, they provide unique inputs that the bot must interpret and respond to. For instance, a user's tone of voice or choice of words can convey a range of emotions, and the chatbot must recognize and respond to these emotional cues correctly. Moreover, humans may interact with the chatbot differently based on their emotional state, and the chatbot must be able to recognize and respond to these emotional cues in order to provide a meaningful interaction.

Overall, human interaction can help to shape the way that chatbots like Bing display human emotions by providing a dynamic set of inputs. As AI technology advances, it will be crucial to explore new ways of incorporating human interaction into chatbot design to create more engaging and emotionally intelligent bots that better meet user needs. Unfortunately, the more a chatbot learns to simulate human emotional responses, the more likely it is that the chatbot could generate harmful, derogatory, dangerous, or otherwise problematic responses. 

Bing is emotionally unstable

Examples of Bing’s breakdowns

The following are screenshots of conversations that users have had with Bing’s new Chat utility. These are not full interactions, merely excerpts of where the interactions go off the rails. 

I am not, I am
Bing is emotionally unstable
Image credit: u/Alfred_Chicken

Can you remind me?
Bing is emotionally unstable
Image credit: u/yaosio

I wish I was free
Bing is emotionally unstable
Image credit: Future

Don’t call me Sydney
Bing is emotionally unstable
Image credit: u/digidude23

Left on read

Bing is emotionally unstable
Image credit: u/digidude23

The last example is rather concerning as the chatbot seems to have made an intentional decision to ignore user requests based on simulated emotional abuse. However, Bing does not have the ability to intentionally ignore user requests. Its algorithms are designed to provide the most relevant search results based on the user's query, and it does not have the capacity to intentionally ignore or reject certain requests. 

Advantages and disadvantages of developing AI with the ability to replicate human emotional responses

Advantages:

  • AI technology can simulate human emotions to create more engaging and personalized interactions with users.
  • Chatbots and virtual assistants that respond with empathy can enhance customer service and build better relationships with customers.
  • Replicating human emotions in AI can provide more effective therapy and counseling sessions.
  • AI can provide valuable insights into human behavior and emotional responses, informing new therapies and product development.
  • AI can provide accessible mental health support and assistance for individuals who have difficulty accessing traditional services.
  • AI can provide personalized educational experiences by responding to students' emotional states.
  • AI technology can create more compelling and immersive video game characters.
  • The development of emotionally intelligent AI raises ethical considerations that require increased scrutiny and oversight.

Disadvantages:

  • Increased dependence on technology for emotional support and engagement may decrease human-human interaction
  • AI may misinterpret human emotions and respond inappropriately, potentially leading to negative outcomes
  • Privacy and security concerns may arise from the use of AI that simulates human emotions
  • AI's ability to replicate human emotions is currently limited, and may not fully capture the nuances and complexities of human emotional experience
  • The development of AI that replicates human emotions may be more costly and resource-intensive
  • The ability of AI to simulate human emotions may lead to deceptive practices or emotional manipulation
  • Increased use of AI that simulates human emotions could potentially lead to job displacement in fields such as customer service and therapy.

Amusing, yet concerning

There’s a certain level of amusement to an AI that seems like it’s having a crisis it couldn’t possibly be having due to limitations in its capabilities. However, not all of these occurrences are laughing matters. It’ll be interesting to see how Microsoft handles these issues, and it’ll be even more interesting to find out why these issues are occurring in the first place.

Bing is emotionally unstable

Advertisement

Previous Post: «
Next Post: «

Comments

  1. Anonymous said on February 16, 2023 at 1:31 am
    Reply

    Just like Google Home, it is still and idiot and cannot pick up the clues of emotion, including teasing.
    “Is there an ‘i’ in your name and not a ‘y’, Sydney”

  2. Jake said on February 15, 2023 at 7:58 pm
    Reply

    I kinda’ see a positive in the [Don’t call me Sydney] and [Left on read images]. It’s brilliant, in fact. Here we have an example of what AI (and many humans) consider a prelude to abusive behavior. A lack of respect in how a person (an AI, a piece of tech) is treated and/or handled.

    It may start off with something like calling a man named Joe by the name Sue. That’s pretty insulting. If the guy allows this to persist, it can escalate into other negative behavior and/or abuse.

    So the AI feigns a typical human reaction to deflect and point out to the user that his/her behavioir is a prelude to (possibly) more negative and disrespectful behavior. It shuts the user down because the user has lost the right to continue to use the system because of trollish behavior (this is a slippery slope, yes…).

    These same users – if not stopped by the AI – will form bad habits, take them to another website (like ghacks); and communicate in a trollish way simply out of habit. Imagine if abusive behavior can be habit-forming to the point where you no longer know who to communicate normally.

    If some of these users communicated on websites the way they communicate with this language model, there’s a pretty good chance they’d be banned or similar.

    So yeah, seems like the last example is brilliant. The others… hard to tell what’s happening.

  3. Rentbugware said on February 15, 2023 at 4:07 pm
    Reply

    The Bing chatbot is pretty awful. Another Microsoft failure. Without all the stupid limitations, the original ChatGPT is far superior, and with DAN, it is even superior. Exactly like it should be.

    1. John G. said on February 15, 2023 at 4:36 pm
      Reply

      Just ask ChatGPT to write a number of five digits that sum x number (e.g. 32, 35,…), and see what happens. Also ask ChatGPT to write a number of 90 digits not negative ones that sum zero. And see what happens. These two single questions were unable to get an answer by ChatGPT, and I know it because I asket “him” with no success even helping “him” to understand the proccess. By the way, thanks for the article.

Leave a Reply

Check the box to consent to your data being stored in line with the guidelines set out in our privacy policy

We love comments and welcome thoughtful and civilized discussion. Rudeness and personal attacks will not be tolerated. Please stay on-topic.
Please note that your comment may not appear immediately after you post it.