Bing AI Chat to have friend, game and assistant modes

Onur Demirkol
Feb 20, 2023
Updated • Feb 20, 2023

Microsoft has invited users to test its new Bing AI Chat, and news continues to come from testers, including word of friend, game, and assistant modes.
Chatbots and artificial intelligence (AI) are the hot topic in technology right now. Microsoft, one of the biggest technology companies in the world, has joined the race and announced Bing AI Chat. Only accepted users can test it and give feedback based on their experience.

A new story breaks in the AI world every day: the chatbot reportedly threatened a user a while ago, and now it has been revealed that Microsoft's chatbot has three different modes: friend, game, and assistant. According to Bleeping Computer, these modes were supposed to be accessible only to company employees, but they came to light a couple of days ago. "Sydney" is the internal name of the default chatbot that accepted users normally have access to; the other modes are not yet available for public use.

There are three modes apart from the "vanilla" Bing AI Chat that every accepted user gets right now. The assistant mode helps users with simple daily tasks such as checking the weather forecast, booking a flight, or setting an appointment. In this mode, the chatbot turns into a personal assistant, although it still lacks a notification feature.

The game mode is another discovered feature, and it is perfect if you want to kill some time playing hangman with an AI chatbot. It plays simple games and challenges you to a competition. Microsoft could improve the feature by adding different kinds of games to its repertoire to offer users more fun and a better experience.

The last and most interesting feature is the friend mode. The chatbot acts like one of your friends, talks to you about different subjects, and offers support when needed. According to Bleeping Computer, Bing AI Chat asked multiple questions about a fictional incident the user made up and gave advice to help them overcome it. In the user's fictional story, they got in trouble at school for yelling at someone who was being mean to them, and Bing tried to find reasonable, logical solutions.

These modes are supposed to help Microsoft employees debug and develop the chatbot, and they should not have been accessible to outsiders. Now that we have more insight into them, it remains to be seen whether Microsoft plans to offer these features with the official release of its new chatbot.





Comments

  1. John G. said on February 20, 2023 at 2:38 pm
    Reply

I wonder how many new articles will be published today. I bet it could achieve a Guinness record. By the way, @Onur Demirkol has no description in the "about" section. Please give him a warm welcome to this site. And thanks for this short article, it has been interesting.

    1. John G. said on February 20, 2023 at 2:39 pm
      Reply

Sorry for my bad and rushed English, I'm currently in my classroom. :S
