The big secret Microsoft has been keeping for years is revealed

Onur Demirkol
Feb 24, 2023

Even though AI chatbots have only recently entered mainstream use, they are the product of a long process of testing and debugging. These tools undergo heavy, extended testing before reaching customers, and according to the latest reports, Microsoft is one of the companies that kept its testing secret remarkably well.

Microsoft added its AI chatbot, codenamed Sydney, to Bing a couple of years ago, and users have been testing it ever since. The roots of the technology, however, go back to 2017. Microsoft had already been applying AI techniques to its Office products and Bing, but in a more "primitive" form, as the technology was nowhere near its current state of development. Microsoft has been testing the more advanced version in several countries, including India, and controversies about the Bing chatbot have been circulating.

In a statement given to The Verge, Caitlin Roulston, director of communications at Microsoft, said: “Sydney is an old codename for a chat feature based on earlier models that we began testing in India in late 2020. The insights we gathered as part of that have helped to inform our work with the new Bing preview. We continue to tune our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible.”

According to findings The Verge uncovered on GitHub, the initial Bing bots created in 2017 already used AI techniques. Since then, the company has continued developing the technology and shifted its focus to a single AI bot called "Sydney." OpenAI is known to have shared its new GPT-4 model with Microsoft, and according to Jordi Ribas, Head of Search and AI at Microsoft, it is a game-changer; the two companies are working closely to integrate GPT capabilities into Bing search. This should produce better results and, in turn, a more consistent user experience.

Over the last couple of years, the chatbot developed a personality that led to controversies. After the latest incident in which Bing AI misbehaved toward a user, Microsoft began working on stripping the bot of emotional responses to prevent similar issues. Having seen what GPT-4's architecture can do, Microsoft aims to improve its product and offer a better version once it is ready for regular, mass-market use.

