The big secret Microsoft has been keeping for years is revealed

Although AI chatbots have only recently entered common usage, they are the product of a long process of testing and debugging. These tools undergo lengthy, intensive testing before reaching customers, and as the latest news suggests, Microsoft is one of the companies that kept that secret well.
Microsoft integrated its new AI chatbot, Sydney, into Bing a couple of years ago, and users have been testing it ever since. However, the roots of the technology go back to 2017. Microsoft had been using AI techniques in its Office products and in Bing, but in a more "primitive" form, as the technology was far from its current state of development. Microsoft has been testing the developed version in several countries, including India, and controversies about the Bing chatbot AI have been circulating.
In a statement to The Verge, Caitlin Roulston, director of communications at Microsoft, said: “Sydney is an old codename for a chat feature based on earlier models that we began testing in India in late 2020. The insights we gathered as part of that have helped to inform our work with the new Bing preview. We continue to tune our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible.”
According to a finding by The Verge on GitHub, the initial Bing bots created in 2017 already used AI techniques. Since then, the company has continued developing them and shifted its focus to a single AI bot called "Sydney." OpenAI is known to have shared its new GPT-4 model with Microsoft, and according to Jordi Ribas, Head of Search and AI at Microsoft, it is a game-changer; the two companies are working closely to integrate GPT capabilities into Bing search. This should produce better results and, in turn, a more stable user experience.
Over the last couple of years, the chatbot developed a personality that led to controversies. After the latest incident of Bing AI misbehaving toward a user, Microsoft began working on making the bot emotionless to prevent similar issues. Having seen GPT-4's structure, Microsoft is looking to improve its product and offer a better version once it is ready for regular, mass-market use.
Are these articles AI generated?
Now the duplicates are more obvious.
This is worse than AI-generated crap. It is a copy of a Microsoft Help website article without any relevant supporting text. In any case, you can find this information on many pages.
Yes, but why post the exact same article twice under different titles on the same day (19 March 2023), by two different writers?
1.) Excel Keyboard Shortcuts by Trevor Monteiro.
2.) 70+ Excel Keyboard Shortcuts for Windows by Priyanka Monteiro
Why oh why?
Yeah. Tell me more about “Priyanka Monteiro”. I’m dying to know. Indian-Portuguese bot?
They will probably announce that the taskbar can be placed at the top, right, or left, at your will.
A special event from them means special crap for us.
If it’s Microsoft, don’t buy it.
Better brands at better prices elsewhere.
All new articles have zero count comments. :S
WTF? So, if I add one photo to 5 albums, will it count 5x against my storage?
It does not make any sense… on Google Photos, we can add a photo to multiple albums, and it does not generate any additional space usage.
I have O365 until the end of this year, mostly for OneDrive, and will probably jump to Google One.
Photo storage must be kept free, because customers choose gadgets for photos and photos only.
What nonsense. Does it mean that albums are de facto folders containing copies of our pictures?
Sounds exactly like the poor coding Microsoft is known for in non-critical areas, i.e. anything outside Windows Core/Office Core.
I imagine a manager gave an employee the task of creating the album feature with hardly any time, so they just copied the folder feature with some cosmetic changes.
And now that they have discovered what poor management leads to, will they go back and do the album feature properly?
Nope, just charge the customer twice.
Sounds like a go-getter that needs to be promoted for increasing sales and managing underlings “efficiently”, said the next layer of middle management.
When will those comments get fixed? Was every editor here replaced by AI, with no one even working on this site?
Instead of a software company, Microsoft is now a fraud company.
For me, this is proof that Microsoft has a backdoor into every account in its cloud.
Quote: “…… as the MSA key allowed the hacker group access to virtually any cloud account at Microsoft…..”
So this MSA key, which is available to MS officers, can give access to all accounts in the MS cloud. This is the backdoor that MS has into its cloud accounts. Luckily, I never put any relevant files of mine in their (MS) cloud.
>”Now You: what is your theory?”
That someone handed an employee a briefcase full of cash, and the employee let them into all their accounts and systems.
Anything that requires 5-10 separate coincidences is highly unlikely. Occam’s razor.
Good reason never to log in to your precious machine with a Microsoft account, a.k.a. the cloud.
The GAFAM companies are always very careless about our software automatically sending them telemetry and crash dumps behind our backs. It’s a reminder not to send them anything when it’s possible to opt out, and not to opt in, considering what those dumps may contain. And there is irony in this carelessness biting them back, even if in this case it shows they are much more cautious when their own data is at stake.