Microsoft limits use of AI Services in upcoming Services Agreement update

Microsoft plans to update its Services Agreement on September 30, 2023, and is currently informing customers about the change via email and other means.
If you take the time to go through the lengthy agreement, you may notice several new sections. Besides the new Microsoft Storage section, which now covers both OneDrive and Outlook.com because Outlook.com attachments count against OneDrive storage quotas, there is a new AI Services section that defines rules for using Microsoft's AI-based services.
Microsoft defines AI services as "services that are labeled or described by Microsoft as including, using, powered by, or being an Artificial Intelligence ("AI") system". Among others, this includes Bing Chat, Windows Copilot, Microsoft Security Copilot, the Azure AI platform and Teams Premium.
Microsoft lists five rules regarding AI Services in the section. The rules prohibit certain activity, explain the use of user content and define responsibilities.
The first three rules limit or prohibit certain activity. Users of Microsoft AI Services may not attempt to reverse engineer the services to explore their components or rulesets. Microsoft furthermore prohibits users from extracting data from its AI services and from using data obtained from its AI Services to train other AI services.
Here is the full text of the first three rules:
i. Reverse Engineering. You may not use the AI services to discover any underlying components of the models, algorithms, and systems. For example, you may not try to determine and remove the weights of models.
ii. Extracting Data. Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services.
iii. Limits on use of data from the AI Services. You may not use the AI services, or data from the AI services, to create, train, or improve (directly or indirectly) any other AI service.
The remaining two rules cover the use of user content and responsibility for third-party claims. Microsoft notes in the fourth entry that it will process and store user input and the output of its AI services to monitor for and/or prevent "abusive or harmful uses or outputs".
Users of AI Services are also solely responsible for third-party claims, for instance copyright infringement claims.
Here is the full text of the two remaining rules:
iv. Use of Your Content. As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.
v. Third party claims. You are solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services).
Interested users can check out Microsoft's list of all the changes in the September 30, 2023 Services Agreement update here.
Closing Words
Microsoft is betting big on AI, and it was only a matter of time before the company added rules to its Services Agreement that limit and regulate user interactions with these services in writing. (via Born)


Are these articles AI generated?
Now the duplicates are more obvious.
This is worse than AI-generated crap. It is a copy of a Microsoft Help website article without any relevant supporting text. Anyway, you can find this information on many pages.
Yes, but why post the exact same article under a different title twice on the same day (19 March 2023), by two different writers?
1.) Excel Keyboard Shortcuts by Trevor Monteiro.
2.) 70+ Excel Keyboard Shortcuts for Windows by Priyanka Monteiro
Why oh why?
Yeah. Tell me more about “Priyanka Monteiro”. I’m dying to know. Indian-Portuguese bot ?
Probably they will announce that the taskbar will be placed at top, right or left, at your will.
A special event by them is special crap for us.
If it’s Microsoft, don’t buy it.
Better brands at better prices elsewhere.
All new articles have zero count comments. :S
WTF? So, If I add one photo to 5 albums, will it count 5x on my storage?
It does not make any sense… on google photos, we can add photo to multiple albums, and it does not generate any additional space usage
I have O365 until end of this year, mostly for onedrive and probably will jump into google one
Photo storage must be kept free because customers chose gadgets just for photos and photos only.
What a nonsense. Does it mean that albums are de facto folders with copies of our pictures?
Sounds exactly like the poor coding Microsoft is known for in non-critical areas i.e. non Windows Core/Office Core.
I imagine a manager gave an employee the task to create the album feature with hardly any time so they just copied the folder feature with some cosmetic changes.
And now that they have discovered what poor management results in, do they go back and do the album feature properly?
Nope, they just charge the customer twice.
Sounds like a go-getter that needs to be promoted for increasing sales and managing underlings “efficiently”, said the next layer of middle management.
When will those comments get fixed? Was every editor here replaced by AI and no one even works on this site?
Instead of a software company, Microsoft is now a fraud company.
For me this is proof that Microsoft has a back-door option into all accounts in their cloud.
Quote: "…… as the MSA key allowed the hacker group access to virtually any cloud account at Microsoft ……"
So this MSA key, which is available to MS officers, can give access to all accounts in the MS cloud. This is the backdoor that MS has into the cloud accounts. Luckily I never put any relevant files of mine in their (MS) cloud.
>”Now You: what is your theory?”
That someone handed an employee a briefcase full of cash and the employee allowed them access to all their accounts and systems.
Anything that requires 5-10 different coincidences to happen is highly unlikely. Occam’s razor.
Good reason to never login to your precious machine with a Microsoft a/c a.k.a. as the cloud.
The GAFAM are always very careless about our software automatically sending them telemetry and crash dumps behind our backs. It's a reminder not to send them anything when it's possible to opt out, and not to opt in, considering what those dumps may contain. And there is irony in this carelessness biting them back, even if in this case they show that they are much more cautious when it's their own data that is at stake.