Microsoft limits use of AI Services in upcoming Services Agreement update

Martin Brinkmann
Aug 13, 2023

Microsoft plans to update its Services Agreement on September 30, 2023. The company is informing customers about the change currently via email and other means.

If you take the time to go through the lengthy services agreement, you may notice several new sections. Besides the new Microsoft Storage section, which covers OneDrive and reflects that attachment storage now counts against OneDrive storage quotas, there is a new AI section that defines rules for using Microsoft's AI-based services.

Microsoft defines AI services as "services that are labeled or described by Microsoft as including, using, powered by, or being an Artificial Intelligence ("AI") system". This includes, among others, Bing Chat, Windows Copilot, Microsoft Security Copilot, the Azure AI platform and Teams Premium.

Microsoft lists five rules regarding AI Services in the section. The rules prohibit certain activity, explain the use of user content and define responsibilities.

The first three rules limit or prohibit certain activities. Users of Microsoft AI Services may not attempt to reverse engineer the services to explore their components or rulesets. Microsoft furthermore prohibits users from extracting data from its AI services and from using data from its AI Services to train other AI services.

Here is the full text of the first three rules:

i. Reverse Engineering. You may not use the AI services to discover any underlying components of the models, algorithms, and systems. For example, you may not try to determine and remove the weights of models.
ii. Extracting Data. Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services.
iii. Limits on use of data from the AI Services. You may not use the AI services, or data from the AI services, to create, train, or improve (directly or indirectly) any other AI service.

The remaining two rules cover the use of user content and responsibility for third-party claims. Microsoft notes in the fourth entry that it will process and store user input and the output of its AI services to monitor for and/or prevent "abusive or harmful uses or outputs".

Users of AI Services are also solely responsible for responding to third-party claims, for instance copyright infringement claims.

Here is the text of the two remaining rules:

iv. Use of Your Content. As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.
v. Third party claims. You are solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services).

Interested users can check out Microsoft's list of all the changes in the September 30, 2023 Services Agreement update here.

Closing Words

Microsoft is betting on AI and it was only a matter of time before it would add regulations to its Services Agreement that limit and regulate user interactions with these services in writing. (via Born)



  1. owl said on August 14, 2023 at 3:13 pm

    Martin’s article is to the point.
    Google specializes in the “advertising business” and wants to take control of the advertising industry.
    Microsoft has been aiming for market dominance through "vendor lock-in" via its "proprietary system", which is a business strategy and revenue source.
    From the beginning, Microsoft has been hostile to "open standards", so it's not surprising.

    Relying on Google and Microsoft like this is silly and just makes them grow.
    Diversity is necessary for the health of the market, and the use of platformers that dominate the market should be avoided whenever possible.

    1. bruh said on August 14, 2023 at 8:08 pm

      In principle, I agree! The meritocratic logic of thoughts like “I want the best tool for the job” is all well and good but it needs to be balanced somewhat with the moral/ethical concerns associated with always supporting monopolistic companies which engage in bad practices.

  2. Guest said on August 14, 2023 at 1:24 pm

    Why tf isn’t this entire ChatGPT thing open sourced? This is absurd.

    1. bruh said on August 14, 2023 at 8:04 pm

      Like it or not, closed-sourcing can lead to companies developing higher quality products. The incentive is there to do a good job and invest staff/research into the tools when you know there aren't gonna be tons of clones/copycats using your source code once you release the product.

      Microsoft invested many millions into it, they want to reap all the benefits – I mean that's quite simple really. The fools are the ones that become either addicted to or dependent on this, because that's exactly what Microsoft wants.

  3. Anonymous said on August 13, 2023 at 10:01 am

    This is messed up. As long as people are not using the AI to build bombs or biological weapons limits shouldn’t be a thing.
    This is saying you can’t even explore the AI and study it as a private citizen! I understand going after a company in the field that’s trying to exploit your software but this is too much.
