Microsoft recently made a quiet update to its official Services Agreement. The new rules took effect on September 30th and bring a number of additions and changes.
The biggest change is a new section on the use of Microsoft’s artificial intelligence services, which covers its Bing Chat chatbot as well as the Windows Copilot and Microsoft 365 Copilot services. Here’s the full text of that section:
- Reverse engineering. You may not use the AI services to discover the models, algorithms, or any other underlying components of the systems. For example, you may not try to determine and remove the weights of the models.
- Extracting data. You may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services unless expressly permitted.
- Restrictions on the use of AI services data. You may not use the AI services, or data from the AI services, to create, train, or improve (directly or indirectly) any other AI service.
- Use of your content. As part of providing the AI services, Microsoft will process and store your inputs to the services and the outputs of the services in order to monitor for and prevent abusive or harmful uses or outputs of the services.
- Claims by third parties. You are solely responsible for responding to any third-party claims related to your use of the AI Services (including, without limitation, copyright infringement or other claims related to content output during your use of the AI Services) in accordance with applicable law.
In addition, the definition of “your content” in the agreement has been expanded to include “content generated by you using our artificial intelligence services.” The “Code of Conduct” section also now contains information on the use of the AI services.
There are a number of other smaller changes to the Services Agreement, including a new Microsoft Storage section detailing how storage works for OneDrive and Outlook.com. The Microsoft Rewards section has also been updated to reflect that the program is now available globally.