What Microsoft's AI knows about you

Illustration: Maura Losch/Axios
Microsoft has aggressively added AI-powered Copilots to nearly all its products, but that doesn't necessarily mean your data is being used to train models.
Why it matters: You won't know how much data you might be sharing with Microsoft's AI developers unless you dig into the firm's policies and understand your options.
In our ongoing series on What AI Knows About You, Axios is looking company by company at how tech giants are sometimes using their customers' information to develop and improve their products, and how users can opt out.
- Today's AI developers don't face any requirement to divulge the exact sources for their training data — but under various privacy laws, they do have to reveal what customer data they collect and how they use it.
Zoom in: Microsoft has partnered with OpenAI to power many of its services. But Microsoft's privacy policies are the ones that prevail, and Microsoft says that it has not provided any customer data to OpenAI to help train that company's models.
- Like most other tech companies, Microsoft doesn't train models using its business customers' data or prompts, including those using Copilot for Microsoft 365.
- Microsoft says it also does not use any content created in Microsoft 365 apps (Word, Excel, PowerPoint, Outlook and Teams) to train its underlying foundational models.
Yes, but: Microsoft says it does use consumer data from Bing, MSN, the consumer version of Copilot and interactions with ads in certain countries (including the U.S.) for AI training.
- It says that if any images are included in AI conversations, it will "take steps to de-identify them such as removing metadata or other personal data and blurring images of faces."
Users who are logged in to a Microsoft account can control whether their data is used to train Microsoft's AI models, the company says.
- Microsoft says opting out will "exclude your past, present, and future consumer data from being used for training these AI models, unless you choose to opt back in."
- Microsoft also has a variety of AI services in preview or beta and, in some cases, it asks customers to share data that can be used for training. The company says it clearly states how the data is being used and how users can opt out if they choose.
- Finally, some of Microsoft's subsidiaries, including LinkedIn and GitHub, have their own policies. GitHub, for instance, trains its coding assistant on public code repositories, along with the usual array of "publicly available" internet data.
Between the lines: One place where Microsoft is trying to be extra clear about its privacy policies is with its new crop of Copilot+ PCs, where a number of AI features — including the controversial Recall feature — do their work on the device rather than in the cloud.
- "[Users of Microsoft's early access program] and Recall users, we want you to know your snapshots are truly yours," Microsoft reiterated in a blog post on Friday. "We do not send your snapshots off your PC to Microsoft or third parties, and don't use them for training purposes."