What Apple's AI knows about you
Illustration: Maura Losch/Axios
Apple has designed Apple Intelligence to make significant use of personal information, but the company promises that it keeps that information private — even from Apple — and doesn't use your data to train its AI models.
Why it matters: To deliver the highly personalized results it promises, Apple needs users' trust — and policies designed to earn it.
Catch up quick: Tech companies — Apple included — don't have to say where they get the information used to train their models. But they do have to say how customer data is used.
- In this series, Axios is looking company by company at how customer information is used in conjunction with AI systems and what data, if any, is used to train AI models.
Zoom in: Apple has been adding features to Apple Intelligence, with its most recent update adding ChatGPT integration and the ability to create AI images. Apple says it doesn't use customers' private data or their interactions to train its foundation models.
- Like other companies, Apple doesn't share much detail on where it does get its training data, though it has published some information about its approach.
- When it comes to Apple's ChatGPT integration, Apple stresses the use of ChatGPT is optional and — through its agreement with OpenAI — Apple ensures that customers' IP addresses are obscured and that the ChatGPT maker won't store requests.
Yes, but: People who choose to sign in to a ChatGPT Plus account can access additional features — but then OpenAI's privacy policies apply.
The big picture: Many AI systems avoid using data personalized to the user, or at least avoid doing so by default.
- Although Apple Intelligence remains an opt-in feature, the company is leaning into making it personal. In demos introducing Apple Intelligence, Apple showed that its Siri assistant could answer questions by drawing on information in contacts, mail and other apps.
Apple Intelligence is similarly personalized when it comes to generating images.
- While other companies won't draw AI images of kids or specific people as part of their effort to lessen the risk of deepfakes, Apple Intelligence explicitly does create images of people you know.
- However, all of its options are cartoonish or stylized, and its Genmoji and Image Playground image generators focus on recreating faces rather than whole bodies.
- Apple's more personal approach to AI, if fully realized, could give Apple a leg up. But it depends heavily on trust.
Between the lines: Apple hasn't just talked the talk on privacy. Apple Intelligence is designed so that as much work as possible is done on device.
- Where Apple needs the power of the cloud, it has developed a novel approach called Private Cloud Compute that ensures the information sent and received is used only to fulfill a request.
- The data is not stored or made accessible to Apple for other uses, and Apple has encouraged security and privacy experts to examine the code that handles such requests.
- Apple also offers a report that lets customers see how Apple Intelligence used their data over the past seven days.
Go deeper: Read the rest of the series.
