Aug 5, 2021 - Technology

Apple debuts plan to detect images of child sexual abuse

Illustration: Aïda Amer/Axios

Apple announced new iPhone features Thursday that it said would enable the detection and reporting of illegal images of child sexual abuse while preserving users' privacy.

Driving the news: One new system will use cryptographic hashes to check photos being uploaded to iCloud against a database of known child sexual abuse images maintained by the National Center for Missing & Exploited Children, without Apple directly snooping in users' troves of photos, which can be encrypted. (A rough code sketch of the matching flow follows the list below.)

  • If Apple's system flags enough such images in any one account, it will have human moderators review the case for possible referral to law enforcement.
  • Apple says the chance of the system incorrectly flagging a given account is less than one in one trillion per year.
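
To make the mechanics concrete, here is a minimal sketch, in Swift, of threshold-gated hash matching. It is not Apple's implementation: the real system reportedly uses a perceptual hash and cryptographic matching protocols Apple has not published, so SHA-256 stands in here (it only matches byte-identical files), and the type names, database contents, and threshold value are all hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: compare each upload's hash against a set of
// known-bad image hashes, and only surface the account for human
// review once the match count crosses a threshold.
struct HashMatcher {
    let knownHashes: Set<String>  // stand-in for the real hash database
    let reviewThreshold: Int      // Apple has not disclosed the real value

    // SHA-256 is a placeholder; Apple's system uses a perceptual hash
    // designed to survive resizing and re-encoding.
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    func matchCount(in uploads: [Data]) -> Int {
        uploads.filter { knownHashes.contains(hash(of: $0)) }.count
    }

    func needsHumanReview(_ uploads: [Data]) -> Bool {
        matchCount(in: uploads) >= reviewThreshold
    }
}

// Toy usage: one upload, no match, so no review is triggered.
let matcher = HashMatcher(knownHashes: ["<hex digest of a known image>"],
                          reviewThreshold: 30)  // made-up threshold
let uploads = ["example photo bytes".data(using: .utf8)!]
print(matcher.needsHumanReview(uploads))  // false
```

The threshold is the key privacy lever in this design: a single false match reveals nothing, because no human reviews an account until it crosses the line.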

Another feature will flag sexually explicit photos sent or received in Apple's Messages app by children on family accounts. This system uses on-device machine learning to warn users about potentially problematic content, as sketched below.
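
Here is a similarly hedged sketch of that flow: an on-device classifier scores an incoming image, and past a cutoff the photo is blurred and the child is warned before viewing. Apple has not published the model or any API for this; the function, score, and cutoff below are hypothetical illustrations.

```swift
import Foundation

// Which path an incoming Messages image takes.
enum MessageImageAction {
    case show               // displayed normally
    case warnBeforeViewing  // blurred, with a warning shown first
}

// `explicitScore` stands in for the output of Apple's on-device model,
// which is not public; the 0.9 cutoff is likewise a made-up value.
func action(forImageScore explicitScore: Double,
            isChildAccount: Bool,
            cutoff: Double = 0.9) -> MessageImageAction {
    // The feature applies only to child accounts on family plans.
    guard isChildAccount, explicitScore >= cutoff else { return .show }
    return .warnBeforeViewing
}

print(action(forImageScore: 0.97, isChildAccount: true))   // warnBeforeViewing
print(action(forImageScore: 0.97, isChildAccount: false))  // show
```

Because the scoring happens on the device, the photo never has to leave the phone for the check to run.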

Details: The features will begin rolling out for testing in the U.S. immediately and will arrive in final form as part of an update to iOS 15.

What they're saying: An Apple spokesperson at a background press briefing emphasized that the iCloud screening feature is similar to steps many cloud providers already take to comply with the law, but takes additional measures to preserve users' privacy.

The Financial Times first reported the news, along with questions from security researchers concerned that Apple's systems might become vehicles for broader surveillance.
