Apple delays plan to scan iPhones for images of child sex abuse
Apple announced on Friday that it would delay its plan to scan iPhone users' photo libraries for images of child sex abuse.
Why it matters: The delay is an acknowledgment of the privacy concerns brought up when the program was first announced last month.
Background: Apple announced the program to scan for child sexual abuse material (CSAM) on August 5th.
- The system would use "cryptographic hashes" to detect illegal images, and if enough matches were found for a user, the case would be turned over to law enforcement, per Axios' Scott Rosenberg.
- Apple said that the error rate would be "one in one trillion," per a Q&A Apple released earlier this month.
Driving the news: While the program was applauded by child safety organizations, many online privacy organizations were concerned that the technology could be used in ways that violate users' privacy, per Axios' Scott Rosenberg.
What they're saying: "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in a statement this morning.