Photo illustration: Lazaro Gamio/Axios
A group whose members include Adobe, Twitter and the New York Times on Monday offered a plan for restoring trust in photos and video amid a rising tide of digital fakery.
Why it matters: Deepfakes — images manipulated or generated by AI in a deceptive way — undermine trust both by tricking people into thinking phony images or videos are real and by making them doubt the veracity of real imagery.
Driving the news: The Content Authenticity Initiative on Monday released a white paper outlining an open standard for a photo and video authentication system that could be built into hardware, such as cameras and smartphones, as well as software such as Photoshop.
- The system would record a digital signature when a photo or video is first taken, and then again each time it's edited in any way. Users would be able to see that record of the imagery's origin and any changes that have been made to it.
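The mechanics described above resemble a tamper-evident provenance chain: each capture or edit appends a signed entry binding an action to a hash of the content. The sketch below is a loose illustration of that idea, not the CAI's actual design; the HMAC secret, field names, and chain layout are all stand-ins (a real device would use an asymmetric key pair in secure hardware).

```python
import hashlib
import hmac
import json

# Stand-in symmetric key; a CAI-style device would hold an asymmetric signing key.
SECRET = b"device-key"

def sign(payload: bytes) -> str:
    # HMAC-SHA256 used only as a stdlib stand-in for a cryptographic signature.
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def record_step(history: list, action: str, content: bytes) -> list:
    # Each capture or edit appends an entry binding the action to the
    # content's hash and chaining it to the previous entry's signature.
    entry = {
        "action": action,
        "content_hash": hashlib.sha256(content).hexdigest(),
        "prev": history[-1]["signature"] if history else "",
    }
    entry["signature"] = sign(json.dumps(entry, sort_keys=True).encode())
    return history + [entry]

def verify(history: list) -> bool:
    # A viewer recomputes every signature to confirm the record is intact.
    prev_sig = ""
    for entry in history:
        body = {k: entry[k] for k in ("action", "content_hash", "prev")}
        if entry["prev"] != prev_sig:
            return False
        if entry["signature"] != sign(json.dumps(body, sort_keys=True).encode()):
            return False
        prev_sig = entry["signature"]
    return True

photo = b"raw sensor bytes"
history = record_step([], "captured", photo)
history = record_step(history, "cropped", photo + b" cropped")
print(verify(history))  # True: the edit history checks out
```

Because each entry chains to the previous signature, reordering or silently altering any step breaks verification, which is what lets users trust the displayed record of an image's origin and edits.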
The idea is a flexible standard aimed at protecting privacy and safety.
- Photojournalists, for instance, could tag themselves as the creator of a photo and geotag it to a specific location.
- The system could also simply authenticate that a photo was taken with a standard-compliant device, without identifying who took it or where.
Context: The white paper represents the first public fruits of nearly two years of labor by the CAI.
- Several firms already offer digital authentication technology, including charter CAI member Truepic, but an open standard offers the hope of a universal system.
What they're saying: The group views authenticating images from their creation to the time they're seen online as a more promising approach than trying to detect deepfakes once they're already in circulation.
- Even the best entry in Facebook's Deepfake Detection Challenge was only able to detect them 65% of the time, per results the company announced in June.
- "That's only slightly better than a coin toss," Sherif Hanna, Truepic's vice president of R&D and a co-author of the paper, told Axios. "Instead of all of us trying to get to where we can detect what's fake, we should prove what's real."
What's next: Members are working on prototypes for implementing the standard in software and hardware, though Hanna declined to offer a timeline for when tech built on the standard will come to market.