Illustration: Rebecca Zisser/Axios
As deepfakes grow more convincing and public awareness of them spreads, realistic AI-generated videos, images and audio threaten to disrupt crucial evidence at the center of the legal system.
Why it matters: Leaning on key videos in a court case — like a smartphone recording of a police shooting, for example — could become more difficult if jurors are more suspicious of them by default, or if lawyers call them into question by raising the possibility that they are deepfakes.
What's happening: Elected officials, experts and the press have been warning about the potential fallout of deepfakes for businesses and elections. But apart from a few high-profile examples, the tech so far has been used almost exclusively for porn, according to a landmark new report from Deeptrace Labs.
- Plus, when President Trump and his supporters throw around accusations of "fake news" to discredit information that they don't like, it can deepen the atmosphere of distrust.
- All this could lead jurors or attorneys to falsely assume that a real video is faked, says Riana Pfefferkorn, associate director of surveillance and cybersecurity at Stanford's Center for Internet and Society.
"This is dangerous in the courtroom context because the ultimate goal of the courts is to seek out truth," says Pfefferkorn, who recently wrote an article about deepfakes in the courtroom for the Washington State Bar magazine.
- "My fear is that the cultural worry could be weaponized to discredit [videos] and lead jurors to discount evidence that is authentic," she tells Axios.
- If a video's authenticity comes into question, the burden shifts to the side that introduced it to prove it's not fake — a process that can be expensive and time-consuming.
Already, people accused of possessing child porn often claim that it's computer-generated, says Hany Farid, a digital forensics expert at UC Berkeley. "I expect that in this and other realms, the rise of AI-synthesized content will increase the likelihood and efficacy of those claiming that real content is fake."