Experts are advising 2020 presidential campaigns to form rapid-response plans to combat an urgent threat from deepfakes, Kaveh reports.
More than a year after prominent deepfake videos first attracted wide attention — and just weeks after a simple alteration of a Nancy Pelosi speech went viral — an Axios survey shows that the 2020 campaigns are largely unprepared for the potential crisis.
The big picture: As we reported this morning, do-it-yourself deepfakes are within reach of anyone with some tech savvy and a decent computer. Though the most convincing videos take extra effort, a basic alteration — like the slowdown that made Pelosi seem intoxicated — will do the trick, too.
These can cause all manner of mayhem, both overt and subtle.
- A video altered to show a candidate dropping a racial slur could dominate news cycles or kill a campaign entirely.
- And mushrooming fakery could give candidates cover to call baloney on a video or audio clip that's actually real.
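To see why a Pelosi-style "slowdown" requires no AI at all, consider what it actually does: it remaps each frame's timestamp so the same footage plays over a longer span, slurring speech in the process. The sketch below is purely illustrative — the function name and numbers are invented for this example, not drawn from any real editing tool.

```python
# Illustrative sketch: a "cheapfake" slowdown is just timestamp arithmetic.
# Playing frames at 75% speed stretches the clip and, if audio is
# resampled the same way, lowers the speaker's pitch and slurs words.

def retime(timestamps, speed):
    """Map original frame timestamps (seconds) to slowed-down ones."""
    return [t / speed for t in timestamps]

original = [0.0, 1.0, 2.0, 3.0]   # four frames spanning 3 seconds
slowed = retime(original, 0.75)   # play back at 75% of real speed
# The same frames now span 4 seconds; no frame content was altered.
```

That the manipulation touches only timing, never pixels, is why such edits evade forensic tools built to spot synthesized imagery.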
"Disinformation is one of the largest threats to the Western liberal order," says Lisa Kaplan of the Alethea Group, a political consulting outfit focused on misinformation. "All campaigns need to realize that they're likely to be a target at some point."
What's happening: Axios contacted all 24 Democratic presidential campaigns, plus the campaigns of President Trump and Republican challenger Bill Weld. None could cite any specific steps they had taken to ward off deepfakes.
But experts say the threat of provocateurs altering videos is imminent, and they urge campaigns to mount three defenses:
- Establish a video record of everything candidates say in public, so that a manipulated clip can be outed as fake.
- Campaigns often film their own big events, but don't systematically record and store every public encounter involving their candidates.
- A next step: Several startups are developing technology that authenticates the master cut as a video is being filmed, creating a permanent record of the original that can be stored on a blockchain.
- Prepare a robust response plan in case a corrupted video starts to go viral. Kaplan, who previously helped Sen. Angus King's campaign game out scenarios for his 2018 re-election, said that not every fake calls for hard pushback: Sometimes the best reaction is none at all.
- Cultivate close contacts with social media companies, which hold the keys to the algorithms that help fake videos go viral, or else stop them in their tracks.
- Ultimately, these firms' policies will dictate how far fake videos go. It took Facebook a day and a half to solicit fact-checks and reduce the spread of the Pelosi video, and critics continue to complain that it never took the video down entirely.
- "It was just the wrong call," said Hany Farid, a digital forensics expert at UC Berkeley, of Facebook's decision to leave the video on the site.
Heading off a fake video once it's started circulating can keep it from causing further damage, but it cannot reverse the harm already done, experts warn. "Intrinsic to the nature of the threat is that you can't unsee the video," says Christopher Porter of the cybersecurity company FireEye.
What's next: House Intel Chairman Adam Schiff today announced that a committee hearing next week will tackle the threat deepfakes pose to the 2020 elections.