A video selectively edited to frame one of Joe Biden's stump speeches as racist was shared by GOP strategists and a former speaker of the Missouri House, the New York Times reports, citing data from misinformation tracker VineSight.
Why it matters: Sharing misleading information via social media to incite anger toward presidential candidates is easy — and it works.
In a clip from a stunning new AI-manipulated video, President Nixon delivers a somber speech he never gave in real life, appearing to eulogize American astronauts left on the moon to die.
Why it matters: The video simultaneously shows the dangerous power of deepfake technology that can put words into the mouths of powerful leaders — and its potential to expand the boundaries of art.
Ad targeting is how Facebook, Google and other online giants won the internet. It's also key to understanding why these companies are being held responsible for warping elections and undermining democracy.
The big picture: Critics and tech companies are increasingly considering whether limiting targeting of political ads might be one way out of the misinformation maze.
Technology could erode the evidentiary value of video and audio so that we see them more like drawings or paintings — subjective takes on reality rather than factual records.
What's happening: That's one warning from a small group of philosophers who are studying a new threat to the mechanisms we use to communicate and to try to convince one another.
Hoping to stem an anticipated rise in faked video, Adobe, Twitter and the New York Times are proposing a new industry effort designed to make clear who created a photo or video and what changes have been made to it.
Why it matters: With editing tools and artificial intelligence rapidly improving, it will soon be possible to make convincing videos showing anyone saying anything and photos of things that never happened.
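The core idea behind such a provenance effort can be sketched in a few lines: bind a cryptographic hash of the media and a log of edits into a record, then sign the record so any tampering is detectable. This is a hypothetical illustration, not the actual Adobe/Twitter/NYT specification; the field names, the demo key, and the HMAC signature (standing in for the public-key signatures a real system would use) are all assumptions.

```python
import hashlib
import hmac
import json

# Stand-in for a real private signing key held by the capture device or editor.
SECRET_KEY = b"demo-signing-key"

def make_provenance_record(media_bytes, creator, edits):
    """Build a signed record binding creator and edit history to the media hash."""
    record = {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "creator": creator,
        "edits": edits,  # e.g. ["trimmed to 30s", "color-corrected"]
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_provenance(media_bytes, record):
    """Check both the signature and that the media still matches its hash."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, record["signature"])
            and unsigned["sha256"] == hashlib.sha256(media_bytes).hexdigest())

video = b"raw video bytes"
rec = make_provenance_record(video, "Jane Reporter", ["trimmed to 30s"])
assert verify_provenance(video, rec)           # untouched media verifies
assert not verify_provenance(b"edited!", rec)  # altered media fails
```

The point of the sketch is the binding: a viewer can confirm who signed the file and what edits were declared, and any undeclared change to the pixels or the metadata invalidates the signature.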
Over the past week, Facebook and Twitter have codified a dual-class system for free speech: one set of rules for politicians or "world leaders," another for the rest of us.
Why it matters: Social media platforms are privately owned spaces that have absorbed a huge chunk of our public sphere, and the rules they're now hashing out will shape the information climate around elections for years to come.
The threat of deepfakes to elections, businesses and individuals is the result of a breakdown in the way information spreads online — a long-brewing mess that involves a decades-old law and tech companies that profit from viral lies and forgeries.
Why it matters: The problem likely will not end with better automated deepfake detection, or a high-tech method for proving where a photo or video was taken. Instead, it might require far-reaching changes to the way social media sites police themselves.
Disinformation campaigns used to consist of trolls and bots orchestrated and manipulated to produce a desired result. Increasingly, though, these campaigns are able to find willing human participants to amplify their messages and even generate new ones on their own.
The big picture: It's as if they're switching from employees to volunteers — and from participants who are in on the game to those who actually believe the disinformational payload they're delivering.
As deepfakes become more convincing and public awareness of them grows, realistic AI-generated videos, images and audio threaten to undermine crucial evidence at the center of the legal system.
Why it matters: Leaning on key videos in a court case — like a smartphone recording of a police shooting, for example — could become more difficult if jurors are more suspicious of them by default, or if lawyers call them into question by raising the possibility that they are deepfakes.
A growing industry of commercial disinformation services based in countries like Russia and the Philippines has the language skills, local contacts and cultural background to influence an English-language conversation half a world away.
Driving the news: A new report from the security firm Recorded Future documents two campaigns that it paid Russian-speaking, dark-web propagandists-for-hire to run.