Deepfakes' parody loophole

Illustration: Aïda Amer/Axios
As AI keeps refining its ability to copy the voices and moving images of public figures, deepfake creators are turning to the "it's just a parody" defense.
The big picture: American media's long tradition of political humor is well-protected by the First Amendment — letting citizens inject almost any kind of fiction or fraud into the national dialogue as long as they label it comedy.
Driving the news: On Friday night, Elon Musk reposted to his 190 million followers on X a video overdubbed with a phony voice that sounds like Vice President Kamala Harris.
- The video's original poster, "MrReaganUSA," introduced it as a "Kamala Harris Campaign Ad PARODY," but Musk's repost removed that label and just called it "amazing 😂."
- The video — which uses footage from a real Harris campaign video — has Harris calling herself "the ultimate diversity hire" and describing both herself and President Biden as "deep state puppets."
- It's unclear whether the fake audio of Harris was generated with AI or some other technique.
- Musk owns X and recently endorsed former President Donald Trump's campaign.
- Users quickly pointed out that Musk's repost appeared to violate X's rules on "synthetic and manipulated media" and "deceptive identities."
- As of this writing, the post had about 130 million views and had not received a "Community Notes" annotation — X's way of warning readers about posts that may be misleading.
- When California Gov. Gavin Newsom wrote on X that Musk's post should be "illegal," Musk replied, "Parody is legal in America 🤷‍♂️."
The intrigue: When Musk acquired X, then called Twitter, in 2022, he said he'd prioritize the use of real, verifiable identities on the platform.
- He also sought to suppress users who got creative about parodying him via spoof accounts.
Our thought bubble: No owner or CEO of a major tech platform has ever endorsed a political candidate and then used their soapbox to promote that campaign using content that many view as deceptive.
Zoom out: Election-year parodies have been an online staple for decades.
- On the pre-YouTube internet of 2004, JibJab made crude videos pasting George W. Bush and John Kerry heads on cartoon bodies.
- In 2008, humorists attached more photorealistic Barack Obama and John McCain heads onto bodies in a break-dance competition video.
- Clips of the "Saturday Night Live" crew's political impersonators became viral video standards as soon as social media made that possible.
- More recently, actor/comedian Sarah Cooper struck a chord in 2020 with her lip-syncing to Trump recordings, making it look like a woman of color was delivering the president's tirades.
Between the lines: Those who rely on the parody loophole to protect their speech will usually argue that it's "obvious" their work is intended to amuse rather than to deceive — and that's frequently the case.
Yes, but: Widely accessible generative AI programs can now create lifelike audio and video of public figures saying and doing things they never actually said or did.
- As a result, parodies look and sound more realistic than ever before — and much of the public still hasn't adjusted its truth meter accordingly.
- That's widening the parody loophole into an open freeway for misinformation and fakery.
The other side: So far in the 2024 election cycle, deepfakes haven't turned into the widespread problem many experts predicted, as most social media platforms have moved quickly against outright frauds and required labeling for AI-generated material.
- People who want to believe in conspiracy theories have never been especially picky about the evidence they'll embrace, and don't really need AI-quality realism anyway.
- Much of online discourse is both performed and consumed as a game of trolling today, and sometimes lies bolstered by made-up evidence just take off.
- That's what seems to have happened with the fictional tale of J.D. Vance's furniture fetish; its apparent fabricator even supplied a page reference to Vance's memoir, and that citation, too, was fictitious.
The bottom line: The public sphere in America has always left room for political mockery, but AI is making it increasingly tough to establish a boundary between election-year humor and malicious fraud.
