Nov 9, 2019

Philosophers tackle deepfakes

Photo illustration: Eniola Odetunde. Photo via Francois G. Durand/Getty Images

Technology could erode the evidentiary value of video and audio so that we see them more like drawings or paintings — subjective takes on reality rather than factual records.

What's happening: That's one warning from a small group of philosophers who are studying a new threat to the mechanisms we use to communicate and to try to convince one another.

Why it matters: Understanding the coming changes — and how people are likely to react — could help inoculate us against their worst effects.

The big picture: We generally trust that videos and audio clips tell us something about real events, in part because of how costly and time-consuming it is to fake them — unlike, say, a sketch, a statement spoken aloud or even a photo.

  • But in a world where deepfakes are cheap and easy, videos — no longer constrained by the physics of light — will carry less information than they once did, argues Don Fallis, a philosophy professor at Northeastern University.
  • "Under those circumstances, videos and photographs would be no more evidential, no more probative, than a drawing," Fallis says.
  • "There's a power for someone to step in between and tinker with what we're seeing," argues Regina Rini, a philosophy professor at York University in Toronto.

Between the lines: There have been plenty of informed guesses about specific potential hazards of AI-powered deepfakes — a spoiled election, a tanked IPO, a derailed trial — but their broader effects on society remain hazy.

  • "Philosophical analyses can illuminate what's going on in the information environment," says Fallis. "If they don't help address it, they can at least help us be aware as we sink into chaos."

How it works: Normally, when you receive new information, you decide whether or not to believe it in part based on how much you trust the person telling you.

  • "But there are cases where evidence for something is so strong that it overrides these social effects," says Cailin O'Connor, a philosopher at UC Irvine. For decades, those cases have included video and audio evidence.
  • These recordings have been "backstops," Rini says. But we're hurtling toward a crisis that could quickly erode our ability to rely on them, leaving us leaning only on the reputation of the messenger.
  • One huge implication is that people may be less likely to avoid bad behavior if they know they can later disavow a recording of their mischief.

The big question: What comes next? When Photoshop made it easy to transform images, we could fall back on video or audio. But we may now be at the end of the line, Rini says.

  • "We're falling down one ledge at a time," she says. "It's not clear to me that there's a ledge after this one."
  • A potential new backstop could be something like 3D video — a type of recording that is still hard, expensive and time-consuming to counterfeit.

Go deeper

In a deepfake, Nixon laments a catastrophe that wasn't

In a clip from a stunning new AI-manipulated video, President Nixon delivers a somber speech he never gave in real life, appearing to eulogize American astronauts left on the moon to die.

Why it matters: The video simultaneously shows the dangerous power of deepfake technology that can put words into the mouths of powerful leaders — and its potential to expand the boundaries of art.

Nov 26, 2019

The dangers of "AI washing"

Illustration: Sarah Grillo/Axios

Zealous marketing departments, capital-hungry startup founders and overeager reporters are casting the futuristic sheen of artificial intelligence over many products that are actually driven by simple statistics — or hidden people.

Why it matters: This "AI washing" threatens to overinflate expectations for the technology, undermining public trust and potentially setting up the booming field for a backlash.

Nov 16, 2019

AI is the new co-writer

A recently released AI program that generates hyper-realistic writing has become a powerful tool for storytelling, hinting at a new genre of computer-aided creativity.

What's happening: Inventive programmers are using it to generate poetry, interactive text adventures, and even irreverent new prompts for the popular game Cards Against Humanity.

Dec 7, 2019