Illustration: Rebecca Zisser/Axios

In the first signs of a mounting threat, criminals have begun using deepfakes, starting with AI-generated audio, to impersonate CEOs and steal millions from companies, which are largely unprepared to combat them.

Why it matters: Nightmare scenarios abound. As deepfakes grow more sophisticated, a convincing forgery could send a company's stock plummeting (or soaring), extract money from it, or ruin its reputation in a viral instant.

  • Imagine a convincing fake video or audio clip of Elon Musk, say, disclosing a massive defect the day before a big Tesla launch — the company's share price would crumple.

What's happening: For all the talk about fake videos, it's deepfake audio that has emerged as the first real threat to the private sector.

  • Symantec, a major cybersecurity company, says it has seen three successful audio attacks on private companies. In each, a company's "CEO" called a senior financial officer to request an urgent money transfer.
  • Scammers were mimicking the CEOs' voices with an AI program that had been trained on hours of their speech — culled from earnings calls, YouTube videos, TED talks and the like.
  • Millions of dollars were stolen from each company, whose names were not revealed. The attacks were first reported by the BBC.

And in March, a Twitter account falsely claiming to belong to a Bloomberg journalist reportedly tried to coax personal information from Tesla short-sellers. Amateur sleuths said the account's profile photo had the hallmarks of an AI-generated image.

Big picture: This threat is just beginning to emerge. Video and audio deepfakes are improving at a frightening pace and are increasingly easy to make.

  • There's been an uptick in sophisticated audio attacks over the past year, says Vijay Balasubramaniyan, CEO of Pindrop, a company that protects call centers from scammers.
  • But businesses aren't ready, experts tell Axios. "I don’t think corporate infrastructure is prepared for a world where you can’t trust the voice or video of your colleague anymore," says Henry Ajder of Deeptrace, a deepfakes-detection startup.

Even if companies were clamoring for defenses, few tools exist to keep harmful deepfakes at bay, says Symantec's Saurabh Shintre. The challenge of automatically spotting a deepfake is almost insurmountable, and there are hurdles still ahead for a promising alternative: creating a digital breadcrumb trail for unaltered media.

  • Pindrop monitors for audio attacks like altered voices on customer service lines.
  • Symantec and ZeroFOX, another cybersecurity company, say they are developing technology to detect audio fakes.
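
The story does not spell out how the "digital breadcrumb trail" alternative would work, but the general idea is to fingerprint media at the moment of capture so recipients can re-verify it later. The sketch below is illustrative only, not any vendor's actual system: it uses a SHA-256 hash plus an HMAC as a stand-in for the public-key signatures a real provenance scheme would rely on, and the key and device names are hypothetical.

```python
# A minimal sketch, assuming a trusted recording device holds a secret key:
# fingerprint a clip when it is captured, then re-check the fingerprint later.
import hashlib
import hmac
import json
import time

CAPTURE_KEY = b"secret-held-by-the-recording-device"  # hypothetical key


def fingerprint_at_capture(media_bytes: bytes, device_id: str) -> dict:
    """Create a provenance record when the media is first recorded."""
    record = {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "device_id": device_id,
        "captured_at": int(time.time()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(CAPTURE_KEY, payload, hashlib.sha256).hexdigest()
    return record


def verify(media_bytes: bytes, record: dict) -> bool:
    """Check that the record is untampered and the media still matches it."""
    claimed = {k: v for k, v in record.items() if k != "tag"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(CAPTURE_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, record["tag"])
        and hashlib.sha256(media_bytes).hexdigest() == record["sha256"]
    )


if __name__ == "__main__":
    clip = b"...raw audio bytes from an earnings call..."
    rec = fingerprint_at_capture(clip, device_id="boardroom-mic-01")
    print(verify(clip, rec))                     # True: clip is unaltered
    print(verify(clip + b"spliced words", rec))  # False: clip was changed
```

In a deployed scheme the capture device would sign with a private key and publish the public key (or anchor the record with a third party), so verification would not depend on sharing a secret; the hashing-and-verification flow, however, would look much like the above.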

What's out there already isn't cheap.

  • New Knowledge, a firm that defends companies from disinformation, says its services can run from $50,000 to "a couple million" a year.
  • Just monitoring the internet for potential fakes comes at "a substantial cost," says Matt Price of ZeroFOX. "And that's not even talking about the detection piece, which will probably be fairly expensive."

As a result, businesses are largely defenseless for now, leaving an opening for a well-timed deepfake to drop like a bomb.

  • "If you're waiting for it to happen, you're already too late," New Knowledge COO Ryan Fox tells Axios.

Go deeper: Companies take the battle to online mobs
