Jul 19, 2019

Axios Future

By Bryan Walsh


Today's Smart Brevity count: 934 words, a 4-minute read.

What else should we write about this summer? Hit reply to this email or message me at steve@axios.com, Kaveh Waddell at kaveh@axios.com and Erica Pandey at erica@axios.com.

Okay, let's start with ...

1 big thing: Audio deepfakes come for companies

Illustration: Sarah Grillo/Axios

A call from the CEO: The quarter's ending, and she needs a big transfer, stat, for a last-minute acquisition. The line's a bit fuzzy — she says she's driving — but before it cuts out, she gets the comptroller to wire millions to an external account, Kaveh writes.

The catch: That wasn't the CEO on the line. It was a deepfake — an AI-generated voice clone, trained on hours of her speech, with made-up background noise to mask its shortcomings.

What's happening: This futuristic scam has already hit at least three companies, according to Symantec, a prominent cybersecurity company. And experts worry it's a hint of a new reality for businesses.

  • Deepfakes — in audio, video, or image form — can help scammers pull off new-age heists, or, in an as-yet-unrealized nightmare scenario, tank a stock before an IPO or product launch.
  • Imagine: a convincing fake video or audio clip of Elon Musk disclosing a massive defect the day before a big Tesla event. The company's share price would crumple.

The big picture: Video and audio deepfakes are improving at a frightening pace, and they're increasingly easy to make.

  • There's been an uptick in sophisticated audio attacks over the past year, says Vijay Balasubramaniyan, CEO of Pindrop, a company that protects call centers from scammers.
  • But businesses aren't ready, experts tell Axios. "I don’t think corporate infrastructure is prepared for a world where you can’t trust the voice or video of your colleague anymore," Henry Ajder of Deeptrace, a deepfakes-detection startup, tells Axios' Jennifer Kingson.

Even if companies were clamoring for defenses, few tools exist to keep harmful deepfakes at bay, says Symantec's Saurabh Shintre. The challenge of automatically spotting a deepfake is almost insurmountable, and there are hurdles ahead of a promising alternative: creating a digital breadcrumb trail for unaltered media.

  • Pindrop monitors for audio attacks like altered voices on customer service lines.
  • Symantec and ZeroFOX, another cybersecurity company, say they are developing technology to detect audio fakes.

What's out there already isn't cheap.

  • New Knowledge, a firm that defends companies from disinformation, says its services can run from $50,000 to "a couple million" a year.
  • Just monitoring the internet for potential fakes comes at "a substantial cost," says Matt Price of ZeroFOX. "And that's not even talking about the detection piece, which will probably be fairly expensive."

As a result, businesses are largely defenseless for now, leaving an opening for a well-timed deepfake to drop like a bomb.

  • "If you're waiting for it to happen, you're already too late," New Knowledge COO Ryan Fox tells Axios.

Go deeper: Companies take the battle to online mobs

2. Elizabeth Warren vs. private equity

Photo: Justin Sullivan/Getty

As we've reported, private equity takeovers have been something of a death knell for struggling local newspapers and retailers.

  • 882 U.S. newspapers are owned by 7 investment groups, reports the FT. In many cases, new owners slash the papers' budgets and staff.
  • 15% of American retailers that have been acquired by private equity firms since 2002 have gone belly up, per Retail Dive.

Now these takeovers are catching 2020 contender Sen. Elizabeth Warren's eye, Axios' Dan Primack reports.

What's happening: Warren is proposing a bill that would kill much of the leveraged buyout industry. Not just make meaningful reform or increase regulatory oversight, but treat it like Pennywise treats the schoolchildren of Derry.

The big picture: Her bill won't become law, despite having several Democratic co-sponsors (including Sen. Kirsten Gillibrand). But Warren has a megaphone right now, and has shown a deft ability to translate complex financial matters for debate and town hall audiences.

Her intention may be to put more of the equity back in private equity, thus forcing firms to make more responsible investment decisions, but the increased risk profiles and competitive disadvantages would stop many firms in their tracks — and likely end the turnaround industry.

  • There could be industry workarounds for Warren to contend with, such as heavily structured joint ventures with lenders, or PE firms eschewing blind-pool funds in favor of special-purpose vehicles.

Private equity is obviously crying foul. Sources tell Axios that Warren's rationale — detailed in a Medium post — is cherry-picked, focused on a pair of industries that were self-immolating regardless of private equity's involvement.

3. What you may have missed

Photo: Getty

The week took you for a ride? Don't worry — catch up on Future's top stories:

1. The future began 40 years ago: The automation trap

2. Reskilling-in-a-box: Another moneymaker for Amazon

3. Becoming a robot: Elon Musk's plan to merge with AI

4. The bacchanalia: The relentless jostling of the financial system

4. Worthy of your time

Illustration: Rebecca Zisser/Axios

The first-ever image of quantum entanglement (Dan Robitzski — Futurism)

Tesla is mirroring Detroit's worst habits (Joann Muller — Axios)

Two California quakes triggered 16,000 more (Derek Watkins — NYT)

Those Boston Dynamics bots are leaving the lab (James Vincent — The Verge)

Drilling into the DEA's pain pill database (WaPo)

5. 1 comeback thing: Google Glass

Google Glass, in a Belfast mall. Photo: NurPhoto/Getty

Google Glass, the Silicon Valley giant's tech-infused eyewear, never got big, in part because of steep privacy concerns around the built-in camera, writes Erica.

  • But the smart glasses might make a comeback, reports the New York Times' Cade Metz.
  • The glasses are remarkably effective at helping children with autism navigate their worlds, he reports.

Esaïe Prickett, a 12-year-old with autism interviewed by Metz, was part of a Stanford clinical trial that sought to determine whether Google Glass could teach kids like him to make eye contact and recognize different human emotions.

  • Esaïe wore the glasses and "his family made faces — happy, sad, surprised, angry, bored — and he tried to identify each emotion. In an instant, the glasses told him whether he was right or wrong, flashing tiny digital icons that only he could see," writes Metz. The glasses would only work when he made direct eye contact with those around him.

Yes, but: While Google Glass may be a useful teaching tool for kids with autism, the privacy questions around the tech still stand.

Bryan Walsh

Have a great weekend!