AI is "awakening" surveillance cameras

Illustration: Sarah Grillo/Axios

There are millions of surveillance cameras in the U.S., but not nearly enough eyes to watch them all. When you pass one on the street, you can rightly expect your actions to go unnoticed in the moment; footage is instead archived for review if something goes wrong.

What's happening: Now, AI software can flag behavior it deems suspicious in real-time surveillance feeds, or pinpoint minute events in past footage — as if each feed were being watched unblinkingly by its own hyper-attentive security guard. The new technology, if it spreads in the U.S., could put an American twist on Orwellian surveillance systems abroad.

Big picture: In a new report today, ACLU surveillance expert Jay Stanley describes a coming mass awakening of millions of cameras, powered by anodyne-sounding "video analytics."

Collecting data has become dirt cheap, but attention has remained a scarce, expensive resource — especially for analyzing video, Stanley says. That's what is changing.

  • "The danger is that video analytics would be used to make sure that if you do anything, it will never be missed," Stanley tells Axios. That would be a significant departure from today's largely unmonitored cameras.
  • "We're right on the cusp of this technology really becoming real."

Quick take: This new software democratizes high-powered surveillance, once the purview of wealthy governments and organizations. Companies are effectively selling "surveillance in a box" for far less than the cost of hiring human video analysts.

Police, retailers, railroads and even carmakers are installing various shades of this software. And we've written about its use in schools.

  • The full extent of its deployment, or even how well the technology lives up to its marketing promises, isn't entirely clear.
  • What's certain is that there's demand for it. Analysts predict that the video analytics market, which was worth $3.23 billion in 2018, will grow to $8.55 billion in 2023.

How it works: The software is marketed as being able to:

  • Detect specific events like people hugging, smoking, fighting or drinking, or instead automatically detect "anomalies" — deviations from the usual goings-on in a certain feed, like a car driving the wrong way or a person loitering at an odd hour.
  • Search historical footage by clothing or even skin color and "summarize" countless hours of footage into a single image or a short clip.
  • Determine a person's emotional state or even make assumptions about their personality, based only on their face and body movements.
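To make the "anomaly" flagging described above concrete, here is a deliberately simplified sketch of the general idea: learn what a feed's normal background looks like, then raise a flag when an unusual amount of the frame changes at once. It relies on OpenCV's off-the-shelf background subtraction; the file path and motion threshold are placeholder assumptions, and the commercial products described in the report use far more sophisticated, proprietary models than this toy loop.

```python
# Illustrative sketch only: a toy "video analytics" loop that flags frames with
# an unusual amount of motion, using OpenCV background subtraction. Real
# commercial systems are far more sophisticated than this.
import cv2

VIDEO_SOURCE = "camera_feed.mp4"   # hypothetical path to a recorded feed
MOTION_THRESHOLD = 0.02            # assumed fraction of changed pixels that counts as "unusual"

capture = cv2.VideoCapture(VIDEO_SOURCE)
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

frame_index = 0
while True:
    ok, frame = capture.read()
    if not ok:
        break
    # Foreground mask: pixels that differ from the learned background model.
    mask = subtractor.apply(frame)
    changed_fraction = (mask > 0).mean()
    if changed_fraction > MOTION_THRESHOLD:
        # A real deployment would generate an alert or a searchable index entry
        # here instead of printing to the console.
        print(f"frame {frame_index}: unusual activity "
              f"({changed_fraction:.1%} of pixels changed)")
    frame_index += 1

capture.release()
```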

The danger: Losing anonymity in public can change the way people behave, experts say, much like China's omnipresent surveillance can cause residents to constantly look over their shoulders.

  • "People will start to wonder if they'll be cataloged or monitored if they're at a protest or political event, and potentially be subject to retribution," says Jake Laperruque, a privacy expert at the Project on Government Oversight.
  • And in the case of emotion detection, significant decisions — like whether or not you get a job — can hang on the software's interpretation of your facial expressions, says Meredith Whittaker, co-founder of NYU's AI Now Institute.