
Who do you spend time with? How do you move around?

  • These and a host of other very personal behaviors are constantly being logged by cameras and cellphones, and analyzed in real time.

This Deep Dive — led by Axios' emerging technologies reporter Kaveh Waddell, managing editor Alison Snyder and producer Jessie Li — goes inside the automated future of surveillance.

  • Smart Brevity count: 1,668 words, a 6-minute read.
1 big thing: The end of anonymity

Illustration: Aïda Amer/Axios

More than 10,000 video cameras watch over public schools in Broward County, Florida, where the Parkland shooting took place last year, Kaveh writes.

  • Most are normal cameras, recording footage to be watched later. But about 150 feeds, from the "highest risk" locations among roughly 50 high schools and alternative schools, are instead scrutinized by unblinking artificial intelligence.
  • The AI system, which cost hundreds of thousands of dollars, can follow a student's movements over time, even if they appear on different cameras.
  • And it can detect "unusual" behavior, like someone running in a location where most walk, or appearing someplace that's usually empty at that time (a minimal sketch of this logic follows below).
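How it might work: Below is a minimal Python sketch of the statistical logic behind such "unusual behavior" flags: learn a per-zone baseline, then flag what deviates. The zone data, thresholds and numbers are illustrative assumptions, not details of the Broward County system or any vendor's product.

```python
# A toy baseline-and-deviation check. All data here is made up for illustration.
from statistics import mean, stdev

# Historical walking speeds (m/s) observed in one camera zone.
baseline_speeds = [1.1, 1.3, 1.2, 1.0, 1.4, 1.2, 1.1, 1.3]
# Hours of the day when this zone historically has foot traffic.
baseline_hours = {8, 9, 10, 11, 12, 13, 14, 15}

def is_anomalous(speed: float, hour: int, k: float = 3.0) -> bool:
    """Flag movement far faster than the zone's norm, or presence
    at an hour when the zone is usually empty."""
    mu, sigma = mean(baseline_speeds), stdev(baseline_speeds)
    too_fast = speed > mu + k * sigma       # running where most walk
    odd_hour = hour not in baseline_hours   # someplace usually empty now
    return too_fast or odd_hour

print(is_anomalous(speed=4.5, hour=10))  # True: running
print(is_anomalous(speed=1.2, hour=23))  # True: unusual hour
print(is_anomalous(speed=1.2, hour=10))  # False: ordinary behavior
```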

What's happening: Until now, the vast majority of information collected about us has remained untouched — there was just too much to make sense of it all. But AI allows data that might once have gone unnoticed to now be detected, analyzed and logged in real time.

It's already started supercharging surveillance at work, in schools and in cities.

The big picture: Humans have monitored each other for as long as we've lived in communities, punishing free riders and troublemakers.

  • But now, cheap, powerful machines are taking the place of human watchers, disrupting a long-held social contract.
  • Unlike in China, where high-tech surveillance is a tool of fear and control, systems in the West are not centralized for now, curbing the scope of data gathering.
  • And tech companies like Facebook and Google have perfected online versions of automated surveillance for profit, in the form of products we can no longer live without.

Details: Software can identify and track faces, skin color, clothing, tattoos, walking gait and various other physical attributes and behaviors. But it's been plagued with bias and inaccuracy problems that primarily harm people of color.

  • From facial expressions and body movements, AI can extrapolate emotions like happiness and anger — a process built on shaky scientific evidence.

The impact: This quiet shift from passive watching to active surveillance is chipping away at our ability to remain anonymous in physical and virtual spaces.

  • Blending into the crowd is no longer an option if every face in that crowd is captured, compared against a driver's license photo and logged (a sketch of that matching step follows this list).
  • Constant AI surveillance threatens to erode the all-important presumption of innocence, says Clare Garvie, a privacy expert at Georgetown Law.
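How it might work: Here is a minimal Python sketch of one-to-many face matching, in which a probe face's embedding is compared against a gallery (say, license photos) and the closest entry above a threshold is returned. The embeddings, record IDs and threshold are made up for illustration; real systems derive embeddings from neural networks trained on millions of faces.

```python
# Toy one-to-many face identification. Vectors and IDs are invented; a real
# system would compute embeddings with a face-recognition model.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical gallery: record ID -> embedding from an enrollment photo.
gallery = {
    "license_00117": np.array([0.9, 0.1, 0.3]),
    "license_04522": np.array([0.2, 0.8, 0.5]),
}

def identify(probe: np.ndarray, threshold: float = 0.95) -> str | None:
    """Return the best gallery match above the threshold, else None."""
    best_id, best_score = None, threshold
    for record_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = record_id, score
    return best_id

print(identify(np.array([0.88, 0.12, 0.31])))  # "license_00117"
print(identify(np.array([0.0, 0.0, 1.0])))     # None: no confident match
```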
2. The case for surveillance

Illustration: Aïda Amer/Axios

Boosters of surveillance technology argue that it can make cities safer, improve traffic, speed up airport and stadium lines, make workers more productive and safeguard valuable company property.

The big picture: People are willing to be watched in certain cases. Most trust police to use facial recognition responsibly, according to a new Pew Research Center survey, but a majority don't trust tech companies or advertisers to do the same.

Details: Police say facial recognition accelerates investigations, stripping away some human biases and shortcomings.

  • NYPD Commissioner James O'Neill wrote in a New York Times op-ed in June that facial recognition matches led to nearly 1,000 arrests in 2018.
  • The volume of important digital evidence has exploded so fast that it's "quickly outpacing our ability to deal with it," says Jim Burch, president of the National Police Foundation.
  • But, but, but: Historically, surveillance hasn't clearly prevented or deterred terrorism and crime, and there's no good data yet on emerging methods.

Companies, too, monitor employees' computers and phones to make sure they're not about to spill the beans to competitors or the press.

  • Employers also claim that checking up on workers lets them "optimize productivity" and deliver projects more quickly.
  • For instance, Upwork, a company that helps clients find freelancers for code and design jobs, uses screen recording technology to "provide proof of work."
3. Biased surveillance

Illustration: Aïda Amer/Axios

Advanced surveillance technology is being deployed despite flaws that risk perpetuating racial biases in the criminal justice system, Orion Rummler writes.

The big picture: Even with recent improvements in the tech, people of color are more likely to be misidentified by facial recognition software — an error that can have life-changing results.

  • And predictive systems can reinforce over-policing of some neighborhoods.

What's new: People of color — especially black Americans — are more likely to distrust facial recognition technology used for policing, according to the Pew poll.

  • Pew found that 61% of white Americans trust police to use the technology responsibly, versus 56% of Hispanic Americans and 43% of black Americans.
  • Several surveys have shown that people of color are concerned about facial recognition. That's because over-policing makes them the most likely to come into contact with it, says Garvie.

Details: Accuracy problems arise when a facial recognition system is trained on a dataset of mostly white and male faces. And they're amplified by mugshot databases where people of color are disproportionately represented, making for more possible mismatches.
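By the numbers: A back-of-the-envelope illustration in Python (our own toy arithmetic with an assumed error rate, not a figure from any study) of why a bigger pool of candidates means more chances for a spurious match: if each comparison has a small false-match rate f, the probability of at least one false match across n gallery entries is 1 - (1 - f)^n.

```python
# Assumed per-comparison false-match rate; real rates vary by system and photo.
f = 1e-4

for n in (1_000, 10_000, 100_000):
    p = 1 - (1 - f) ** n
    print(f"gallery of {n:>7,}: P(at least one false match) = {p:.4f}")
# Prints roughly 0.0952, 0.6321 and 1.0000: the more entries a probe is
# compared against, the more likely a spurious "match" becomes.
```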

What's happening: Citizen-driven surveillance is reinforcing biases in other ways. Security-oriented apps and devices that allow people to alert neighbors to sketchy activity can turn prejudices into police action.

  • The majority of people described as "suspicious" in over 100 user-submitted posts to the Neighbors by Ring app were people of color, a Motherboard analysis found.
  • At least 405 law enforcement agencies use Neighbors.

Go deeper: Uncovering secret government AI

4. What your office knows about you

Illustration: Aïda Amer/Axios

You may be under the closest watch when you're on the clock.

What's happening: New technologies have made it easy and cheap to surveil workers, checking for everything from intellectual property theft to good old Facebook slacking. And the list of behaviors being watched is growing.

Why it matters: Employers amassing data on their workers' movements, actions, habits and even emotions can wield sweeping power over them.

  • And employees under surveillance find it hard to refuse — if they're told about it at all.

Since private detectives were first hired in the 1850s to tail employees and report back, workers have had little privacy on the job.

  • IT departments regularly bug office workers' computers. Employers can install software to see every keystroke and print job, and even record every computer's screen at all times.
  • Email monitoring that once flagged predetermined keywords can now scan every message for emotional cues, giving bosses a heads-up if someone's likely to quit or seems to be considering corporate sabotage (see the sketch after this list).
  • Cameras and sensors can monitor how long each individual worker spends at their desk.
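How it might work: A minimal Python sketch of that progression, from fixed keyword flags to a crude "emotional cue" score. The word lists and scoring are illustrative assumptions; commercial products use trained language models rather than word counting.

```python
# Illustrative word lists only; real tools use trained models, not sets.
FLAGGED_KEYWORDS = {"confidential", "reporter", "leak"}
NEGATIVE_WORDS = {"quit", "unfair", "done", "hate"}

def scan_message(text: str) -> dict:
    """Return an old-style keyword flag and a naive negativity score."""
    words = set(text.lower().split())
    return {
        "keyword_hit": bool(words & FLAGGED_KEYWORDS),
        "negative_score": len(words & NEGATIVE_WORDS) / max(len(words), 1),
    }

print(scan_message("I am done with this unfair process and I might quit"))
# keyword_hit: False; negative_score: 0.3 (three negative cues, ten words)
```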

For those who don't work in an office, surveillance can be just as dogged.

  • GPS trackers log drivers' movements and even their driving style, and employers can punish them for deviating from set routes (a sketch of such a check follows this list).
  • Gig workers like delivery people, cleaners, and other app-summoned helpers are watched and rated not by bosses but by their customers — and their incomes hinge on the stars they receive.
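How it might work: A minimal Python sketch of route-deviation flagging, comparing each GPS fix against the planned route's waypoints and alerting when a driver strays too far from all of them. The coordinates, route and 500-meter threshold are illustrative assumptions.

```python
# Toy route check; waypoints and threshold are invented for illustration.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

planned_route = [(40.7128, -74.0060), (40.7306, -73.9866), (40.7484, -73.9857)]

def off_route(fix, threshold_m=500):
    """True if the GPS fix is farther than threshold_m from every waypoint."""
    return all(haversine_m(*fix, *wp) > threshold_m for wp in planned_route)

print(off_route((40.7300, -73.9870)))  # False: near a waypoint
print(off_route((40.6900, -74.0400)))  # True: well off the planned route
```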

No federal laws limit how people can be watched at work, says Ifeoma Ajunwa, a law professor at Cornell. Only in a few states — like California and Connecticut — are bosses even required to tell workers that they are being surveilled.

5. The psychology of being watched
Data: SurveyMonkey online poll of 3,454 employed adults, Aug. 20–25, with a margin of error of ±2.5 points. Chart: Andrew Witherspoon/Axios

The point of surveillance is to track and influence people's behavior — in the political sphere, at school, in the subway or at work, Alison writes.

Just how new technologies affect behavior and thinking at work depends on the individual, what they are doing and the goals of those doing the surveilling.

Now there is data tracing our days — Slack messages, comings and goings via fob swipes, and social media posts.

  • When managers poke and prod at that incidental information, or monitor employees without communicating a clear purpose, it can signal a lack of respect and trust, says psychologist Tara Behrend of George Washington University.
  • Some people will then look for ways to reassert their autonomy, for example, by withholding effort or not helping to train a new employee.

What they're saying:

  • 62% of people surveyed by Axios and SurveyMonkey said it's appropriate for an employer to routinely monitor employees using technology.
  • 48% said they would change their behavior if they knew their employer was monitoring them.

The big picture: Experts predict AI-fueled mass surveillance will alter people's behavior. But exactly how it will change our thinking isn't well understood.

  • Society doesn't mirror the workplace: The agreements between parties are different.
  • Still, Behrend says surveillance in both cases involves one party having power over the other — a dynamic that defines monitoring and its effects on behavior.
6. 1 fun thing: Anti-surveillance fashion

Illustration: Aïda Amer/Axios

Some designers, researchers and activists are trying to fool facial recognition technologies with fashion, Jessie writes.

What’s happening: The protests in Hong Kong have drummed up new interest in anti-surveillance fashion, according to designers Adam Harvey and Scott Urban.

  • Camouflage makeup: Harvey's Computer Vision Dazzle is an anti-surveillance makeup project that tricks facial recognition algorithms by using unusual makeup tones, concealing the nose and creating asymmetry. But the flashy makeup can make you more visible to other humans.
  • Privacy eyewear: Urban designs IRpair sunglasses, which prevent even your iPhone from recognizing you — and dupe infrared facial recognition technologies.
  • Patches on beanie hats designed by researchers from Moscow trick the state-of-the-art facial recognition system ArcFace.
  • Minimalist brass masks by Polish designer Ewa Nowak, called Incognito, deflect facial recognition software like the DeepFace algorithm used by Facebook.

The bottom line: At stake is the question of what is considered identity in this new era of surveillance. “The challenge now,” says Harvey, “is looking at how people are looking at you.”

🗳 Reader poll: Interested in more?

Thank you for reading! Please click a link below to let us know whether you'd be interested in more reporting about emerging technologies.

📱 Please invite your friends and co-workers to sign up here for Axios AM and PM.