Illustration: Lazaro Gamio/Axios
Ten states have introduced bills in 2020 that would regulate, ban or study facial recognition systems, according to the Georgetown Law Center on Privacy and Technology.
The big picture: There is no federal regulation of this technology, despite consensus among its creators that guardrails are needed and bipartisan support in Congress for restraining it.
What's new: States weren't as interested in facial recognition tech last year, according to Hayley Tsukayama, who monitors state surveillance legislation for the Electronic Frontier Foundation. But city-wide bans in Oakland, San Francisco and Cambridge proved that legislation was possible.
- Indiana, New Jersey, South Carolina and Washington state are proposing restrictions on law enforcement's use of facial recognition — when employed on its own or with body cameras. (The CEO of Axon, the largest supplier of body cameras to police in the U.S., expects the company to use facial recognition in 3-5 years.)
- New Hampshire is calling for a total ban, while Michigan wants to ban "real-time" use — recognizing and capturing faces in public, in real time.
- Vermont is concerned with notifying shoppers when facial recognition is used in stores, and Maryland proposes telling defendants if they were identified with the tech.
What's next for privacy advocates: Jameson Spivack, policy associate at Georgetown's privacy center, expects the bipartisan issue to "move beyond wealthy liberal bastions."
- Maryland lawmakers are working on another bill to limit external agency access to the state's facial recognition system, which includes mugshot and driver's license photos, one Georgetown researcher said.
- A New York bill banning biometric identification in school systems is expected to be reintroduced this year, the ACLU's Chad Marlow said. "I think if we had 72 more hours we would have passed it," he added.
Background: Big Tech companies selling facial recognition systems — like IBM, Microsoft and Amazon — have asked federal policymakers to weigh in on how government agencies and law enforcement use the tech, and a few bipartisan measures have answered that call.
- A recent federal study found that facial recognition systems offered by those companies misidentified people of color — predominantly Asians and African Americans — at far higher rates. Amazon did not submit its algorithm to the study, per the Post.
- IBM, Face++ and Microsoft improved their accuracy in identifying darker-skinned women within 7 months of a 2018 MIT Media Lab study that measured those errors. MIT found that Amazon's system was the worst at identifying darker-skinned women, a finding the company has disputed.