
Amazon has come under fire for its facial recognition software. Photo: David Ryder/Getty Images
A group of Democratic lawmakers wants the Government Accountability Office to examine how companies and law enforcement agencies use facial recognition technology.
Why it matters: Both Amazon and Microsoft have invited the government to lay out its views on the proper use of facial recognition technology. On the one hand, that introduces the possibility of regulation. On the other, the companies know that Congress is far from passing any actual legislation on the issue.
The details: In a letter sent to the GAO on Monday, top House Judiciary Committee Democrat Jerrold Nadler and Sens. Ron Wyden, Chris Coons, Ed Markey and Cory Booker asked the agency to look into a range of questions about facial recognition technology:
- They asked the GAO to examine which law enforcement agencies use the technology, how they use it and what protections are in place to govern its use — as well as the market for data associated with the technology.
- The lawmakers suggested the agency buy facial recognition products to evaluate them with an eye toward “whether commercial entities selling facial recognition adequately audit use of their technology to ensure that use is not unlawful, inconsistent with terms of service, or otherwise raise privacy, civil rights, and civil liberties concerns.”
- “Given the recent advances in commercial facial recognition technology — and its expanded use by state, local, and federal law enforcement, particularly the FBI and Immigration and Customs Enforcement — we ask that you investigate and evaluate the facial recognition industry and its government use,” the lawmakers wrote.
The context: When trained on a database that doesn't include a diverse set of faces, facial recognition software can exhibit bias. Studies have shown leading facial recognition platforms performing poorly on the faces of women and people of color.
- Last week, the ACLU clashed with Amazon over the company's Rekognition platform, which at least one police agency is using to match suspects’ faces with existing mugshots. The ACLU found that Rekognition misidentified 28 lawmakers as criminals; Amazon said the ACLU tested its software using the wrong settings.
- Microsoft and IBM published improvements to their facial recognition platforms earlier this year, in response to an MIT study that found that they were biased. Earlier this month, Microsoft’s president, Brad Smith, called for “thoughtful government regulation” of facial recognition technology.
- The lawmakers asked the GAO to examine whether the government has processes in place to check for these biases when it procures facial recognition software, or whether available software has “a disparate impact on certain racial or ethnic groups.”
Threat level: Low. Companies can score PR points by asking the government to step in without fearing that lawmakers will actually act quickly.