Dec 19, 2023 - Health

Rite Aid faces 5-year facial recognition ban

Customers inside a Rite Aid store in New York in October. Photo: Bing Guan/Bloomberg via Getty Images

Rite Aid will be banned from using AI-powered facial recognition technology for five years under a proposed settlement of Federal Trade Commission charges, the FTC announced Tuesday.

Why it matters: The FTC alleged in a complaint Tuesday that the pharmacy retail chain failed to implement reasonable procedures and prevent harm to consumers in hundreds of stores, citing what the agency called Rite Aid's "reckless" use of facial recognition technology that it said "disproportionately impacted people of color."

  • Face-recognition tech has proven to be a popular option for retail and other industries, and these findings could galvanize advocacy groups that campaign against such surveillance.

Of note: While Rite Aid welcomed the proposed settlement, it said in a statement Tuesday that "we fundamentally disagree with the facial recognition allegations in the agency's complaint," adding that the company used the technology in "a limited number of stores."

Driving the news: The FTC accuses Rite Aid in the complaint, filed in federal court in Pennsylvania, of failing to take reasonable measures to prevent harm to customers when it used AI-based facial recognition technology from 2012 to 2020 to identify people it suspected of shoplifting or other problematic behavior.

  • It alleges Rite Aid's actions subjected consumers to embarrassment and harassment.
  • The commission said Rite Aid's actions violated a 2010 data security order by failing to adequately oversee its service providers.

Zoom in: The complaint alleges that Rite Aid used the technology "to capture images of all consumers" in its drugstores and created a database of those identified as carrying out suspicious behavior. The database included "accompanying information," such as names, birth years and details "related to criminal or 'dishonest' behavior."

  • Rite Aid workers would receive "match alerts" to their phones. "In numerous instances, the match alerts that led to these actions were false positives," the complaint states.
  • The company didn't inform consumers that it used the technology and "Rite Aid specifically instructed employees not to reveal" its use to customers, the FTC alleges.

What's next: The FTC's proposed order would require Rite Aid to implement comprehensive safeguards to prevent any future harm to customers.

  • It would require Rite Aid to stop using such technology and delete, and direct third parties to remove, any images or photos that have been collected.
  • Given that Rite Aid is going through bankruptcy proceedings, the FTC said the order would go into effect after approval from the courts.

What they're saying: "Rite Aid's reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers' sensitive information at risk," said Samuel Levine, director of the FTC's Bureau of Consumer Protection, in a statement accompanying the announcement.

The other side: "We respect the FTC's inquiry and are aligned with the agency's mission to protect consumer privacy," Rite Aid said in its statement.

  • "Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC's investigation regarding the Company's use of the technology began," the company said in the statement.

Between the lines: Joy Buolamwini, an AI researcher who has studied facial recognition's racial biases, told the Washington Post the Rite Aid case was an "urgent reminder" that the U.S. has failed to enact sweeping privacy laws.

  • "The face is the final frontier of privacy and it is crucial now more than ever that we fight for our biometric rights, from airports to drugstores to schools and hospitals," Buolamwini said.

Go deeper... Report: Feds need rules for using facial recognition tech

Editor's note: This article has been updated with comment from AI researcher Joy Buolamwini.
