
Monika Bickert, head of global policy management at Facebook, testifies during a Senate Commerce Committee hearing in September 2019. Photo: Mark Wilson/Getty Images
Lawmakers questioned Facebook's new deepfake policy at a hearing Wednesday, with Democrats arguing the social media company's plan for addressing manipulated video does not go far enough.
Why it matters: Many policymakers already say tech giants have proven they're not up to the task of regulating themselves. Dissatisfaction with Facebook's plans for handling deepfakes will only further fuel calls for Washington to step in.
Context: Facebook unveiled the deepfake policy this week after criticism of how the company handles altered videos. Top of mind for Democrats was Facebook's decision last year not to remove a doctored video of House Speaker Nancy Pelosi that made her appear drunk.
Driving the news: Rep. Jan Schakowsky, who chairs the House Consumer Protection and Commerce subcommittee, said the new policy appears "wholly inadequate" because it would not have applied to the Pelosi video.
- Monika Bickert, Facebook's vice president of global policy management, confirmed the video would not have fallen under the new policy, "but it would still be subject to our other policies that address misinformation."
Details: Facebook's new standard is to remove videos that are altered using artificial intelligence and would mislead the average person into thinking the subject of the video said something they did not.
- Schakowsky noted that this wouldn't cover videos where only the image is altered. "I really don't understand why Facebook should treat fake audio differently from fake images," she said. "Both can be highly misleading and result in significant harm to individuals."
- Democratic Florida Rep. Darren Soto also pressed Bickert on why Facebook wouldn't take down the Pelosi video. "Our approach is to give people more information so that if something is going to be in the public discourse, they will know how to assess it," Bickert said.
- She did acknowledge that the company could have gotten the video to fact-checkers faster and labeled it more clearly as false.
What to watch: Schakowsky seemed at least somewhat receptive to a call from Tristan Harris, a former Google employee who co-founded the Center for Humane Technology, to give federal agencies such as the Departments of Education and Health and Human Services a "digital update" to expand their jurisdictions to tech platforms.
- Harris suggested agencies like HHS could "audit Facebook on a quarterly basis and, say, tell us how many users are addicted between [certain] ages" and press the company on what it's doing "next quarter to make adjustments to reduce that number."
- Schakowsky said she hopes to get bipartisan conversations moving on building a new regulatory framework for tech — one, she said, that could include the kinds of audits that Harris floated.
- "It would not necessarily create new regulatory laws ... but we may need to," she said. "To me, that's the big takeaway today."