
Facebook's refusal to remove a doctored video of House Speaker Nancy Pelosi is reigniting a debate over how tech platforms should respond to misinformation and hoaxes.
Why it matters: Facebook's decision to eventually demote, rather than remove, the doctored video has critics questioning whether its standards and processes are adequate to handle the constant attempts to spread misinformation on its platform ahead of the 2020 elections.
Driving the news: Perhaps the clearest description of Facebook's positioning came from a rare primetime cable interview last Friday between CNN's Anderson Cooper and Facebook VP of Product Policy and Counterterrorism Monika Bickert.
- Bickert argued that when it comes to content about political discourse, even if it has been deemed "false" by its third-party fact-checking partners, "we think the right approach is to let people make an informed choice."
- This would not be the case if the video were directly linked to violence or terrorism, or if it impacted public safety, she said.
The delay in acting on the video, which CNN reported lasted 32 hours, was attributable to the fact that Facebook outsources fact-checking to a network of third-party partners, who needed time to inspect the video themselves.
- Once Facebook confirmed that fact-checkers deemed the video "false," it began "dramatically reducing" the video's distribution and added a fact-checking prompt that appears before the video is shared, showing a list of fact-checking articles about it.
- Facebook says it needs to tread carefully in labeling something as "false" because research shows such labels can inadvertently lead to more distribution, not less.
- YouTube quickly removed the video from its platform, saying the content violated its policies. Twitter has not commented on the record.
The big picture: Facebook is doing more to explain its rationale and policies, but critics aren't all happy with the platform's decision-making.
- Some argued that Facebook is shirking its responsibility as one of the world's largest news distributors, even if it's legally protected from having to make such decisions.
Our thought bubble: One bright side of this debacle is that Facebook is being more transparent about how it makes these decisions than it has been in the past. This is forcing lawmakers to better understand, and then question, how Facebook makes such editorial decisions, rather than scolding it for hiding behind those decisions.
The bottom line: Facebook outsources these decisions so that it doesn't have to make them itself. "We aren't in the news business. We are in the social media business," Bickert said in her interview with Anderson Cooper.