Photo illustration: Omar Marques/SOPA Images/LightRocket via Getty Images
Facebook is temporarily demoting posts containing election-related misinformation on its platforms and limiting the distribution of livestreams that may relate to the election, the company confirmed Thursday.
Why it matters: Facebook is turning on emergency measures like those used in countries where democracy is under threat as it looks to contain the spread of false claims and conspiracy theories about ballot counting.
What they're saying: "As vote counting continues, we are seeing more reports of inaccurate claims about the election," a Facebook spokesperson said.
- "While many of these claims have low engagement on our platform, we are taking additional temporary steps, which we’ve previously discussed, to keep this content from reaching more people."
The New York Times was first to report Facebook was preparing to activate the measures, which the Times reported also include added "friction" to make people take additional steps before sharing content.
- Twitter announced a similar step last month, nudging users to quote a tweet they want to share to add context before simply retweeting it.
Context: Facebook has until now been taking a softer public approach to election misinformation than Twitter, simply adding a label to misleading posts that steers users to an election information hub. Twitter has been hiding especially problematic election misinformation and limiting its ability to be shared.
Meanwhile: BuzzFeed reported Thursday that Facebook has seen a sharp rise since Oct. 31 in sentiments linked to the incitement of violence, per an internal tool that tracks hashtags and search terms.
- "We're staying vigilant in detecting content that could incite violence during this time of heightened uncertainty," the Facebook spokesperson said. "We've readied products and policies in advance of this period so we can take action quickly and according to our plans."
The big picture: Conservatives are fuming online as platforms clamp down on efforts to spread manufactured evidence, including in private Facebook groups, that Democrats are stealing the election. Extremism experts worry the baseless claims could spill over into real-world violence.