Several new studies find that fake news has actually decreased in Facebook users' feeds since the 2016 presidential election.
Why it matters: Axios' Sara Fischer explains that Facebook is making sure everyone knows academics are finding that the company's fake news fight is working. But these studies address only the main Facebook app, not Instagram or the company's messaging platforms, where the problem is also prevalent and deepening.
- A Stanford University study covering 2015–17 finds that Facebook interactions with 570 false news sites declined by more than half after the 2016 election, suggesting that "efforts by Facebook following the 2016 election to limit the diffusion of misinformation may have had a meaningful impact."
- Another study, from the University of Michigan, finds that Facebook carries half as much "iffy" or questionable content as Twitter.
- In France, Facebook engagement with "unreliable or dubious sites" has halved since 2015, per Les Décodeurs, the fact-checking unit of the French newspaper Le Monde.
- "We’re learning from academics, scaling our partnerships with third-party fact-checkers and talking to other bodies like civil society organizations and journalists about how we can work together," said Facebook in a press release Friday.
- Facebook did not fund any of the studies and did not provide data for any of the research.