Jul 28, 2023 - Technology
New research suggests Facebook algorithm doesn't drive polarization
- Sara Fischer, author of Axios Media Trends

A series of four groundbreaking studies published Thursday in Science and Nature reveals new details about the role the world's largest social media platform played in driving polarization during the 2020 election and beyond.
Why it matters: The research represents one of the largest data sets ever captured by researchers about the use of social media during an election. The findings may help regulators and tech firms better navigate future elections.
- One of the four studies drew on a sample of 208 million U.S. Facebook users during the 2020 presidential election.
Key takeaways:
- Facebook's algorithmic feed "did not significantly alter levels" of polarization: The researchers surveyed users over the three months leading up to the election and found that while users switched to a chronological feed saw fewer posts, the change did not make their experience less polarizing.
- Reshare buttons drive more political news consumption: Suppressing reshared content substantially decreased the amount of political news users were exposed to on Facebook, but it didn't impact people's political opinions.
- Conservatives live in a much deeper echo chamber on Facebook: As a result, they encounter more misinformation. "Our analyses highlight that Facebook … is substantially segregated ideologically—far more than previous research on internet news consumption based on browsing behavior has found," the researchers wrote.
- Polarization surfaces mostly in Pages and Groups: The researchers say Pages and Groups "benefit from the easy reuse of content from established producers of political news and provide a curation mechanism by which ideologically consistent content from a wide variety of sources can be redistributed."
The big picture: The findings suggest America's growing polarization can't fully be blamed on social media.
- But platform design can still shape how much misinformation users encounter and how often they are steered toward like-minded people and groups that can pull them into ideological bubbles.
What to watch: More studies drawing on the Facebook data collected during this period are set to be released.