Photo: David Paul Morris / Bloomberg via Getty Images
Facebook CEO Mark Zuckerberg announced Friday that the tech giant will begin to prioritize news that is "trustworthy, informative, and local" from "trusted sources" in its News Feed beginning next week. The company will do this by feeding data from user surveys — about which news sources people find trustworthy — into its News Feed algorithm.
Why it matters: It's a major step Facebook is taking to weed out bad content and misinformation on its platform and to develop more meaningful interactions between members of its community.
The move is part of a larger mission-driven overhaul of the News Feed to ensure that "meaningful interactions" replace "passive scrolling" on Facebook, which the company says will strengthen community relationships on the platform and reduce misinformation, clickbait and spam.
- The news comes a week after Facebook announced that it would begin lowering the rankings of publishers' content and public content within its News Feed.
- Facebook says the changes announced last week will reduce news content on its platform by one percentage point, from roughly 5% of News Feed content today to 4% going forward.
The bigger question for Facebook is how effective its new system will be at determining which sources are "trusted" and which content is "trustworthy" and "informative."
- The company has intentionally avoided hiring journalists to make these kinds of decisions, and instead has relied on technology (algorithms, artificial intelligence, etc.) to weed out potentially bad news sources.
- Zuckerberg says making those decisions itself is "not something we're comfortable with," so Facebook will instead rely on user feedback to determine rankings.
- Facebook will begin asking people whether they're familiar with a news source and, if so, whether they trust that source.
Facebook says the update won't further change the amount of news users see on Facebook beyond last week's announced reduction in publisher content. "It will only shift the balance of news you see towards sources that are determined to be trusted by the community."
Our thought bubble: Facebook did hire people to make judgments about content when it brought on editors for its trending topics section years ago. But it faced backlash for that decision in 2016 amid allegations that some of those editors were burying news stories and outlets favored by conservatives. The company has struggled with content moderation ever since.