Facebook plans to give users more controls over what they see
Facebook intends to provide users with new controls to directly personalize what items show up in their news feeds, the company’s VP of global policy Nick Clegg announced in an op-ed Wednesday.
What's happening: From the documentary "The Social Dilemma" to analysts of "surveillance capitalism" — both of which Clegg aims to rebut — Facebook's critics have zeroed in on the algorithms that shape users' experiences by selecting news feed content. With the company's new moves, it's trying to say to users, "You decide!"
Details: Clegg's article said the company has made several product changes to help users more easily identify and engage with content they choose, versus what an algorithm assumes they like.
- For example, Clegg says Facebook is building on its Favorites feature, which allows users to star top friends and Pages in order to boost the visibility of those favorites' posts. In a new feature, those select posts will also populate a new “Favorites” feed, as an alternative to the standard News Feed.
- Clegg says Facebook is also introducing a new "Feed Filter Bar" to make it easier to toggle between a chronological News Feed and the standard, algorithmically populated News Feed (as Twitter users have long been able to do).
- "You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes — to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform," Clegg writes.
Facebook is also "in the relatively early stages of exploring whether and how to rank some important categories of content differently – like news, politics, or health," Clegg says.
Why it matters: It’s the latest effort by the tech giant to pull controversial topics like politics out of its News Feed to try to make the platform less polarizing.
- In January, Facebook CEO Mark Zuckerberg said the company will dial back on pushing political groups and content to users.
- Specifically, Facebook said it would stop recommending civic and political groups to users in the U.S., and that policy has recently been expanded globally, Clegg says.
Between the lines: The op-ed also addressed popular perceptions about why Facebook uses algorithms to push content to users in the first place.
- Clegg dismissed the idea that the company uses algorithms to intentionally push divisive content to users, arguing instead that Facebook has tried to downplay sensational content by offering users additional context around sensitive events, like the pandemic or the election, and by filtering out eye-catching spam.
- “The reality is, it’s not in Facebook’s interest — financially or reputationally — to continually turn up the temperature and push users towards ever more extreme content,” he writes.
The big picture: Facebook and other tech companies have long been criticized for a lack of transparency around how their algorithms work, and how those algorithms hook users and fuel polarization.
- Zuckerberg appeared before the U.S. Congress for the seventh time last week to defend the company’s policies around policing misinformation.