Mark Zuckerberg told "Axios on HBO" that it's "just wrong" to consider Facebook a right-wing echo chamber, even though conservative voices top the platform's most-engaged content.
- "It's true that partisan content often has kind of a higher percent of people ... engaging with it, commenting on it, liking it," Zuckerberg told Axios.
- "But I think it's important to differentiate that from, broadly, what people are seeing and reading and learning about on our service."
With social media feeding online rage across the political spectrum, Zuckerberg said there's a "meme" out there "that says ... our algorithm is just trying to find things that are going to kind of enrage people somehow, and that that's what we try to show people. And that's not actually how our systems work."
- When I reminded Zuckerberg how much hate is on Facebook, he replied: "If you look in the country right now ... a lot of people ... are very exercised and I think, frankly, for a lot of good reasons. And we have real issues."
- "I think sometimes there is a fine line between an important level of high energy around an important issue and something that can kind of tilt over into causing harm."
Reality check: If you doubt that the right has learned how to exploit Facebook, consider this piece by the N.Y. Times' Kevin Roose on the power of "the right-wing Facebook bubble":
- "Most days, the leader board looks roughly the same: conservative post after conservative post, with the occasional liberal interloper."
- That's based on engagement (likes, shares, comments), while Facebook prefers to cite the content people merely see, which is heavy on legacy news organizations.
Asked how worried he is that history will record Facebook as an accelerant of social destruction, Zuckerberg said: "I have a little more confidence in democracy than that. And I hope my confidence isn't misplaced."
- "But what we do, and I think a lot of what the internet does overall, is gives individuals more power."
Won't take down anti-vaxxer posts: Zuckerberg said he's not ready to move against anti-vaxxers the way he did against COVID misinformation: "If someone is pointing out a case where a vaccine caused harm or that they're worried about it — you know, that's a difficult thing to say from my perspective that you shouldn't be allowed to express at all."
- Our thought bubble, by Axios' Sam Baker: Misinformation about vaccines has spread rampantly on big tech platforms for years, and has been linked to outbreaks of once-vanquished diseases like measles.
- And though Zuckerberg said Facebook will work with health authorities to try to provide reputable information about the COVID vaccine, this is likely to be a perfect storm of confusion, politically motivated reasoning and straight-up misinformation.
Calls for investigation of Apple App Store: "I do think that there are questions that people should be looking into about that control of the App Store and whether that is enabling as robust of a competitive dynamic."
- Should the government investigate? "I think I'm not necessarily the person to answer that," Zuckerberg said. "I think some of the behavior certainly raises questions. And I do think it's something that deserves scrutiny."
Why block ads seven days before the election instead of 30? Zuckerberg said Facebook will block ads in the final week to prevent misinformation, but a 30-day blackout would be different because "people want to be able to run get-out-the-vote campaigns," as well as respond to attacks and make closing arguments.
Taking down threats against election officials: One red line Zuckerberg is willing to draw, he said, is to "very aggressively take down any threats against those people who are going to be involved in doing the counting and making sure that the election goes the way it's supposed to."
- Those kinds of threats, he said, "would obviously undermine the legitimacy of the election."