Mark Zuckerberg says Facebook shouldn't be in the business of removing pages that spread fake news and conspiracy theories on its platform. But there's still plenty of debate inside and outside the company over whether that's the right approach.
The bottom line: There are lots of ideas out there, but no consensus:
- New York magazine's Max Read suggested drafting a constitution.
- The Verge's Casey Newton suggested Facebook might want to take a cue from YouTube, which flags misinformation and links to Wikipedia pages that address the topic.
- The Daily Beast says Facebook needs to stop being hostage to critiques from the right and be willing to call out obvious lies.
Here's what a few experts told Axios they would do if they ran Facebook:
- Sherry Turkle, MIT professor
- John Battelle, CEO of NewCo
- Tarleton Gillespie, author of "Custodians of the Internet," about content moderation on social platforms
- Jeff Jarvis, CUNY journalism professor and blogger
Where would you draw the line?
- Battelle: "What is important is not where the line is, but that there is a line. Free speech means different things in different regions (and of course, most of us disagree on the finer points). You must draw a line, and make it very clear. However, this genie got out of the bottle a while back. It's going to be very hard to get it back in. Instead, I suggest this: Malicious falsehoods, hate speech, porn. Of these, only 'malicious falsehoods' has been where Facebook refuses to draw a line."
- Jarvis: "It's not an easy question with a quick answer, no matter how much people expect Facebook to come up with that quick answer. I'd start here: What ongoing, enforceable standards would be needed to ban not only incitement to violence in Myanmar and Sri Lanka but also to ban the obviously repellant Infowars. I would look at manipulation, threats, bigotry, and conspiracy theories as elements in this judgment."
Should moderation be done more by humans, more by computers or an even mix?
- Turkle: "He is going to rely on machines because that is a practical element of the reset, but he will need a lot more people, a lot more people who are steeped in the culture where they are working. There are great challenges and great opportunities for Facebook here if instead of shying away from this they step up to it."
- Battelle: "It has to be a combination. This is not as hard as they are making it out to be, save for the first year or so. But once trained a system of experts plus (algorithms) would work." And, he said, it should involve input from outside Facebook. "Too much is at stake."
- Jarvis: "Humanity does not scale. Humanity is messy. Humanity is expensive. But I'm afraid it's humanity that's needed to work on this problem. My advice to Facebook is that it needs people empowered not only with rules and community standards but also with trust in their judgment to recognize and act on threats, harassment, bigotry, manipulation, and generally behavior unbecoming to a human being."
Yes, but: Though less vocal, other schools of thought hold that policing speech is either impractical, not Facebook's job, or too important to entrust to Facebook.