Screenshot from ChangeTheTerms.org
Although hate continues to flourish on social media, experts say the situation is not hopeless. Among the recommendations: allowing broader reporting of hate speech, offering a consistent reporting system across different social networks, and treating content moderation the way companies treat finding bugs in code.
What they're saying: "It’s common for a bounty to be paid for reporting code issues to a company — companies should do the same with content moderation," Newhouse School of Public Communications professor Jennifer Grygiel tells Axios. "The public, researchers, experts etc. should be paid for reporting content that violates Twitter’s community guidelines."
Driving the news: After finding a number of years-old anti-Semitic posts on Twitter, Grygiel spent the weekend reporting them to the company.
"I logged on to Twitter to monitor hate speech after the word 'jews' trended on Twitter — I was concerned that Twitter was not prepared to address this issue. I was expecting new threats to come in — I was not expecting to find violent threats that were years old. Twitter touts AI and machine learning, yet they have not found the most basic of violent threats and hate speech that have been on the platform for years."
A report last week offered some additional recommendations for how internet companies should change their terms of service to deal with hate speech.
- The recommendations, made by a coalition of civil rights groups, offered model terms of service that also include mechanisms for transparency, training, and enforcement as well as a right to appeal any punitive measures taken.
- Specifically, the coalition recommends sites prohibit “hateful activity,” which it defines as “activities that incite or engage in violence, intimidation, harassment, threats, or defamation targeting an individual or group based on their actual or perceived race, color, religion, national origin, ethnicity, immigration status, gender, gender identity, sexual orientation, or disability.”
Our thought bubble: Tightening the standards is one piece of the puzzle. But just as important, social media companies need to develop the capacity to actually enforce such policies.
- Though Twitter is often criticized for failing to enforce its terms of service, it isn't alone. A New York Times piece yesterday noted how Instagram, too, is allowing hate-filled hashtags to spread.
The bottom line: Tech companies show an incredible ability to adapt their algorithms to boost engagement and profits. They need to devote similar energy to creating algorithms that minimize hate and harassment — for their sake and for society's.
- And if any companies need more help, they might want to look to Twilio CEO Jeff Lawson. Lawson wrote a powerful essay on what it means to be a leader in the current moment.
- He wrote, "Even though we should cherish tolerance, we must reject and shun those ideas that violate our most basic principles."
- Rather than throw up his hands, Lawson makes the case that it's all the more important for business leaders to stand up for American values at a time when they are under siege. The whole post is a must-read.