Jun 8, 2020 - Technology

A call to end outsourced content moderation for social media platforms

Photo: Mateusz Slodkowski/SOPA Images/LightRocket via Getty Images

A new report from NYU finds that the heavy reliance on contractors to handle content moderation at Facebook, Google, and YouTube has led to poor working conditions and insufficient attention to the real-world harms caused by inflammatory or deceptive content.

Why it matters: A great deal of attention is paid to these platforms' content policies, but much of the actual moderation work is left to people who don't work directly for the companies.

What they're saying: "The widespread practice of relying on third-party vendors for content review amounts to an outsourcing of responsibility for the safety of major social media platforms and their billions of users," said Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights and author of the report.

Details: In addition to bringing the moderation process in-house, the report recommends that social media companies:

  • Double the number of content moderators.
  • Hire content moderation "czars."
  • Expand content moderation in countries where online-fueled violence is likely.
  • Provide better medical and mental health care to moderators.
  • Fund research into the health effects of content moderation on workers.