Illustration: Sarah Grillo/Axios
Facebook agreed in a preliminary settlement on Friday to pay $52 million in damages to current and former content moderators for mental health issues developed on the job, The Verge reports.
Why it matters: The settlement is an acknowledgement from the social network that content review systems take a toll on workers' mental health. The preliminary settlement filed in San Mateo Superior Court also said Facebook would provide more counseling for workers.
The big picture: Content moderators spend hours reviewing violence, sexual acts and other disturbing material that violates the platform's community standards.
Between the lines: Facebook is already shifting much of its content moderation to AI, which may mean less trauma for its human moderators. AI now accounts for nearly 89% of Facebook’s content removal, the company disclosed Tuesday.
- The company has further offloaded moderation of particularly troubling content to AI and in-house employees after sending its contractors home with pay during the coronavirus pandemic.
- However, Facebook also said Tuesday it will start allowing some of its human reviewers back in the office on a gradual basis.
The settlement covers 11,250 moderators.
- Each moderator will receive a minimum of $1,000.
- Moderators diagnosed with post-traumatic stress disorder or related conditions could be eligible for additional compensation.
- Lawyers in the case believe that "as many as half of them may be eligible for extra pay related to mental health issues associated with their time working for Facebook, including depression and addiction," per The Verge.