Stories

Leaked documents show how Facebook handles sensitive content

Alessio Jacona / Flickr cc

The Guardian reports that it has seen 100 internal "training manuals, spreadsheets and flowcharts" showing how Facebook's content moderators handle content involving violence, hate speech, terrorism, pornography, self-harm and racism in accordance with Facebook's community standards. Some of the documents illustrate how sensitive and nuanced moderating content on Facebook's platform can be.

Why it matters: Facebook is in a lose-lose situation. If it doesn't filter its content, advertisers may not feel the platform is brand-safe enough to run ads, and users may feel it is no longer a safe, inclusive or pleasant environment. If it does filter content, it risks being blamed for judgment bias. For example, last Friday Facebook was criticized by pro-choice activists for removing the page of an organization that helps women obtain abortion pills. Facebook said the page violated its policy against the "promotion or encouragement of drug use."

Key findings from The Guardian:

  • Facebook reviews more than 6.5 million reports a week relating to potentially fake accounts – known as FNRP (fake, not real person)
  • According to Facebook's standards, videos of violent deaths, while marked as disturbing, don't always have to be deleted because they can help create awareness of issues such as mental illness. Facebook generally removes videos that "glorify" violence.
  • Videos of abortions are allowed, as long as there is no nudity.
  • Facebook will allow people to livestream attempts to self-harm because it "doesn't want to censor or punish people in distress."
  • Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or "actioned" unless there is a sadistic or celebratory element.
  • All "handmade" art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not.
  • Anyone with more than 100,000 followers on a social media platform is designated as a public figure – which denies them the full protections given to private individuals.