Leaked documents show how Facebook handles sensitive content
The Guardian reports that it has seen more than 100 internal "training manuals, spreadsheets and flowcharts" that show how Facebook's content moderators handle content involving violence, hate speech, terrorism, pornography, self-harm and racism in accordance with Facebook's community standards. Some of the documents illustrate how sensitive and nuanced moderating content on Facebook's platform can be.
Why it matters: Facebook is in a lose-lose situation. If it doesn't filter its content, advertisers may not feel that their environments are brand-safe enough to run ads, and users may feel that the platform is no longer a safe, inclusive or pleasant environment. If it does filter its content, it risks being accused of judgment bias. For example, last Friday Facebook was criticized by pro-choice activists for removing the page of an organization that helps women obtain abortion pills. Facebook said the page violated its policy against the "promotion or encouragement of drug use."