How Facebook makes tough calls on censorship

Facebook headquarters in Menlo Park, California. Photo: Josh Edelson/AFP via Getty Images

Facebook has published the internal guidelines it uses to make tough decisions on sensitive topics on its platform, including hate speech, child safety and terrorism.

Why it matters: It's Facebook's way of telling censorship critics that the tech giant is methodical and consistent about how it polices content on its platform.

  • Facebook will also build out the ability for people to appeal its decisions over the coming year, something Facebook CEO Mark Zuckerberg has alluded to in past interviews. As a first step, it's launching appeals for posts that were removed for nudity, sexual activity, hate speech or graphic violence.
  • The company let a few reporters sit in on its regular content meeting last week, which typically includes employees from its legal, safety policy, community operations, public policy (including regional), communications, community integrity, diversity, government and politics, and product teams.
  • In May, Facebook will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the U.K., India, Singapore, the U.S. and other countries to get people's feedback directly.
  • By the numbers: Facebook says it currently has 7,500 content reviewers — more than 40% more than it had last year — and has experts reviewing content reports in over 40 languages around the world.

Gut check: A majority of Americans (58%) are resistant to U.S. government action against false news that might also limit free speech, but they're more open to such action from technology companies themselves, a new Pew Research Center survey finds.

  • The issue isn't partisan. Democrats and Republicans are equally resistant to government action against false news that could limit freedoms.