Mar 26, 2024 - Technology

Meta's broad-brush ban on Islamic term harmed free speech, board rules

Illustration: Sarah Grillo/Axios

Meta has "substantively and disproportionately" restricted free speech through a blanket ban on content that uses the term "shaheed," which can be a reference to Islamic martyrs, Meta's independent Oversight Board said in a ruling released Tuesday.

Why it matters: The term "shaheed" accounts for more content removals on Meta's platforms than any other single word, per the board.

Meta failed to appreciate linguistic complexity, the board said, by treating the term shaheed "always and only as the equivalent of the English word martyr."

  • Extremists often use "shaheed" to glorify people killed while committing acts of terrorism. But the word, which has no direct English translation, also has other uses that do not glorify or convey approval of martyrdom.

The decision says that Meta, which operates Facebook, Instagram and WhatsApp with a combined 3.5 billion users, engaged in "over broad" censorship.

The board recommended that Meta only remove instances of "shaheed" when the word is used alongside particular "signals of violence," such as a visual depiction of weapons or a statement of intent to commit violence.

  • "Reporting on, neutrally discussing and condemning" actions labeled as "shaheed" should be allowed, the board said.

The big picture: Meta has been accused of censorship in recent years, including politically charged complaints from Republicans in the U.S. investigating anti-conservative bias.

Friction point: "Meta has been operating under the assumption that censorship can and will improve safety," but that's not supported by evidence, board chair Helle Thorning-Schmidt said in a statement.

  • Additionally, the censorship removed a "substantial amount" of material that was not intended to praise terrorists or their actions, the board said.

Context: Meta asked the board in February 2023 to rule on whether it should continue its approach of removing all content referring to "shaheed," and the board was preparing to publish findings when Hamas attacked Israel on Oct. 7.

  • Hamas is designated as a "Tier 1" dangerous organization, the highest, under Meta's dangerous organizations and individuals policy, and the attacks were quickly designated as terrorism under the same policy.

Behind the scenes: The board postponed making a decision on the issue in case fallout from Oct. 7 resulted in policy changes by Meta, but decided its original views held up to the "extreme stress" of the Hamas attacks and Israel's military response.

  • Board members said that Meta admitted the policy may lead to wrongly removing "swathes of content," including content critical of terrorism.

What's next: Meta can accept or discard the Oversight Board's general policy recommendations, though it has committed to observing board rulings that apply to specific posts and users.

What they're saying: "We want people to be able to use our platforms to share their views, and have a set of policies to help them do so safely. We aim to apply these policies fairly but doing so at scale brings global challenges," a Meta spokesperson told Axios.

  • The company promised a full response to the recommendations within 60 days.

Editor's note: This story has been updated with additional comment from Meta.