Facebook tightens rules around self-harm images
Coinciding with World Suicide Prevention Day, Facebook is announcing a series of policy changes designed to keep users from encouraging self-harm, while also trying to preserve the ability for people to discuss their struggles without shame.
Why it matters: Globally, someone dies by suicide every 40 seconds, and experts say up to 25 times as many people attempt it.
Deciding just what to allow and not allow on social networks is a tricky balance and something Facebook has been wrestling with for more than a decade. The new changes are the result of the latest consultations with experts over the past year.
Specifically, Facebook is:
- tightening its rules to limit graphic depictions of cutting on Facebook and Instagram
- tightening its policies around acceptable images related to eating disorders
- hiring a full-time health and well-being expert on its safety team to focus on these and other issues
- promoting the #chatsafe guidelines designed to help encourage healthy dialogue with those dealing with suicidal feelings
- looking for ways to share some public user data with the academic community, starting with two researchers who study suicide prevention
What they're saying: "While suicide prevention work and dealing with self-harm can be some of the most challenging policy work we do, it's also some of the most important work we do," Facebook global head of safety Antigone Davis told Axios.
In creating the new policies, Davis said Facebook is trying to limit people from unwittingly being exposed to harmful content while at the same time preserving opportunities for people to share their struggles and gain a sense of community.
"At Facebook we have a unique opportunity to help people connect and find the support they need," Davis said.
Our thought bubble: Facebook needs to perform a delicate and difficult balancing act. Sharing thoughts of suicidal intention, self-harm and disordered eating can have a contagious effect. At the same time, people dealing with these issues need outlets to talk about them or they suffer shame and isolation. Sharing one's struggles can provide relief — yet what's helpful to someone posting could end up being harmful to someone reading.
What's next: Facebook is moving to place greater emphasis on private groups and messaging. That will only make these issues thornier. And, of note, today Facebook doesn't proactively screen private groups for the types of content banned under the new policies. Rather, it relies on reports from users, meaning someone within the group would need to voice a complaint.