Facebook turns to AI to spot suicidal thoughts
Facebook said Monday it would use artificial intelligence to identify posts and live videos where users are expressing suicidal thoughts.
Why it matters: The company has been under pressure for live-streaming violent and graphic incidents, and has been accused of lacking the human resources to moderate live content on its platform effectively.
- Facebook's Guy Rosen said the company would use "pattern recognition technology to help identify posts and live streams as likely to include thoughts of suicide."
- That technology will launch outside of the U.S. and, according to Facebook, will eventually roll out worldwide, with the exception of the European Union.
- The company will also dedicate more human reviewers to vetting reports of possible self-harm on the platform.
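Facebook has not published how its classifier works, and its actual system is proprietary and far more sophisticated than keyword matching. Purely as a hedged illustration of the general idea of pattern-based flagging followed by human review, a toy sketch might look like this (the phrase list and function names are hypothetical, invented for this example):

```python
# Toy illustration only -- NOT Facebook's method. It sketches the general
# pattern: score a post against known risk phrases and, above a threshold,
# escalate it to a human reviewer rather than acting automatically.

RISK_PHRASES = [  # hypothetical examples for illustration
    "want to die",
    "end it all",
    "no reason to live",
]

def flag_for_review(post: str, threshold: int = 1) -> bool:
    """Return True if the post matches enough risk phrases to escalate."""
    text = post.lower()
    score = sum(phrase in text for phrase in RISK_PHRASES)
    return score >= threshold

posts = [
    "Had a great day at the beach!",
    "I feel like there's no reason to live anymore",
]
flags = [flag_for_review(p) for p in posts]  # [False, True]
```

In practice, a real system would use trained machine-learning models rather than a fixed phrase list, and flagged posts would go to the human reviewers mentioned above.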
Go deeper: Our report from earlier this year on how Facebook Live had been hosting videos of gruesome crimes.