Meta shares a tool for spotting terrorist content
Meta is sharing an open-source tool that other social networks can use to help spot and track problematic images and video content, including terrorist threats and child sexual abuse material.
Why it matters: It's a challenge for a company as big as Meta to try to monitor and remove dangerous and abusive content, but doing so is even harder for smaller companies with fewer resources.
Details: The tool, known as Hasher-Matcher-Actioner, is a kit for software developers that works by creating a unique digital fingerprint, known as a hash (typically a short string of letters and numbers), for each image or video on a platform.
- That allows companies to quickly find and take action on not just a single posting of problematic media, but also other copies of that image or video.
- It also means that companies don't have to store the image or video in question. That protects them from the legal and ethical risks of keeping versions of problematic content on their servers. It can also save a lot of space, since hashes are far smaller than the media they represent.
- While Meta's tool can work with existing hash databases, such as the one run by the Global Internet Forum to Counter Terrorism, it can also be used independently, allowing a platform to build and maintain its own database of violating material.
- Other companies can use the tool to label and find any type of content, and choose what actions to take based on their own service's policy.
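The hash-and-match workflow described above can be sketched in a few lines. This is an illustrative toy, not Meta's actual code: the function and database names are invented, and a plain cryptographic hash is used as a stand-in, whereas production systems typically use perceptual hashes (such as Meta's open-source PDQ) that also match re-encoded or slightly altered copies.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    # Stand-in hasher: a cryptographic hash only matches exact byte copies.
    # Perceptual hashes used in real moderation pipelines also catch
    # near-duplicates, but the store-and-compare flow is the same.
    return hashlib.sha256(media_bytes).hexdigest()

# A platform's own database of fingerprints of known violating media.
# Only hashes are stored -- never the images or videos themselves.
hash_db: set[str] = set()

def flag(media_bytes: bytes) -> None:
    """Add a known violating item's fingerprint to the database."""
    hash_db.add(fingerprint(media_bytes))

def is_match(media_bytes: bytes) -> bool:
    """Check a new upload against the database of known fingerprints."""
    return fingerprint(media_bytes) in hash_db

flag(b"known-bad-image-bytes")
print(is_match(b"known-bad-image-bytes"))   # True: same content, same hash
print(is_match(b"some-other-image-bytes"))  # False: no fingerprint on file
```

Because only the small fingerprints are kept and compared, a platform can recognize every repost of flagged media without retaining a copy of the media itself.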
What they're saying: "Terrorists don’t limit themselves to trying to abuse just one platform," Dina Hussein, head of counterterrorism policy at Meta, said in a statement to Axios. "We cannot fight this fight alone — and we know the more companies work together, the safer the internet will be."