Illustration: Rebecca Zisser / Axios
Facebook says nearly all ISIS- and Al Qaeda-related terror content is removed through machine learning before anyone flags it, and most copies of that content (83%) are removed within an hour of being uploaded.
Why it matters: In the past, Facebook has been criticized for not moving quickly enough to remove bad content once it has been uploaded or flagged to human reviewers. Critics, for example, cried foul when Facebook took more than two hours to remove videos posted by a Cleveland murderer after he first uploaded a video stating his intentions.
The main way Facebook removes such content is by building a database of video, images and text that can be machine-read, then automatically blocking any matching uploads from appearing on the platform.
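For illustration, here is a minimal sketch of how matching uploads against a database of known content could work. The function names and the use of an exact cryptographic hash are assumptions for the example; a real system like Facebook's would more likely rely on perceptual fingerprints that tolerate re-encoding and edits.

```python
import hashlib

# Hypothetical store of fingerprints for previously removed terror content.
known_hashes = set()

def fingerprint(data: bytes) -> str:
    """Hash the raw bytes; production systems likely use perceptual hashes for media."""
    return hashlib.sha256(data).hexdigest()

def record_removed_content(data: bytes) -> None:
    """Add removed content's fingerprint to the match database."""
    known_hashes.add(fingerprint(data))

def should_block(upload: bytes) -> bool:
    """Block an upload whose fingerprint matches known removed content."""
    return fingerprint(upload) in known_hashes

# Example: once a video's fingerprint is recorded, exact re-uploads are caught.
record_removed_content(b"<bytes of a removed propaganda video>")
print(should_block(b"<bytes of a removed propaganda video>"))  # True
print(should_block(b"<bytes of an unrelated video>"))          # False
```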
The catch: Terror content is not all the same. AI can find matches, but "a system designed to find content from one terrorist group may not work for another because of language and stylistic differences in their propaganda," Facebook says. So the company is focusing its efforts on the terrorist organizations that pose the biggest global threat: ISIS and Al Qaeda. The hope is to "expand the use of automated systems to detect content from regional terrorist organizations too," the company says.
Our thought bubble: A lot of terror content that spreads online these days is domestic. While people are certainly inspired by ISIS and Al Qaeda-related posts, there's still a long way to go before all terrorist content is removed as thoroughly and as quickly as material from those two groups.