Illustration: Rebecca Zisser

At the time of the September 11 attacks, the world's glimpse into terrorist thinking and communication came primarily through group-sponsored videos, often physically seized by authorities and carefully edited for mass audiences by television networks. Today, terrorist propaganda is easily accessible online and through social media, making it harder to censor, monitor and control.

Why it matters: As the internet becomes more accessible, responsibility for monitoring terrorist content is spreading from governments to technology companies, and this year has seen some of the most aggressive efforts yet by both to get the problem under control.

Government pressure:

  • China announced this summer that it's investigating its own tech companies, like Tencent and Baidu, for giving users an avenue to spread violence and terror.
  • Proposals put forward earlier this year in the UK, France and Germany would make tech companies legally liable for failing to keep terrorist-related content off their platforms.
  • Regulatory pressure on the tech community is rising as lawmakers begin to question the power of major tech monopolies.

Brand pressure:

  • An uptick in pressure from some of the world's biggest advertising agencies caused brands to abandon YouTube in droves this spring. YouTube's direct ad spend was reportedly down 26% in Q2 as a result.
  • YouTube quickly announced changes to its policies to ensure advertisers and users felt its platform was a safe environment for content consumption and messaging. (More below.)

How tech companies are weeding out threats: Tech companies built on open platforms, like Google, Facebook and Twitter, have ramped up action in the past several months to ensure that terrorist accounts are blocked and terrorist content is removed.

Their strategies are typically twofold: 1) improve rapid-response efforts to remove terrorist content or accounts when they are reported, and 2) invest in artificial intelligence that can block terrorist content before it is ever uploaded.
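For illustration only, here is a minimal Python sketch of that two-pronged setup, assuming a simple exact-hash blocklist and a takedown queue. None of these names, values or steps come from the companies themselves, whose pipelines are proprietary and rely heavily on human review.

```python
import hashlib
from collections import deque

# Hypothetical digests of media a platform has already identified as
# terrorist content. Real blocklists are proprietary and far larger.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(media: bytes) -> str:
    """Exact content fingerprint (SHA-256 of the raw bytes)."""
    return hashlib.sha256(media).hexdigest()

def allow_upload(media: bytes) -> bool:
    """Prong 2: reject already-known material before it is ever published."""
    return fingerprint(media) not in KNOWN_BAD_HASHES

class ReportQueue:
    """Prong 1: rapid response to user reports of terrorist content."""
    def __init__(self):
        self._pending = deque()  # queued post IDs awaiting review

    def report(self, post_id: str) -> None:
        self._pending.append(post_id)

    def process(self, take_down) -> None:
        # In a real system, reported posts route to trained reviewers;
        # here they are simply handed to a takedown callback.
        while self._pending:
            take_down(self._pending.popleft())
```

In this sketch a platform would call allow_upload() at publish time and drain the ReportQueue continuously; real systems layer classifiers, appeals and reviewer workflows on top.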

  • Twitter has perhaps been the most aggressive about blocking individual accounts. According to a Twitter spokesperson, the company has suspended 636,248 accounts since mid-2015, among them "more than 360,000 accounts for threatening or promoting terrorist acts, primarily related to ISIS." According to Twitter's most recent Transparency Report, 74% of the accounts suspended in the second half of 2016 were surfaced by internal, proprietary spam-fighting tools.
  • Facebook revealed in a memo published this summer that it's turning to artificial intelligence to help stop terrorism from spreading on its site. "The company deploys artificial intelligence to detect when people try to repost photos and videos associated with terrorism ... and when there are 'new fake accounts created by repeat offenders,'" Axios' David McCabe reported in June. Facebook does not disclose the number of accounts it suspends over terrorism. (A sketch of how that kind of media matching can work follows this list.)
  • YouTube is mounting an intervention when users search for terms linked to extremism. Per McCabe, the company will "display a playlist of videos debunking violent extremist recruiting narratives."
  • Facebook, Twitter, Microsoft and YouTube will be involved in a new coalition that will, according to YouTube, make the companies' "hosted consumer services hostile to terrorists and violent extremists." The companies will share information with outside groups, work on technology to address extremism and "commission research to inform our counter-speech efforts and guide future technical and policy decisions around the removal of terrorist content."
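One common way to catch re-uploads of known images, even after re-encoding or light edits, is perceptual hashing; the companies above do not specify their methods, so the sketch below is purely an illustrative assumption (the 64-bit average hash, the distance threshold and the shared-database framing are all made up for the example).

```python
from PIL import Image  # Pillow

HASH_SIZE = 8  # 8x8 grid -> 64-bit hash

def average_hash(path: str) -> int:
    """Shrink to 8x8 grayscale and threshold each pixel at the mean.
    Near-duplicates (re-encoded, lightly cropped or watermarked copies)
    tend to land within a few bits of the original's hash."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | int(px >= mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_shared_database(candidate: int, shared_hashes: set, threshold: int = 5) -> bool:
    """Flag an upload whose hash sits within `threshold` bits of any hash
    contributed to a shared industry database (illustrative only)."""
    return any(hamming(candidate, known) <= threshold for known in shared_hashes)
```

In this toy setup, an incoming image's hash is compared against hashes contributed to a shared database, and anything within a few bits of a known item is flagged for review rather than published outright.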

One important caveat: Axios' Mike Allen highlighted a finding from Professors Peter Neumann and Shiraz Maher of the International Centre for the Study of Radicalisation, published by the BBC earlier this year: "The internet plays an important role in terms of disseminating information and building the brand of organisations such as [the Islamic State], but it is rarely sufficient in replacing the potency and charm of a real-world recruiter."
