Illustration: Rebecca Zisser/Axios

The online spread of the Christchurch mosque killer's sickening first-person video divided experts, industry insiders and the broader public into two opposing camps: Some saw the debacle as proof that Facebook and YouTube can't police their platforms. Others saw it as evidence that they won't.

Why it matters: How we define the platforms' struggle to block the New Zealand shooter's video will shape how we respond to the problem. Either way, Facebook and YouTube don't come off well.

Driving the news:

  • Late Monday night, Facebook posted new details about the video, reporting that the shooter's original live stream was viewed fewer than 200 times while it was live. None of those viewers flagged it to moderators; the first user report came in 12 minutes after the live stream ended.
  • "Before we were alerted... a user on [troll site] 8chan posted a link to a copy of the video on a file-sharing site," Facebook says. Once Facebook took down the original, users began reposting copies.
  • Facebook previously reported that in the 24 hours after the shooting, it removed 1.5 million copies of the video — 1.2 million of which were blocked as they were uploaded, meaning roughly 300,000 copies were posted before being taken down.
  • YouTube didn't report numbers, but a Washington Post account said that despite the video platform's efforts to expand human moderation and automated systems, "humans determined to beat the company’s detection tools won the day."

One widely held view is that Facebook and YouTube are simply too big to monitor and control, even with the legions of human moderators they employ and the AI-driven recognition tools they are beginning to deploy.

  • "Social-media platforms were eager to embrace live streaming because it promised growth. Now scale has become a burden," Neima Jahromi wrote in The New Yorker.
  • In this picture, outnumbered moderators will always be a step behind masses of determined users, and the whack-a-mole game will never end.
  • As platforms get better at identifying and blocking particular classes and instances of undesirable content, the content's proponents will find new tactics for modifying, hiding and redistributing the material (see the sketch just below).
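
To make that cat-and-mouse dynamic concrete, here is a minimal Python sketch, with illustrative names and stand-in data, of why matching exact file fingerprints catches verbatim re-uploads but not re-encoded or lightly edited copies. The platforms' real systems reportedly layer fuzzier audio and visual fingerprints on top of exact matching, but the same evasion logic applies.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact fingerprint: any change to the bytes yields a different hash."""
    return hashlib.sha256(data).hexdigest()

# Illustrative stand-ins for video files (not real data).
original_upload = b"...bytes of the original clip..."
reencoded_copy = original_upload + b"\x00"  # a one-byte difference, e.g. from re-encoding

blocklist = {fingerprint(original_upload)}

def should_block(upload: bytes) -> bool:
    """Block an upload only if its exact fingerprint is already on the blocklist."""
    return fingerprint(upload) in blocklist

print(should_block(original_upload))  # True: the verbatim copy is caught
print(should_block(reencoded_copy))   # False: a trivially altered copy slips through
```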

Another view holds that Facebook and YouTube have both repeatedly shown their ability to police their vast online estates when given no alternative:

  • They've taken strong measures against child pornography, and indeed kept most kinds of porn at bay.
  • They've moved forcefully to keep ISIS recruiting videos from publicly circulating.
  • They've worked to eliminate access to Nazi propaganda in Germany, where it's outlawed.
  • They've cracked down effectively on distribution of copyrighted materials via their services.
  • If strong enough legal, financial and socio-political incentives have done the trick in these areas, the argument goes, surely Facebook and YouTube can also take effective action against violent right-wing extremists.

The two views paint very different pictures of what's going on.

  • In one, platform managers are playing a Sisyphean delete-and-block game against persistent and inventive opponents.
  • In the other, companies that prioritize engagement metrics are protecting their business interests by failing to limit offensive content — except when media coverage and ad boycotts make action unavoidable.

Be smart: Hard as the problem is, people are going to keep pushing the platforms to solve it. And there are plenty of other steps Facebook and YouTube could take.

  • For instance, during a crisis they could suspend real-time uploads, or temporarily block those coming from new or unverified accounts (a rough sketch of such a rule follows this list).
  • In a Twitter thread, Homebrew's Hunter Walk (a former YouTube exec) proposed methods for YouTube to protect freedom of speech while curtailing "freedom of reach."
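
As a rough illustration of the first idea, here is a small Python sketch of a crisis-mode gate on live uploads. The field names and thresholds are hypothetical, not any platform's actual policy or API; a real system would weigh many more signals.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical policy knobs, for illustration only.
CRISIS_MODE = True
MIN_ACCOUNT_AGE = timedelta(days=30)

@dataclass
class Account:
    created_at: datetime
    verified: bool

def may_go_live(account: Account, now: datetime) -> bool:
    """During a declared crisis, allow live uploads only from older, verified accounts."""
    if not CRISIS_MODE:
        return True
    old_enough = (now - account.created_at) >= MIN_ACCOUNT_AGE
    return account.verified and old_enough

now = datetime.now(timezone.utc)
print(may_go_live(Account(now - timedelta(days=2), verified=False), now))   # False: new, unverified
print(may_go_live(Account(now - timedelta(days=400), verified=True), now))  # True: established, verified
```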

Go deeper: The real tech regulators
