Jun 6, 2019 - Technology

Inside YouTube's hate speech minefield

Illustration (the YouTube "play button" logo with a lit fuse on top): Lazaro Gamio/Axios

YouTube found itself at the center of the discussion around hate speech Wednesday, but not in the way it had hoped.

Driving the news: The company had long ago picked the date to announce a range of new policies aimed at limiting the presence and spread of hate speech on its platform. Instead, the announcement came in the midst of another uproar over the platform's enforcement of its policies, this one centering on conservative host Steven Crowder and the many homophobic and racist insults he has directed over the years at Vox's Carlos Maza.

The timeline: On Tuesday, YouTube said that, after a days-long investigation, it had decided not to take action against Crowder, who has 3.8 million subscribers.

  • While YouTube said his comments were "hurtful," it suggested they were made as part of a broader argument and thus did not violate its rules.
  • On Wednesday morning, YouTube made its policy announcement. The changes addressed broader, less-targeted forms of hate than those at issue in the Crowder controversy, such as attacks on entire groups of people. These policy revisions had been months in the works and were not a reaction to Maza's complaint.
  • Less than three hours later — and amid significant outcry and rumblings of a boycott — YouTube announced it was suspending Crowder from the program that allows creators to run ads and share in revenue from their videos, saying "a pattern of egregious actions has harmed the broader community."

Between the lines: Criticism of YouTube was widespread, coming from within Google and YouTube as well as from outside, on both the left and the right.

  • Those on the left accused Google of doing too little, too slowly, while at the same time portraying itself as a friend of the LGBTQ community by changing its Twitter icon to a rainbow version of its logo for Pride Month.
  • On the right, Crowder and his supporters accused YouTube of caving to pressure since it had earlier said his content didn't violate its policies.

Our thought bubble: Although they are making opposing arguments, both sides are actually pointing at the same problem: YouTube's rules for taking down videos and "demonetizing" creators still appear to be vague and unevenly enforced.

  • This leads many observers to conclude that the decisions have more to do with how loud a fuss is raised and by whom.

What they're saying:

"Not everyone will agree with the calls we make — some will say we haven’t done enough; others will say we’ve gone too far."
"And, sometimes, a decision to leave an offensive video on the site will look like us defending people who have used their platforms and audiences to bully, demean, marginalize or ignore others."
— YouTube's Chris Dale

Go deeper: YouTube coverage of hate speech hearing marred by hate speech
