Sep 3, 2019 - Technology

YouTube says its new hate speech policies are working

Photo illustration: Avishek Das/SOPA Images/LightRocket via Getty Images

YouTube says that changes it made to broaden its hate speech policies in June have resulted in a significant increase in problematic videos being removed from its platform.

Why it matters: The video giant says it usually takes months to ramp up enforcement of a new policy, but its latest quarterly report shows the June updates have already boosted the amount of content it's pulling off the platform.

What's new: YouTube says that it removed more than 100,000 videos and more than 17,000 channels last quarter, five times as many videos and channels as it removed in Q1. It also says it removed over 500,000 comments, nearly double the number it removed in Q1.

Between the lines: YouTube says that it's gotten much better at removing rule-breaking content faster, thus dramatically reducing the number of views videos rack up before they are eventually removed from its platform.

  • For example, the company says that the nearly 30,000 videos that it removed for hate speech violations over the last month generated just 3% of the views that knitting videos did over the same time period.
  • In total, YouTube says that over the last 18 months, changes it made to its content policies and removal practices have reduced views on videos that are later removed for violating its policies by 80%.

The big picture: The changes are part of a newly released set of priorities by YouTube to take more responsibility for the content on its platform, including content that brushes up against its policies but doesn't explicitly violate them.

  • YouTube, like other tech companies, relies on a mix of humans and machines to flag and remove problematic content.
  • While human context is important, the company says that over 87% of the 9 million videos it removed in the second quarter of 2019 were first flagged by its automated systems, not people.
  • And those systems are good enough that more than 80% of the videos that were auto-flagged last quarter were removed before they received a single view.

Our thought bubble: Efficiency is important, but the numbers YouTube is sharing are hard to evaluate in a vacuum. We can't tell, for instance, whether the total amount of hate-oriented video content and viewing on YouTube is growing or shrinking.

Go deeper: Inside YouTube's hate speech minefield