YouTube removed 11 million videos last quarter using more automation
- Sara Fischer, author of Axios Media Trends

YouTube said on Tuesday that it removed 11.4 million videos last quarter, largely by leaning more heavily on automated content moderation. The company said 95% of the removed videos were first detected by its automated systems.
The big picture: With fewer human reviewers available because COVID-19 forced staff to work remotely, YouTube had to choose between leaning on automated systems and over-removing content, or relying on its reduced human workforce and letting more rule-violating videos stay online. It says it chose the former to protect its community, even though removal appeals doubled as a result.
By the numbers: The number of appealed video removals doubled from Q1 to Q2, the company said. But YouTube says it prepared for the surge by adding more resources to its appeals process, and as a result it reinstated twice as much appealed content quarter over quarter.
- Still, YouTube says appeals remain rare, occurring in less than 3% of all video removals.
- Most removed videos got few views, according to YouTube: 42% had zero views, 24% had 1–10 views and 24% had more than 10 views.
- Most of the removals were for violations of policies on child safety (3.8 million videos), spam (3.2 million videos) and nudity (1.7 million videos).
- The aggressive enforcement led to more than 3x the removals of content that YouTube's systems suspected of being tied to violent extremism or potentially harmful to children.
- The company said it also removed 2 billion comments last quarter.
Context: Facebook also said it was forced to rely more heavily on automation as a result of the pandemic.