Between the lines: Google-owned YouTube can't rely on simply rooting out “bad” content, since many of the videos pedophiles are exploiting can seem innocuous to human review teams, per WIRED. For example, many of the offending comments sections are on videos of children doing gymnastics or dancing. Some of the comments seem innocent as well — such as “swimsuit” or “nice” — and may not get flagged as inappropriate.
Context: The announcement comes after advertisers, including AT&T, Disney, and Nestle, pulled their ads from YouTube in protest of their ads being placed next to harmful content.
What’s next: YouTube said it will launch a comment classifier that identifies and removes comments of a predatory nature at twice the current rate. YouTube also said it will soon disable comments on videos of older minors that may be at risk of attracting predatory behavior.
The bottom line: YouTube drives recommendations based on watch time. But as watch time on these videos of young girls stacks up — some videos totaling hundreds of thousands of views — the platform's algorithm will need tweaking in order to protect children online.
Go deeper: The internet reckons with kids