YouTube already bars video creators from making money off the inappropriate use of family-friendly characters. Now the video platform says it will also implement a new policy to age-restrict inappropriate content so it is automatically blocked from the YouTube Kids app, The Verge reports.
Why it matters: The kids' YouTube video genre drew a lot of attention this week from a widely read Medium post and a New York Times story about the bizarre and sometimes disturbing videos that turn up in YouTube search results. Both described video factories that have learned to game the algorithms, slipping jarring content disguised as child-friendly videos past YouTube's filters.
YouTube says the problem is relatively small, and that most inappropriate content not caught by its algorithms is flagged by users or human reviewers. The Verge reports that YouTube is also excluding age-restricted content from advertising, meaning it's willing to forgo ad revenue from the strange kids' videos.