Photo: Rego Korosi via Flickr CC
YouTube says it removed more than 150,000 videos last week featuring children who were targeted by sexual predators in the comments section, Variety reports. The tech giant has also reportedly disabled comments on over 625,000 videos and terminated the accounts of several hundred users who had posted inappropriate comments on videos featuring minors.
Why it matters: Google, YouTube's parent company, has policies of openness and inclusivity that have allowed bad actors to take advantage of its platforms. The company is working to integrate machine learning and better human review processes to catch malicious content before it is even posted.