Exclusive: TikTok introducing more automation to video removals
TikTok is rolling out a new system that will allow the company to block videos that violate its policies automatically when they're uploaded. The social network is also changing the way it will notify users when their content is removed.
Why it matters: TikTok says the new system will not only improve the user experience, but will help reduce the number of distressing videos (such as those with violent content) that its safety team must review, freeing staff to focus on more nuanced content areas, like hate speech, bullying and harassment.
Details: Beginning this week, TikTok will test the automatic deletion of several categories of content that violate its policies, including minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities and regulated goods.
- The company says its technology has the highest degree of accuracy in identifying violating content across these categories.
- It's been testing these changes in markets outside the U.S. before bringing them stateside. TikTok says tests show the false positive rate for automated removals is 5%, while the rate of appeals of video removals has remained consistent.
Be smart: TikTok's safety team has always removed content its technology screened as a violation of its rules, but these changes will bring more automation to the process, making its moderation efforts more efficient. Its safety team will continue to review reports and content removal appeals from users.
The big picture: The changes are part of a wider company effort to be more transparent about the way TikTok moderates content.
- Last week, the company said for the first time it will report how many accounts it removes because they belong to users under age 13.
What's next: TikTok said that as a part of Friday's update, it will also change the way it notifies users when they violate the Community Guidelines.
- The new system takes actions on accounts based on the amount and severity of violations over time.
- Moving forward, TikTok will send users an in-app warning that their content violates its rules and could result in a ban on their account.
- If a user uploads content that TikTok has a zero-tolerance policy on (e.g., child sexual abuse material), it will result in an automatic ban.
- Thereafter, users' accounts could be suspended or restricted if they are found to have repeatedly violated community guidelines. They will be notified after several violations if their account is on the verge of being banned.
Yes, but: TikTok acknowledges that its tech isn't perfect and may inadvertently remove a video that doesn't violate its rules. In that scenario, TikTok says the content will be reinstated and the penalty erased from the user's record.
- Accrued violations, the company says, will expire from a person's record over time.