Jun 30, 2021 - Technology

Exclusive: TikTok removed 7 million+ accounts for being under 13 in Q1

Illustration: Sarah Grillo/Axios

TikTok on Wednesday said it removed more than 7 million accounts "from the full TikTok experience" during the first quarter of 2021 for potentially belonging to people under the age of 13.

Why it matters: The disclosure marks the first time the tech giant has revealed the number of accounts it has had to address for possibly belonging to kids and pre-teens.

  • It's been reported that close to one-third of TikTok's daily active users in the U.S. were under 14.

Be smart: The minimum age to use TikTok's full experience is 13 years old. Users under 13 who use the full app or post content without properly disclosing their age will have their accounts removed.

  • TikTok currently has a dedicated section in its app in the U.S. that includes curated content for children under 13. It doesn't let children under 13 post videos or comments, although some child safety groups argue the restriction can be bypassed by kids who simply lie about their birth date.
  • TikTok has been pushing to address its under-13 community more aggressively ever since it was forced to pay a $5.7 million fine to the FTC for children's privacy violations in 2019.
  • In January, it introduced new default privacy settings for teens. Last August, it debuted a new "Family Pairing" feature, which allows parents to control how long their kids spend on the app and who can message them. In November, it added more oversight tools, including allowing parents to control whether or not teens can search for content.

By the numbers: TikTok said that the 7 million+ accounts it removed for potentially belonging to a person under 13 make up more than half of the 11 million+ accounts that it removed for violating its community standards policies.

  • When it comes to content, TikTok said that it removed 61,951,327 videos globally for violating its terms in Q1, amounting to less than 1% of all videos uploaded on the platform. A majority of videos (91.3%) were identified and removed before a user reported them.
  • Most video removals came from the U.S., followed by Pakistan, Brazil and Russia. The most prevalent policy violation for videos removed was minor safety, followed by illegal activities and regulated goods, adult nudity and sexual activities, harassment and bullying, and violent and graphic content.

The company also debuted its first-ever security overview of its "global bug bounty program," through which TikTok compensates users and outside researchers for reporting bugs.

  • TikTok disclosed as part of the bounty program that in Q1, it received 33 valid bug submissions and resolved 29 of them. It also said it received and published 8 public disclosure requests.

What’s next: TikTok usually publishes community standards enforcement reports twice per year. But moving forward, it will publish that information on a quarterly basis, while information related to legal requests will continue to be published bi-annually.