Feb 24, 2021 - Technology

TikTok removed less than 1% of videos in second half of last year

Illustration: Aïda Amer/Axios (TikTok logo made from binary code)

TikTok said Wednesday that it removed less than 1% of the videos uploaded to its platform during the second half of last year, a period spanning the election and the start of the COVID-19 vaccine rollout.

Why it matters: Most of the videos the company removed were taken down for child and adult nudity, a trend seen across most tech platforms. About 13% of the removed content came from the U.S.

Details: TikTok says the vast majority of violating videos were removed by its automated moderation tools. Most videos (92%) were removed before a user reported them, and 83% were removed before they received any views.

By the numbers: In total, TikTok said it removed more than 6 million accounts for violating its community guidelines and 9.5 million spam accounts.

  • The company says it prevented more than 173 million bot accounts from being created and rejected about 3.5 million ads for violating its policies.
  • 2% of the videos TikTok removed violated its hate speech policies, up from 0.8% in the first half of 2020. The company expanded its hate speech policies last October.

The big picture: TikTok has long tried to distance itself from politics by rejecting political ads and asserting its platform is built for sharing joy and entertainment. Still, the misinformation problem plaguing the internet has forced the tech giant to weigh in on a few issues.

  • The company built a COVID-19 information hub in August to help provide its users with access to credible information about the pandemic. The company says its hub was viewed more than 2.6 billion times globally. It built a 2020 U.S. elections guide in September, which it says was visited nearly 18 million times.
  • PSAs on hashtags directing users to the World Health Organization and local public health resources were viewed more than 38 billion times. PSAs on election-related hashtags were viewed more than 78 billion times.
  • The company said it removed over 50,000 videos for promoting COVID-19 misinformation and nearly 350,000 videos in the U.S. for election misinformation, disinformation or manipulated media.
  • Nearly half a million videos were deemed ineligible for distribution on TikTok's main "For You" feed for featuring inaccurate election information.

What to watch: The tech giant says it's learned a lot over the past year about what works in terms of content moderation and what doesn't.

  • One thing it says worked well was focusing early on both foreign and domestic election threats leading up to the U.S. election, which it says helped it get ahead of efforts to undermine the integrity of the election results.
  • One area it says could be improved is investing more in educating creators and brands about disclosure requirements for paid influencer content, particularly political content.