Jan 8, 2020 - Technology

TikTok expands content rules, cracks down on misinformation


TikTok on Wednesday published a lengthy update to its rules of conduct, sharpening its definition of unacceptable content and its stance toward misinformation.

Why it matters: The move is an acknowledgment that TikTok's previous standards did not adequately address the onslaught of content-related issues that the video-sharing platform is starting to face as it grows.

What's new: TikTok's updated community standards are three times the length of the old guidelines. The platform has defined 10 issue areas that its content policies address:

  • Dangerous individuals and organizations
  • Illegal activities and regulated goods
  • Violent and graphic content
  • Suicide, self-harm and dangerous acts
  • Hate speech
  • Harassment and bullying
  • Adult nudity and sexual activities
  • Minor safety
  • Integrity and authenticity
  • Threats to platform security

Be smart: While TikTok's old guidelines addressed many of these areas, the new rules go into much greater detail.

  • For example, the company published a lengthy list of the individuals and organizations it considers dangerous and bars from its app, including groups affiliated with hate, extortion, organ trafficking, cybercrime and extremism. In doing so, it defined what it considers a terrorist organization.
  • It also greatly expanded its policies around minor safety, an issue that TikTok has had to grapple with in the U.S., especially in terms of children's data privacy. The new policies say explicitly that users must meet minimum age requirements to use TikTok.
  • The new rules don't ban misinformation outright. But they do explicitly say that misinformation that's created to cause harm to users or the larger public is prohibited, including misinformation about elections or other civic processes.

The big picture: TikTok's unprecedented rise has rattled U.S. lawmakers, who fear that the Chinese-owned app's ambiguous standards around content, as well as data privacy, could pose security risks to the U.S.

  • The viral video-sharing app owned by Chinese tech giant ByteDance says it's making these changes to offer "insight into the philosophy behind our moderation decisions and the framework for making such judgements."

Our thought bubble: One lesson TikTok could learn from its social media rivals is that the stronger a policy is, the harder it can be to enforce. It's one thing for TikTok to create a policy requiring the removal of accounts of children younger than 13, but another thing entirely to make the ban stick against one of its most popular user age groups.
