Dec 20, 2022 - Technology

TikTok to explain why videos are recommended to users

Illustration: Aïda Amer/Axios

TikTok on Tuesday said it will soon start explaining to users why it is recommending a particular video to them.

What's happening: The new feature, which is expected to roll out in users' main "For You" feed in coming weeks, is part of a wider effort by TikTok to be more transparent about how its content-choosing algorithm works.

Details: Users will be able to tap a question-mark button on the side of a video, labeled "Why this video."

  • TikTok will list reasons a certain video was recommended to a user, citing categories like the user's previous interactions, the accounts the user follows, content that's been posted recently, or content that's growing popular in a user's region.

Between the lines: TikTok has been pushing to make parts of its algorithm and content moderation decisions more transparent in recent years as it faces increasing scrutiny from lawmakers about its ties to China.

  • Last year, the company said it would adjust its algorithm to avoid showing users the same types of videos too frequently, creating filter bubbles.
  • The year prior, it walked reporters through a tutorial on how its algorithm selects videos to show users.

Yes, but: Lawmakers are still skeptical that TikTok's data security and content recommendation practices are conducted independently of influence from the Chinese state.

  • Some states have begun to ban TikTok from government devices, and there's a new effort to pass such a ban nationally.
  • TikTok is currently undergoing a national security review with the Committee on Foreign Investment in the United States (CFIUS).
  • As part of that process, TikTok has tapped Oracle to independently vet its algorithms and content moderation models.

The big picture: Lawmakers have been evaluating ways to regulate the algorithmic distribution of content to make sure tech platforms aren't biased in their recommendations and to make it easier to stop misinformation. As a result, more platforms are taking steps to be transparent with users about their algorithms.

  • Meta published new details about how its video algorithm works in August.
  • Google added a new feature to give users more context about their search results last year.

Editor's note: This story has been corrected to say that lawmakers are skeptical of TikTok's practices, not Twitter's.
