Exclusive: TikTok tackles filter bubbles
TikTok is adjusting its algorithm to avoid showing users the same types of videos too frequently.
Why it matters: TikTok says the adjustments are being made to ensure it isn't inadvertently reinforcing viewpoints that could be bad for a person's wellbeing.
Details: The company is testing ways to avoid recommending content that isn't harmful when viewed sparingly but could be problematic when viewed sequentially, like extreme dieting videos.
- It's also evaluating whether its algorithms inadvertently recommend content that could take a toll on someone's health if it becomes the majority of what they watch, like videos about loneliness or extreme weight loss.
- TikTok says these efforts are being informed by experts in medicine, clinical psychology, AI ethics and more.
Be smart: TikTok's algorithm was built to avoid redundancies that could bore users, like showing sequential videos from the same creators.
- New changes would expand these efforts by ensuring that videos aren't just from different creators and using different audio, but also don't feature the same types of topics that, consumed over and over, could be harmful.
The big picture: TikTok's stated mission has been to create a joyful and positive experience. But as it's gotten bigger, it's had to tackle thorny content moderation issues like violence and misinformation.
What to watch: TikTok's elusive algorithm has long been a topic of intrigue because it can be so good at serving users the content they crave.
- But TikTok recognizes that giving people opportunities to customize their experience is important if it wants them to feel comfortable on the platform.
What's next: To that end, TikTok is testing ways to allow users to select certain topics or hashtags to avoid in their main "For You" feed.
- For example, someone whose dog recently died may want to avoid seeing videos about dogs, or a vegetarian may want to avoid videos with meat.
Go deeper: Inside TikTok's killer algorithm