
Illustration: Aïda Amer/Axios
When the dust finally clears from the fight over TikTok, whoever winds up running the burgeoning short-video-sharing service is likely to face a world of trouble trying to manage speech on it.
Why it matters: Facebook’s story already shows us how much can go wrong when online platforms beloved by passionate young users turn into public squares.
- It took more than a decade for Facebook’s college-student connection service to produce the morass of hate speech, misinformation, privacy violations and accusations of bias that so many see in it today.
- TikTok is already accelerating down a similar road.
Driving the news: On Tuesday, TikTok head Vanessa Pappas called on social media platforms to form a coalition to coordinate ways of stopping the redistribution of viral videos depicting self-harm.
- Earlier this month, a video of a man shooting himself spread widely on TikTok.
- TikTok’s European director of public policy told the U.K. Parliament Tuesday that the video was “the result of a coordinated raid from the dark web.”
What they're saying: Back when it looked like Microsoft was going to be the "winner" of the TikTok sweepstakes, the company's co-founder Bill Gates called the social media business "a poison chalice."
That's what Oracle and Walmart now expect to drink as they try to finalize a deal that would avert a ban threatened by President Trump and move TikTok to the U.S., away from its Chinese owner, ByteDance.
The problem: Many U.S. teens are treating TikTok not just as a channel for light-hearted fun but as a space to discuss personal problems, traumas — and politics.
- The more serious the TikTok conversation gets, the more potential mischief and "coordinated inauthentic behavior" its users will face from bad actors.
- In its latest transparency report, released Tuesday, TikTok said it took down 104,543,719 videos for violating its standards in the first half of 2020; 90% of them were removed before being viewed by other users.
How it works: TikTok tripled the length of its community guidelines in January, aiming to draw lines around the kinds of content it would try to keep away from its young users.
- TikTok's recommendation algorithm is tuned to amplify novelty and blow sparks of engagement into flames of popularity — a dynamic that's easily exploitable by trolls (a toy sketch of that dynamic appears after this list).
- Whoever is running TikTok can certainly take steps to tweak the algorithm and bar undesirable content, but they face a perpetual game of whac-a-mole no matter what.
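For illustration only, here is a minimal Python sketch of the kind of scoring dynamic described above: a ranking score that multiplies an engagement rate by a novelty boost, which lets a small coordinated burst of early likes outrank broader organic interest. Every name, weight and formula below is an assumption made up for the example; this is not TikTok's actual algorithm or code.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    views: int
    likes: int
    hours_old: float

def rank_score(v: Video) -> float:
    """Toy score: engagement rate, weighted heavily toward brand-new uploads."""
    engagement_rate = v.likes / max(v.views, 1)   # "sparks of engagement"
    novelty_boost = 10.0 / (1.0 + v.hours_old)    # fanned hardest while the video is new
    return engagement_rate * novelty_boost

# A day-old video with broad organic interest vs. a fresh upload whose
# first 500 viewers include a coordinated block of trolls hitting "like."
organic = Video("organic_hit", views=100_000, likes=9_000, hours_old=24.0)
troll = Video("troll_upload", views=500, likes=400, hours_old=1.0)

for v in sorted([organic, troll], key=rank_score, reverse=True):
    print(f"{v.video_id}: score={rank_score(v):.2f}")

# Output: the troll upload (score 4.00) outranks the organic hit (score 0.04),
# because a small burst of early engagement on a brand-new video dominates
# this kind of novelty-weighted score. That is the whac-a-mole dynamic moderators fight.
```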
The traits that set TikTok's platform apart also expose it to its own set of problems.
1. It's entirely video-based. Technology has improved enormously at helping platforms remove harmful text, but video — and particularly manipulated video — remains a much tougher medium in which to police misinformation and fraud.
2. It's used mostly by kids and young people. The company received the FTC's largest-ever fine for children's privacy violations last year, and children's advocacy groups are already urging TikTok's new owners to revamp the company's children's privacy rules.
3. It has a filter bubble problem. TikTok has already conceded that its coveted algorithm can produce "filter bubbles," reinforcing users' existing preferences. TikTok says it wants to avoid politics, but any user can see that the platform has already become flooded with political accounts and messaging. The platform's role in the 2020 election may not be fully known, and scrutinized, until after the election — as happened with Facebook in 2016.
4. There may be a diversity issue. TikTok has been embraced by the creative community, especially musicians. But according to a report in The Intercept, the company has in the past asked moderators to suppress content from "ugly" or "poor" people to keep undesirable users away from the service. (TikTok told The Intercept that the rules “represented an early blunt attempt at preventing bullying, but are no longer in place.")
5. It's still in early stages of monetization. New owners are going to want to see a bigger financial return, but every effort to wring revenue from users will invite new strategies to game the service's rules.
If the Oracle/Walmart plan for TikTok's future overcomes all the obstacles it faces, the deal's terms will add further dimensions to the company's content moderation problems.
1. It's unclear who will control TikTok's prized recommendation algorithm.
- China has restricted the foreign sale of domestically developed AI technologies.
- The less direct control over the recommendation code the company has, the less power it will have to defend against misinformation attacks and hate speech.
2. The high-profile fight over TikTok's future guarantees that many groups — such as nationalists in China and pro- and anti-Trump factions in the U.S. — will be itching to flood it with their views.
3. The companies inheriting TikTok's management in the U.S., Oracle and Walmart, have no experience in the business of online moderation. They'll be struggling to manage the service's content while also defending its security under a merciless international spotlight.
The bottom line: The TikTok saga has regularly been portrayed as a fight for a precious prize, but the winner is likely to face a difficult reckoning.