Mar 18, 2020 - Technology

TikTok forms outside group to help shape content moderation policies

Photo: Lionel Bonaventure/AFP via Getty Images.

TikTok on Wednesday unveiled a group of outside advisers who will help guide its content moderation policies, drawing on expertise in child safety, hate speech, misinformation and other areas.

The big picture: Online platforms are facing intense scrutiny from lawmakers and even the Justice Department over how they decide what their users can and can't say and do.

Details: The Content Advisory Council will discuss existing and potential future policies against misinformation and election interference at its first meeting later this month, TikTok U.S. general manager Vanessa Pappas wrote in a blog post.

  • Dawn Nunziato, a George Washington University Law School professor and co-director of the Global Internet Freedom Project who specializes in online free speech issues, will chair the council.

Other members include:

  • Mary Anne Franks, a University of Miami Law School professor and critic of Section 230 of the Communications Decency Act, which shields online companies from liability for user-posted content and gives them broad license to moderate their platforms as they see fit;
  • Rob Atkinson, the president of the tech policy think tank Information Technology and Innovation Foundation; and
  • Hany Farid, a UC Berkeley professor who focuses on digital image analysis.

"We want to surround ourselves with experts who can both evaluate the actions we've taken and provide guidance on additional measures we could be pursuing," Pappas said.

Background: TikTok in October said it was working with lawyers from the firm K&L Gates — including former Congressmen Bart Gordon and Jeff Denham — to help form the external advisory group.

  • The company faces criticism from lawmakers on issues including privacy, content moderation, security and its ties to China, with Republican Sens. Josh Hawley and Rick Scott introducing a bill that would ban federal government employees from using the video-sharing app on their work devices.
  • In January, TikTok updated its community standards, providing more details on content it deems unacceptable and its approach to misinformation.
  • And earlier this month, TikTok said it would open a transparency center in Los Angeles that will allow experts to observe its content-moderation process.

Go deeper

TikTok plans Los Angeles "transparency center" to assuage critics

Illustration: Aïda Amer/Axios

TikTok said Tuesday that it plans to open a "transparency center" in Los Angeles where experts can observe the Chinese-owned platform's moderation processes.

Why it matters: Critics have worried over the degree to which China might influence TikTok's content policies and practices, now or in the future.

GOP senators introduce bill to ban TikTok on government devices

Photo illustration: Omar Marques/SOPA Images/LightRocket via Getty Images.

Republican Sens. Josh Hawley and Rick Scott are introducing legislation to bar federal employees from using TikTok on government devices, citing national security concerns.

The big picture: Chinese tech companies like TikTok parent ByteDance are drawing rising scrutiny from policymakers who argue that Beijing can tap them to harvest vast amounts of data from Americans.

Report urges alternative to tampering with tech's liability shield

Illustration: Aïda Amer/Axios

A new report out Tuesday from a nonprofit focused on online free expression is calling on federal lawmakers to mandate more transparency from tech companies rather than weaken the industry's liability shield.

Why it matters: Internet platforms could embrace policies like transparency requirements as a far more palatable alternative to eroding their immunity from lawsuits over user-posted content, which they say is vital to their existence.