YouTube cracks down on anti-vax misinformation
YouTube is beefing up its misinformation policies to crack down on anti-vaccine content beyond COVID-19 vaccinations, executives told Axios.
Why it matters: Under the new policy, YouTube will terminate the channels of what it calls prominent vaccine misinformation spreaders, including the channel of the Robert F. Kennedy Jr.-affiliated Children's Health Defense.
- YouTube will also terminate the channels belonging to Joseph Mercola, Erin Elizabeth and Sherri Tenpenny — all identified by the Center for Countering Digital Hate as among a dozen playing leading roles in spreading online misinformation about COVID vaccines.
Driving the news: YouTube will remove misinformation about currently administered vaccines that are approved and confirmed to be both safe and effective by local health authorities and the World Health Organization, YouTube vice president of global trust and safety Matt Halprin told Axios.
- That means YouTube will take down videos that claim such vaccines are dangerous; cause cancer, infertility or autism; or contain microchips, Halprin said.
The new policy builds on existing rules against COVID-19 vaccine misinformation, which have led to the removal of 130,000 videos since October. YouTube says it has taken down more than 1 million videos for violating its overall COVID-19 medical misinformation policy.
Yes, but: There are exceptions to the rules.
- YouTube will allow scientific discussions, such as content about a specific clinical trial, or videos about historic vaccine successes and failures.
- Personal testimony will be allowed — such as a parent describing their own experience with their child's vaccination — but with limits.
- "If the speaker then goes on to generalize and make calls for all parents not to vaccinate or makes broad claims about vaccines not being safe or effective," that would be removed, Halprin said.
The big picture: Anti-vaccine influencers have been a problem on social media since long before the pandemic.
- "In the weeks and months that followed the launch of the COVID-19 vaccine misinformation policy, we observed content and realized that there appeared to be an interaction between general vaccine hesitancy that was being promoted on the platform and COVID-19 vaccine misinformation," Halprin said. "We felt like we need to address both."