Facebook says it will crack down on COVID vaccine misinformation
Facebook says it will take tougher action during the pandemic against claims that vaccines, including the COVID-19 vaccine, are not safe or effective.
Why it matters: It's a partial reversal of Facebook's previous position on vaccine misinformation. In September, Facebook CEO Mark Zuckerberg said the company wouldn't target anti-vaccination posts as aggressively as it had cracked down on COVID-19 misinformation.
Details: Facebook is doing four things to crack down on misinformation about COVID-19 and vaccinations in general, following consultation with the World Health Organization:
- Updating its misinformation policies to bar the posting of debunked claims about vaccines, such as the idea that vaccines are not effective or cause autism. Groups, pages and accounts on Facebook and Instagram that repeatedly share these debunked claims may be removed altogether, the company said.
- Adding directions on how and where to get vaccinated to its COVID-19 information center, tapping information provided by users' local health officials.
- Giving $120 million in ad credits to help health agencies, NGOs and UN agencies reach billions of people with information about the COVID-19 vaccine and preventive health.
- Making it harder to find unchecked vaccine misinformation on its platforms by returning validated vaccine info when users search for terms related to debunked claims.
The big picture: The actions follow the playbook the tech giant has used to quash misinformation around other highly sensitive events, including the election.
- Ahead of the 2020 election, Facebook created a voter information center that provided verified voting information, directed people on how to register to vote and showed them where their local polling places were.