Facebook says it sees signs that COVID vaccine hesitancy is declining
Facebook said Wednesday it's seeing signs that resistance to the COVID-19 vaccine is weakening both in the United States and abroad, though it acknowledged it still doesn't have hard numbers on how frequently misinformation is being shared on its platforms.
Why it matters: Facebook touts a survey showing improved attitudes toward the vaccines, but that survey finding raises questions, as other polling has shown significant and entrenched hesitancy, especially in the U.S. It also doesn't show that Facebook or other social media can be credited for any shift.
When it comes to the spread of hate speech, child exploitation and other types of problematic content, Facebook says the key metric is prevalence — that is, how often such content is shown — and points to declines in such information appearing on its site. Hate speech, for example, accounts for just five views out of every 10,000.
- But Facebook said it doesn't yet have data to share when it comes to COVID-19 misinformation.
- Instead, Facebook points to other statistics, such as how many times people viewed authoritative content about the vaccine, or how many users posted a profile picture frame highlighting their vaccination status.
- The company acknowledged that anti-vaccine sentiment is often less overt than an explicit signal like a profile picture frame, making it harder to measure.
The other side: Critics blasted Facebook, with Accountable Tech executive director Nicole Gill saying that "Facebook is teeming with deadly vaccine misinformation."
- "It is hard to overstate the shamelessness of Facebook executives who continue to portray the company as heroes of this pandemic and dutiful custodians of authoritative information," Gill said in a statement.
- "You don’t have to take my word for it – just spend a few hours on the platform," Gill said. "Or listen to the heartbreaking testimonials from health workers who have lamented their dying patients’ exposure to those viral lies."
- The Biden Administration also criticized Facebook. “In the middle of a pandemic, being honest and transparent about the work that needs to be done to protect public health is absolutely vital, but Facebook still refuses to be straightforward about how much misinformation is circulating – and being actively promoted – on their platform," White House spokesman Michael Gwin said in a statement to Axios.
Facebook also released for the first time a report on its most widely viewed content, including specific links and domains, aiming to show that such content is a mix that includes memes and posts from nonprofit organizations.
- The New York Times' Kevin Roose has used Facebook's CrowdTangle data to show that the content that is most widely engaged with tends to be right-leaning political content.
- Facebook's new data doesn't contradict these findings, but offers up a different metric.
- "The narrative that has emerged is quite frankly wrong," Guy Rosen, Facebook's vice president of Integrity, said on a conference call with reporters. "We think it is really important for this data to be out there."
The big picture: Facebook has long argued that engagement alone shouldn’t be used as a proxy for what content is most prominent on its platform. Instead, it has argued that reach is a better metric.
- But the tech giant has for years refused to release reach data, frustrating journalists who rely on engagement data from CrowdTangle.
Between the lines: Earlier this summer, the New York Times reported that Facebook was moving CrowdTangle under its community standards enforcement team to help it better identify threats.
- Sources say the move caused frustration among CrowdTangle employees, who worried about whether the data would remain sufficiently accessible to the public.
- Facebook is now releasing reach data, but on its own terms.
- The lists don’t show what’s garnering the most reach at any given moment, as CrowdTangle does; instead they cover longer stretches of time, to show generalized patterns.