COVID-19's misinformation wake-up call
Outrage over misinformation online has been rising for years, but it was the flood of false information surrounding COVID-19 and vaccinations that finally pushed health officials, tech companies and politicians to take strong action.
Why it matters: Political misinformation can sway elections, but COVID misinformation can kill thousands of people a day.
Driving the news: The CEOs of Twitter, Facebook and Google will testify Thursday before the powerful House Energy & Commerce Committee. Chairman Frank Pallone (D-N.J.) said laws should be changed to weaken what he describes as the social media giants' financial incentives to amplify misinformation.
The big picture: Alarm over the potential of COVID-19 misinformation to cause harm spread first from public health officials to tech firms, then to legislators.
The extent of pandemic-related misinformation and disinformation took the public health community by surprise.
- At the start of the shutdowns last year — when many aspects of the virus were still unknown — some researchers were alarmed to see that people weren't dismissing bogus claims about COVID-19.
- "I thought people are surely going to be engaged in science and trust science now," said Jon Agley, associate professor in Indiana University's School of Public Health. He took an informal survey of a few hundred people and found he was wrong.
- What he did: He then led a larger study to identify profiles of beliefs about different COVID-19 narratives — the scientific narrative vs. misinformation, such as claims that the virus is caused by 5G cellphone networks or was developed as a military weapon.
- What he found: People who found COVID misinformation to be more believable didn't necessarily reject the scientifically accepted narrative; some who believed conspiracy theories also thought the scientific explanation might be possible. The study also found that most individuals who believe one misinformation narrative believe several others as well.
In a separate study, Emily Vraga, assistant professor of health communication at the University of Minnesota's School of Journalism and Mass Communication, tested whether sharable infographics created by the World Health Organization would effectively debunk misinformation.
- What she found: Exposure to a corrective graphic on social media from an official health source helped people discount a false COVID-19 prevention strategy.
- The lowered misperceptions persisted for more than a week when the graphic was shared either preemptively or in response to false information.
- "Putting high quality information out there that is easy to understand and share is really important to respond effectively to misinformation," Vraga said. "We need experts creating this kind of content that is optimized for social media."
Between the lines: For health officials, the proliferation of pandemic misinformation highlighted the need to be more proactive and consistent about putting factual information from trusted, official sources on social media platforms.
- It also showed the need to be transparent about the scientific process, which is slow and evolves as new data and evidence are gathered. There were so many unknowns about the virus in the early days of the pandemic that people filled the void with speculation and doubts about scientific explanations.
- "When we invest a lot of resources in understanding something from a scientific perspective, it makes sense that we also invest resources in how to communicate effectively," Agley said.
Social media networks have not been able to keep up with the tsunami of misinformation, despite their sizable efforts to remove false information and posts that violate their rules.
- Removing political misinformation is a minefield. For example, taking down right-wing conspiracy theories landed the companies in hot water with conservatives, who claimed the social media sites were deliberately censoring their views (though there has been no evidence to support that claim).
- Tech firms in general are wary of judging the veracity of users' posts. But the significant public health harm wrought by COVID-19 misinformation has been a tipping point in pushing them to take stronger action.
- For example, Mark Zuckerberg told "Axios on HBO" in September that Facebook wouldn't treat anti-vaccination posts as COVID-19 misinformation. By February, the policy had changed — and now, the company is working on a massive study of users who are skeptical of vaccines, per the Washington Post.
For the members of Congress who'll grill the companies' CEOs on Thursday, the spread of COVID-19 misinformation is another troubling example of how out of control false narratives have become on social media.
- "It's not like it just stays on the internet," Pallone told Axios' Margaret Harding McGill. "It's disinformation and extremism that gives rise to racial tension and the attack on the Capitol on January 6. And so I think our laws have to change."