An "infodemic" of misinformation and disinformation has helped cripple the response to the novel coronavirus.
Why it matters: High-powered social media accelerates the spread of lies and political polarization that motivates people to believe them. Unless the public health sphere can effectively counter misinformation, not even an effective vaccine may be enough to end the pandemic.
Driving the news: This month the WHO is running the first "infodemiology" conference, aimed at studying the infodemic of misinformation and disinformation around the coronavirus.
What they're saying: While fake news is anything but new, the difference is the infodemic "can kill people if they don't understand what precautions to take," says Phil Howard, director of the Oxford Internet Institute and author of the new book "Lie Machines."
- Beyond its effect on individuals, the infodemic erodes trust in government and science at the moment when that trust is most needed.
- A study by the Reuters Institute found 39% of English-language misinformation assessed between January and March included false claims about the actions or policies of authorities.
By the numbers: 38% of Americans surveyed by Pew in June said that compared to the first couple of weeks of the pandemic, they found it harder to identify what was true and what was false about the virus.
How it works: Misinformation and disinformation have always been a destabilizing feature of infectious disease outbreaks. But several factors have made the situation worse with COVID-19.
- An evolving outbreak: COVID-19 is new, and as scientists have learned more about the virus, they've had to change recommendations. That's how science works, but "if you're distrustful of authorities, an expert taking a position different than it was three days ago just confirms your bias," says Joe Smyser, CEO of the Public Good Projects.
- Social media: While experts give some credit to companies like Facebook and Twitter for their efforts to stem the spread of coronavirus misinformation, the reality is that platforms built on engagement will often end up as conduits of conspiracy content, which Howard notes tends to be unusually "sticky."
- Disinformation warfare: In June, the European Commission issued a joint communication blaming Russia and China for "targeted influence operations and disinformation campaigns around COVID-19 in the EU." And those campaigns are effective — in a recent study, Howard found disinformation from Russian and Chinese state sources often reached a bigger audience on social media in Europe than reporting by major domestic outlets.
- Political and media polarization: "In our hyper-polarized and politicized climate, many folks just inherently mistrust advice or evidence that comes from an opposing political party," notes Alison Buttenheim of the University of Pennsylvania School of Nursing. Conservatives are particularly vulnerable — an April study found Americans who relied on conservative media were more likely to believe conspiracy theories and rumors about the coronavirus.
What to watch: Whether the infodemic causes a significant chunk of the U.S. public to opt out of a future COVID-19 vaccine.
- In a CNN poll in May, a third of Americans said they would not try to get vaccinated against COVID-19. If that proportion holds or rises, a vaccine would be "unlikely" to provide herd immunity, warns Anthony Fauci.
- The highly organized and internet-savvy anti-vaxxer community is already targeting a potential COVID-19 vaccine. That includes attending Black Lives Matter events to convince protesters that "vaccines are part of structural racism," says Smyser.
The bottom line: While the pandemic wasn't human-made, the infodemic surely is. But that means public health experts and the public itself can put a halt to it with the right strategy.
Go deeper: I talk a bit about this at the top of this morning's "Axios Today" podcast.