May 28, 2023 - Science

Worries mount about misinformation in science

Illustration: Aïda Amer/Axios

Misinformation and disinformation are diminishing public trust in science and becoming an existential threat to the planet, researchers warned this week.

The big picture: False information about vaccines and climate change runs rampant on social media, where Americans are increasingly getting their news and information.

  • At the same time, a partisan gap in trust in scientists is widening: In a Pew Research survey last September, 24% of Republicans said scientific experts are typically better than others at making good policy decisions about scientific issues, compared with 55% of Democrats.
  • The causes and the effects of these intersecting trends — with the added force of generative AI — were the focus of a three-day meeting this week in Washington, D.C., organized by the Nobel Foundation and the National Academy of Sciences.

Details: Speakers described how the attention economy's business model intersects with the very human desire to belong — fueling an information environment riddled with misinformation and disinformation.

  • "The real problem isn’t people. It's the reward structure on social platforms," said Gizem Ceylan, a behavioral scientist at Yale University.
  • In a study published this year, Ceylan found the rewards of sharing online — attention and recognition in the form of likes and comments — create habits that drive people's behavior more than their motivations do. For example, the top 15% of habitual news sharers were involved in 30% to 40% of the false news shared in the study.
  • Earlier research zeroed in on people's desire to be a source of novel information — which was more likely to be false — as helping to drive the spread of misinformation and disinformation on Twitter.
  • Ceylan and her collaborators argue "social media sites could be restructured to build habits to share accurate information."

Providing accurate information isn't enough, several speakers emphasized.

  • "Our information problems today are not just technological ones. We aren't going to be able to fact check or innovate or police away racism and information harmful to our digital spaces," said Rachel Kuo, a professor of media studies at the University of Illinois Urbana-Champaign.
  • "Information has always been bound up in social, economic and political systems and structures," she added.

And finding reliable sources of information is difficult because of changes in society in recent decades, said Hahrie Han, a professor of political science at Johns Hopkins University.

  • We now have more social interactions than social relationships, and the relationships we do have are more likely to be transactional, she said. The actual social relationships that remain are more likely to be with people who are like us, because of the disintegration of civic infrastructure, she added. That fuels "echo chambers where disinformation can bounce around."
  • Han conducted research with one of the largest megachurches in the U.S. and says the church motto of "belonging comes before belief" should be leveraged to create relationships of belonging with people across the "social spaces we inhabit."
  • "When we think about combating misinformation, we have to address not only the narratives and the ideas, but also the social infrastructure that underlies how you're interacting with information," said Han.

Of note: Several speakers held up peer review as science's own process of vetting information.

  • But the scientific enterprise — the incentives for scientists to publish and for journals to attract readers, along with jargon and other features of scientific publishing — can also contribute to the spread of misinformation.

What to watch: Researchers presented findings from the first report of the International Panel on the Information Environment (IPIE), a group formed to assess tools to combat misinformation and disinformation, and inspired by the Intergovernmental Panel on Climate Change (IPCC).

  • Analyzing nearly 600 peer-reviewed studies, they found scientific evidence supports labeling content (tags noting fact-checking, funding sources and other contextual information) and publishing corrections to help people assess information on social media. There was less scientific consensus about moderating content.
  • They also made some recommendations for scholars studying misinformation, including standardizing definitions of misleading information and analyzing non-English language sources.

The bottom line: Misinformation is "so far-reaching that it is rapidly becoming an existential threat to the planet," said Sheldon Himelfarb, executive director of IPIE.
