Election disinformation cycle isn't slowing

Illustration: Maura Losch/Axios
Nation-state election disinformation won't end on Nov. 5, and government officials are preparing to fend off a wave of lies about the outcome.
Why it matters: Conspiracy theories and partisan social media posts peddled by Russia, Iran and China now have a longer shelf life.
- Adversarial nations have a lofty goal this time around: to incite another event like the Jan. 6, 2021, attack on the U.S. Capitol.
Driving the news: Russia, China and Iran are better prepared than they were four years ago to flood the internet with disinformation after the polls close on Election Day, intelligence officials said in an assessment this week.
- The intelligence community now believes that these countries will keep conducting information operations through Inauguration Day on Jan. 20.
- Microsoft said in its own report this week that Russia, Iran and China will continue their disinformation campaigns to cast doubt on the U.S. election's outcome.
The big picture: Fake news stories, partisan-leaning social media posts and website defacements will be key tools for nation-state adversaries in the months leading up to the inauguration, according to officials.
- China, Russia and Iran are likely to amplify posts or spread lies that seek to undermine confidence in the election and chip away at trust in the democratic process.
Zoom in: Russia and Iran are "willing to at least consider tactics that could foment or contribute to violent protests," according to the intelligence community assessment.
- China, Russia and Iran could deface and take down election websites to feed unfounded concerns that votes are being tampered with.
- Some actors may also use AI and other tools to publish fake election results or create deepfake audio and video purporting to report unofficial results.
Between the lines: Concerns about disinformation spreading after Election Day are unique to this year's elections, Robert Johnston, CEO of Adlumin, told Axios.
- "That was not the narrative in 2020," said Johnston, who helped the Democratic National Committee investigate the 2016 Russia hack.
- The intelligence community feels "a sense of urgency to get something out before the election in hopes to let Americans know to be mindful of what you see," he added.
Flashback: In 2020, social media platforms had tougher content moderation policies in place to attempt to stop the spread of election lies.
- Facebook reduced the visibility of posts and comments that could incite violence, and it stopped suggesting groups for users to join that they might be interested in.
- But even then, Facebook groups remained a major recruitment tool for the "Stop the Steal" movement.
Yes, but: While social media sites are still investigating nation-state disinformation campaigns, they've also taken new stances on how to moderate political misinformation.
- Meta has since started allowing political ads that make incorrect statements about the outcome of the 2020 election and voter fraud.
- X owner Elon Musk has spread his fair share of conspiracy theories.
The bottom line: The intelligence community says getting ahead of conspiracy theories and proactive communication from local and state officials are the best ways to blunt the impact of nation-state disinformation.
- Voters should also put more trust in reputable news outlets than in random sources found on social media, Johnston said.
