It's harder than you might think to reliably measure its scale or impact. (Dec 9, 2020 - Technology)
Each time a domestic politician embraces a disinformation campaign, it proves that they work. (Aug 5, 2020 - Politics & Policy)
Political strategists are looking for ways to navigate the new rules of Big Tech. (Jan 14, 2020 - Economy & Business)
Facebook, TikTok and Reddit all updated their policies on misinformation this week. (Jan 10, 2020 - Technology)
It's switching from employees to volunteers. (Oct 17, 2019 - Technology)
Congress' effort to squelch misinformation is broadening to target the cable companies that bring right-wing networks like Newsmax, OANN and Fox News to Americans' screens.
Why it matters: Conspiracy theories, false election claims, anti-vaccination propaganda and other kinds of misinformation spread through a complex ecosystem: Lies bubble up online, then get amplified when cable news channels repeat them, then spread further via social media. Breaking the cycle will require more than just stricter content moderation by online platforms.
Wall Street's populist uprising, the Capitol siege and a strong U.S. anti-vaccination movement show the power of memes in spreading misinformation and influencing communities online.
Why it matters: For years, there's been growing concern that deepfakes (doctored pictures and videos) would become truth's greatest threat. Instead, memes have proven to be a more effective tool in spreading misinformation because they're easier to produce and harder to moderate using artificial intelligence.
Health officials are worried that misinformation about the COVID-19 vaccines and infertility will drive down vaccination rates among women, the Washington Post reports.
Why it matters: False claims about the vaccines are rampant, and threaten to prevent the U.S. from vaccinating enough people to put the pandemic safely behind us.
China, Russia and Iran — drawing on one another’s online disinformation — amplified false theories that the COVID-19 virus originated in a U.S. bioweapons lab or was designed by Washington to weaken their countries, according to a nine-month investigation by AP and the Atlantic Council’s DFRLab.
Why it matters: Through a series of overlapping, if slapdash, efforts, America's global adversaries benefited from mutually reinforcing counter-narratives propagated online that aimed to falsely place responsibility for the pandemic on the U.S. and, often, to sow doubt about its actual origin within China.
Facebook on Monday became the latest in a run of tech firms and media outlets taking action to stem the tide of COVID-19 vaccine misinformation, but experts worry the scramble to limit vaccination skepticism may be too little, too late.
Why it matters: "With all of these press releases, what we don't understand is, how is it actually going to be operationalized?" says Claire Wardle, the U.S. director of anti-misinformation nonprofit First Draft. "Anti-vaxxers have historically always figured out where the policy guidelines are and figure out a way around them."
It will take an all-out national effort to dismantle the radicalization pipeline that has planted conspiracy theories in the heads of millions of Americans and inspired last month's attack on the Capitol, experts tell Axios.
A majority of Americans think social media "has played a role in radicalizing people," according to a new poll from Accountable Tech and Greenberg Quinlan Rosner shared exclusively with Axios.
The big picture: As misinformation proliferates online about COVID-19, vaccines and politics, social platforms are walking a tightrope between protecting freedom of speech and tamping down the flow of misleading content.
Facebook says it will take tougher action during the pandemic against claims that vaccines, including the COVID-19 vaccines, are not effective or safe.
Why it matters: It's a partial reversal from Facebook's previous position on vaccine misinformation. In September, Facebook CEO Mark Zuckerberg said the company wouldn't target anti-vaccination posts the same way it has aggressively cracked down on COVID misinformation.
Verified accounts on Twitter shared more content from deceptive websites than ever in 2020, according to new research from the German Marshall Fund shared exclusively with Axios.
Why it matters: Verified accounts are supposed to help social media users seek out trustworthy information and know who they're hearing from. If verified users constantly share false information, it defeats the purpose and reinforces false narratives.
Supporters of former President Donald Trump who thought he was about to stop the inauguration, seize power and crush his enemies were left blinking in the sunlight Wednesday as President Biden took the oath of office.
Why it matters: It's an inflection point for anyone who realizes they've been strung along by QAnon and related strands of pro-Trump magical thinking. They could either retreat from conspiracy theories or tumble deeper down the rabbit hole.