Facebook on Monday became the latest in a run of tech firms and media outlets taking action to stem the tide of COVID-19 vaccine misinformation, but experts worry the scramble to limit vaccination skepticism may be too little, too late.
Why it matters: "With all of these press releases, what we don't understand is, how is it actually going to be operationalized?" says Claire Wardle, the U.S. director of anti-misinformation nonprofit First Draft. "Anti-vaxxers have historically always figured out where the policy guidelines are and figure out a way around them."
It will take an all-out national effort to dismantle the radicalization pipeline that has planted conspiracy theories in the heads of millions of Americans and inspired last month's attack on the Capitol, experts tell Axios.
A majority of Americans think social media "has played a role in radicalizing people," according to a new poll from Accountable Tech and Greenberg Quinlan Rosner shared exclusively with Axios.
The big picture: As misinformation proliferates online about COVID-19, vaccines and politics, social platforms are walking a tightrope between protecting freedom of speech and tamping down the flow of misleading content.
Facebook says it will take tougher action during the pandemic against claims that vaccines, including the COVID-19 vaccine, are not effective or safe.
Why it matters: It's a partial reversal from Facebook's previous position on vaccine misinformation. In September, Facebook CEO Mark Zuckerberg said the company wouldn't target anti-vaccination posts the same way it has aggressively cracked down on COVID misinformation.
Verified accounts on Twitter shared more content from deceptive websites than ever in 2020, according to new research from the German Marshall Fund shared exclusively with Axios.
Why it matters: Verified accounts are supposed to help social media users seek out trustworthy information and know who they're hearing from. If verified users constantly share false information, it defeats the purpose and reinforces false narratives.
Supporters of former President Donald Trump who thought he was about to stop the inauguration, seize power and crush his enemies were left blinking in the sunlight Wednesday as President Biden took the oath of office.
Why it matters: It's an inflection point for anyone who realizes they've been strung along by QAnon and related strands of pro-Trump magical thinking. They could either retreat from conspiracy theories or tumble deeper down the rabbit hole.
The online far right is about to face a cold reality it long denied was a possibility: the post-Trump era.
What's happening: Fringe-right internet users are broadly poised to enter the Biden era in one of three states: Denial, disenchantment or determination to use the moment to their advantage.
Last week's riot at the Capitol was many things, but perhaps chiefly it was the culmination of four years of information warfare waged against the country from within the Oval Office.
Why it matters: A sprawling disinformation campaign led by President Trump — and buttressed by his allies in the media, online and in Congress — has severely destabilized the U.S. and made further acts of violence and would-be insurrection a near certainty.
YouTube said Tuesday that it has taken down newly posted video content from President Trump for violating its policies against inciting violence. In addition, it has assessed a "strike" against the account, which means the president can't upload new videos or livestream to the account for a minimum of 7 days.
In a court filing late Tuesday, Amazon said it booted right-wing social network Parler from its AWS cloud service after flagging dozens of pieces of violent content starting in November.
Why it matters: Parler is suing Amazon, arguing that its expulsion violates antitrust laws. In its response, Amazon cites the violent content, as well as its protection under Section 230 of the Communications Decency Act, among its defenses.