Verified accounts on Twitter shared more content from deceptive websites than ever in 2020, according to new research from the German Marshall Fund shared exclusively with Axios.
Why it matters: Verified accounts are supposed to help social media users seek out trustworthy information and know who they're hearing from. If verified users constantly share false information, it defeats the purpose and reinforces false narratives.
Supporters of former President Donald Trump who thought he was about to stop the inauguration, seize power and crush his enemies were left blinking in the sunlight Wednesday as President Biden took the oath of office.
Why it matters: It's an inflection point for anyone who realizes they've been strung along by QAnon and related strands of pro-Trump magical thinking. They could either retreat from conspiracy theories or tumble deeper down the rabbit hole.
The online far right is about to face a cold reality it long denied was a possibility: the post-Trump era.
What's happening: Fringe-right internet users are broadly poised to enter the Biden era in one of three states: denial, disenchantment or determination to use the moment to their advantage.
Last week's riot at the Capitol was many things, but perhaps chiefly it was the culmination of four years of information warfare waged against the country from within the Oval Office.
Why it matters: A sprawling disinformation campaign led by President Trump — and buttressed by his allies in the media, online and in Congress — has severely destabilized the U.S. and made further acts of violence and would-be insurrection a near certainty.
YouTube said Tuesday that it has taken down newly posted video content from President Trump for violating its policies against inciting violence. In addition, it has assessed a "strike" against the account, which means the president can't upload new videos or livestream to the account for a minimum of seven days.
In a court filing late Tuesday, Amazon said it booted right-wing social network Parler from its AWS cloud service after flagging dozens of pieces of violent content starting in November.
Why it matters: Parler is suing Amazon, claiming its expulsion violates antitrust laws. In its response, Amazon cites the violent content, as well as its protection under Section 230 of the Communications Decency Act, among its defenses.
Amazon's decision to boot conservative chat site Parler from its hosting platform, on the heels of Twitter and many other services banishing President Trump, brings three decades of hot argument over online speech to a boil.
Why it matters: Four years of a president who behaved like a boundary-pushing online troll, fostering mayhem that culminated in Wednesday's assault on the Capitol, finally forced the executives who control today's internet to draw lines.
Wednesday's assault on the U.S. Capitol was an appalling shock to most Americans, but to far-right true believers it was the culmination of a long-unfolding epic.
The big picture: A growing segment of the American far right, radicalized via social media and private online groups, views anyone who bucks President Trump's will as evil. That includes Democrats, the media, celebrities, judges and officeholders — even conservatives, should they cross the president.
A broad coalition of nonprofits is sending an open letter today calling on the incoming Biden-Harris administration to do a better job of both educating the public about misinformation and taking stronger action to protect the health care system, the voting process and other critical institutions.
Why it matters: Misinformation amplified on social media has worn down the factual foundations of democracy and led to an upsurge in conspiracy theories on everything from the 2020 election results to how COVID-19 spreads.
Facebook and other big online platforms insist they're removing more and more misinformation. But they can't say whether they're actually stemming the tide of lies, and neither can we, because the deluge turns out to be impossible to define or measure.
Why it matters: The tech companies mostly won't share data that would let researchers better track the scale, spread and impact of misinformation. So the riddle remains unsolved, and the platforms can't be held accountable.