
Illustration: Eniola Odetunde/Axios

Silicon Valley's platforms are relieved to see Election Day slip into the past and believe they did a much better job than in 2016 at deflecting foreign meddling and disinformation, even as critics continue to point out new failures and President Trump's refusal to concede lays new challenges in their path.

Driving the news: With online polarization deepening after a close election, the CEOs of Facebook and Twitter will face hostile Senate questioning Tuesday from both sides of the aisle.

The big picture: Tech companies took unprecedented steps to curb the spread of election misinformation and are breathing a sigh of relief that they averted their biggest nightmare — a repeat of 2016's foreign-disinformation debacle.

But a different test came in the days after the polls closed, as the companies faced constant judgment calls over posts from political leaders, most prominently Trump himself, claiming victory without basis and charging fraud without evidence.

Before and after Election Day, the platforms found themselves reacting, adapting and sometimes improvising new rules on the fly, despite their lengthy preparations.

Facebook
  • Shortly after Election Day, while votes were still being tallied, Facebook announced new steps, including the temporary demotion of posts containing election-related misinformation and limits on the distribution of some election-related live streams.
  • The company faced criticism for being slow to respond to groups and events proliferating on Facebook that suggested that the election was being "stolen" by Democrats through vote-by-mail fraud.
  • Along with Google, Facebook confirmed last week that it would extend its political advertising ban to cover a longer post-election period during which one side was challenging the results. That means these bans will remain in place as millions of dollars start flowing into two Georgia Senate runoffs in early January that will determine which party controls the Senate.
Twitter
  • Twitter, which was the first online platform to ban political ads, was widely seen as taking the toughest measures against misinformation.
  • The company said in an election post-mortem post that approximately 74% of the people who viewed problematic tweets saw them after it had applied a label or warning message.
  • It said it estimated a 29% decrease in shares via "quote tweets" of these labeled tweets, due in part to a warning prompt it placed on them.
YouTube
  • Google-owned YouTube allows "discussion of election results and the process of counting votes." Under that policy, the online video giant left up videos from news channels like OANN that have pushed false stories on election fraud and postal ballots.
  • YouTube says its panels linking to Google’s election results, featured on the YouTube homepage, on videos and in searches, were shown "billions of times."
  • "The most popular videos about the election are from authoritative news organizations," said YouTube spokesperson Ivy Choi.
  • According to Choi, when users search for election-related content, 88% of the videos shown in their top-10 results in the U.S. come from "authoritative sources."
Snapchat

Unlike other platforms, Snapchat did not have to roll out a raft of new policies in the weeks before the election, because its existing rules already covered most of the cases its rivals scrambled to handle.

  • It also didn't need to make any adjustments to its political ad policies, because the company fact-checks all of its ads.
  • Over 30 million Snapchat users used its voter engagement programs, which helped users register to vote or make a plan to vote, with over 1.2 million users registering to vote via these programs.
  • Snapchat, by design, doesn't have the same kind of public-facing viral news environment that Facebook and Twitter have, shielding it from some of its peers' woes.
TikTok

The short video-sharing platform has tried to remain apolitical, but it ended up struggling to contain election-related misinformation anyway.

  • Shortly after Election Day, TikTok took action against several election-related hashtags, like #RiggedElection, but left others up, TechCrunch reported. The platform also took down a number of popular videos touting unsupported election-fraud allegations after a report by Media Matters.
  • TikTok didn't have election-related statistics to share, but spokesperson Jamie Favazza said TikTok would provide a recap of its election integrity efforts "in the future" and is still removing misleading information, videos and violence-inciting accounts.
Labeling

The platforms' chief remedy for unsupported claims of election fraud was to append posts with labels pointing to authoritative information.

But, but, but: Misinformation experts say that labeling policies did not deter people from sharing knowingly false information.

  • Platforms applied labels to false information, including from the president, but in many cases did not apply more "friction" to prevent sharing, meaning labeled content still spread wildly, said Alex Stamos, former Facebook CISO and head of the Stanford Internet Observatory.
  • "This should not be happening," Stamos said. "It's the weakness in the protections the companies have in place. The label isn't doing anything."
  • A lack of aggressive down-ranking and limitations on re-shares will continue to be a significant issue, he said.

The bottom line: Under normal circumstances, election-related debate is an asset of democracy, and the last kind of content that a tech platform would block. The combination of a deeply polarized nation and a chief executive prone to tweeting falsehoods pushed these companies' systems beyond anything they were built for.

  • Ten days after the election, misleading or deceptive new narratives continue to bubble up online, according to experts from the Election Integrity Project: misleading charts and statistics about voting, anecdotes about dead voters, false storylines about voting machine "glitches" affecting election outcomes.

What's next: Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg will testify before the Senate Judiciary Committee on Tuesday.
