
Photo illustration: Manish Rajput/SOPA Images/LightRocket via Getty Images

Facebook will enlist academics to study whether and how its platforms end up influencing the 2020 U.S. presidential election, the company announced Monday.

Between the lines: Facebook is trying to show it's being mindful of its potential to amplify election-related misinformation. In 2016, CEO Mark Zuckerberg famously called it a "pretty crazy idea" that Facebook had any influence over that election, a claim that was quickly proven wrong.

Details: A group of 17 outside academics will work with Facebook on experiments in which users will see tweaked News Feeds and ad experiences.

  • Those participants will then be surveyed on their experiences and asked about their viewpoints. The idea will be to assess whether exposure to different experiences on Facebook and Instagram sways people's views.
  • Facebook plans to publish the results next year and will not have veto power over them.

Context: Foreign-fueled misinformation and increasingly volatile political discourse spread on Facebook ahead of the 2016 presidential election, and the company has had to reckon with that ever since, facing congressional hearings and other scrutiny.

What they're saying: "We need to better understand whether social media... largely reflects the divisions that already exist; if it helps people to become better informed about politics, or less; or if it affects people's attitudes towards government and democracy, including whether and how they vote," wrote Nick Clegg, Facebook's VP of global affairs and communications, and Chaya Nayak, head of Facebook's open research and transparency team.

Go deeper

Nov 19, 2020 - Technology

Facebook says very few people actually see hate speech on its platform

Photo: Jaap Arriens/NurPhoto via Getty Images

Facebook said it took action on 22.1 million pieces of hate speech content on its platform globally last quarter, and on about 6.5 million pieces of hate speech content on Instagram. On both platforms, it says about 95% of that hate speech was proactively identified and removed by artificial intelligence.

Details: In total, the company says there are 10–11 views of hate speech for every 10,000 views of content on the site globally, or about 0.1%. It calls this metric, which estimates how often users actually see violating content, "prevalence."
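The arithmetic behind the prevalence figure is straightforward; a minimal sketch, using only the publicly reported numbers (10–11 violating views per 10,000 total views), might look like this:

```python
def prevalence(violating_views: int, total_views: int) -> float:
    """Share of all content views that were views of violating content."""
    return violating_views / total_views

# Facebook's reported range: 10 to 11 hate-speech views per 10,000 content views
low = prevalence(10, 10_000)
high = prevalence(11, 10_000)
print(f"prevalence: {low:.2%} to {high:.2%}")  # roughly 0.10% to 0.11%
```

Note that prevalence is measured against total views, not total pieces of content, so a single widely seen violating post weighs more than many obscure ones.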

Nov 20, 2020 - Technology

Biden's Day 1 challenges: Misinformation flood control

Illustration: Eniola Odetunde/Axios

President-elect Joe Biden will enter office with no fast fixes at hand to stem a tide of online misinformation that has shaped election-year politics and, unchecked, could undermine his presidency.

Where it stands: Election and coronavirus misinformation spreading widely on digital platforms has already done serious damage to the U.S., and it's bound to go into overdrive as the Biden administration starts enacting its agenda.

Wisconsin recount reaffirms Biden's victory in the state

Photo: Mark Makela/Getty Images

The two recounts in Wisconsin requested by the Trump campaign were completed Sunday and confirmed that President-elect Joe Biden won the state, the Washington Post reports.

Driving the news: Biden won Wisconsin by more than 20,000 votes. Recounts in the state's most populous and liberal areas — Dane and Milwaukee counties — netted him an additional 87 votes.