The Facebook logos. Photo: Chesnot/Getty Images
Facebook will begin informing people who have engaged with coronavirus misinformation on its main Facebook app, the company announced Thursday. It will guide those people to resources from the World Health Organization.
Why it matters: The tech giant typically doesn't tell users when they've engaged with debunked content, aside from informing readers about Russian disinformation.
Details: Facebook will use News Feed to notify users who have liked, commented on, or reacted to a coronavirus post on Facebook that has been debunked through its fact-checking process.
- The company is still testing different variations of what the notifications will look like, according to a spokesperson.
Our thought bubble: Facebook and other platforms have become a lot more aggressive in their misinformation policies during the coronavirus pandemic.
- It's hard to imagine these policies going away after the pandemic.
By the numbers: In March, the tech giant says it displayed warnings on about 40 million Facebook posts, based on 4,000 fact checks from its fact-checking partners.
- When users encounter misinformation, Facebook has been directing them to resources with accurate information about the virus.
- To date, Facebook says it's directed over 2 billion people to resources from the WHO and other health authorities through its COVID-19 Information Center and pop-ups on Facebook and Instagram.
- Facebook says that more than 350 million people globally have clicked through those pop-ups to learn more.
Between the lines: In the past, Facebook has struggled to figure out the best way to flag misinformation without incentivizing people to click further into it.
- In 2017 it said it would no longer use "Disputed Flags" — red flags next to fake news articles — to identify fake news for users because it caused more people to click on the debunked posts.
- But now Facebook says the warning labels seem to be working: according to the company, only 5% of people who were exposed to those labels went on to view the original content.