Apr 16, 2020 - Health

Facebook will notify users who engaged with coronavirus misinformation

The Facebook logos. Photo: Chesnot/Getty Images

Facebook will begin informing people who have engaged with coronavirus misinformation on its main Facebook app, the company announced Thursday. It will guide those people to resources from the World Health Organization.

Why it matters: The tech giant typically doesn't tell users when they've engaged with debunked content, aside from notifying readers about Russian disinformation.

Details: Facebook will use its News Feed to notify users who have liked, commented on or reacted to a coronavirus post on Facebook that has been debunked through its fact-checking process.

  • The company is still testing different variations of what the notifications will look like, according to a spokesperson.

Our thought bubble: Facebook and other platforms have become a lot more aggressive in their misinformation policies during the coronavirus pandemic.

  • It's hard to believe that after the pandemic, these policies will go away.

By the numbers: In March, the tech giant says it displayed warnings on about 40 million Facebook posts, based on 4,000 fact checks from its fact-checking partners.

  • When a user encounters misinformation, Facebook has been directing them to resources with accurate information about the virus.
  • To date, Facebook says it's directed over 2 billion people to resources from the WHO and other health authorities through its COVID-19 Information Center and pop-ups on Facebook and Instagram.
  • Facebook says that more than 350 million people globally have clicked through those pop-ups to learn more. 

Between the lines: In the past, Facebook has struggled to figure out the best way to flag misinformation without incentivizing people to click further into it.

  • In 2017 it said it would no longer use "Disputed Flags" (red flags placed next to fake news articles) to identify fake news for users, because the flags drove more people to click on the debunked posts.
  • But now Facebook says the warning labels seem to be working. According to the company, only 5% of people who were exposed to those labels went on to view the original content.
