
Facebook's next fake news headache: messaging

While much of the fake news conversation in the U.S. has centered on Facebook's News Feed, reports of misinformation spreading globally on Facebook's messaging properties, WhatsApp and Messenger, point to another major problem for the global tech company.

Why it matters: Facebook's messaging footprint is larger than its main app globally, and a growing number of people in developing countries rely on its messaging platforms for news and information. If Facebook is struggling to contain fake news on its algorithmically curated News Feed, clamping down on fake news is far harder on intentionally opaque, encrypted platforms that are designed to protect individual conversations.

Data: eMarketer, Line, Tencent, pymnts.com; Chart: Lazaro Gamio/Axios
  • In India, the Minister for Electronics and Information Technology said the country is "helpless" to stop the fake news epidemic on WhatsApp because it can't access content through WhatsApp's encryption, per CNET.
  • In Kenya, government officials accused the managers of 21 WhatsApp groups of spreading hate, per Quartz.
  • In Catalonia, a journalist told The Washington Post that they are aware of fake news being spread on WhatsApp, "but we've only been able to debunk those pieces that were sent to us by our users since Whatsapp is such a private system."
  • In the U.S., Facebook admitted earlier this month that some of the 470 accounts it identified at the time as Russian-backed used its Messenger product as part of efforts to meddle in the 2016 election, per Recode.

Why messaging is so hard to monitor: WhatsApp and Messenger both allow users to use end-to-end encryption on mobile, meaning stories, posts, pictures and videos cannot be viewed by anyone outside of an individual or group conversation — making fake news harder to track. As WhatsApp notes in its post about end-to-end encryption, not even WhatsApp can read what is shared between users on its platform.

  • The privacy problem: "The ability of WhatsApp to police fake news is difficult because the end-to-end encryption only allows the sender and receiver to see the message," says Matthew Heiman, chairman of the Federalist Society Regulatory Transparency Project on Cybersecurity and a former Department of Justice lawyer. "This means that WhatsApp is not currently able to see the content."
  • The catch: "For some customers, that's a key feature of the service," said Heiman. "If WhatsApp were to alter the end-to-end encryption, privacy-minded customers would likely seek other platforms."
  • The chat room problem: "With WhatsApp chat rooms being invitation only and the publishers of the news remaining unknown, it makes the distribution of news that much more of an issue to trust and maintain," said Stefanie Sena, associate director of digital account management with MNI Targeted Media Inc.

Reach: To date, messaging apps have more users globally than social media apps, and Facebook owns an overwhelming majority of that traffic. While Facebook has more than 2 billion monthly active users, Messenger and WhatsApp have a combined reach of roughly 2.5 billion monthly active users. Facebook-owned WhatsApp is the number one messaging app in 107 countries around the world, and Facebook's Messenger is number one in 58 countries, according to a SimilarWeb study.

Facebook's approach: WhatsApp is working to educate people on how to stay safe on its platform, including tips on how to report bad content to WhatsApp and block unknown users. But so far, the tech giant hasn't publicly revealed many details about broader efforts to fight fake news on messaging platforms. A WhatsApp spokesperson told Axios in response to fake news spreading in India: "We recognize that this is a challenge and we're thinking through ways we can continue to keep WhatsApp safe."

  • In October, Facebook's vice president of messaging products, David Marcus, said the company needs to do better at weeding out malicious use. "In the future we need to increase our level of scrutiny and challenge ourselves to understand the ways people might use a platform in the ways it wasn't designed for," Marcus said at The Wall Street Journal's WSJD.Live technology conference.
  • This isn't just a Facebook messaging problem: Columbia Journalism Review wrote a piece Tuesday about how the Chinese social messaging behemoth WeChat, owned by Tencent, was used to spread misinformation during and after the 2016 election season.