Facebook announced Tuesday that it is changing its algorithm to weed out misleading claims about the causes of and cures for health conditions. The announcement followed a Wall Street Journal report detailing examples of posts promoting spammy or misleading health care cures.
Why it matters: Experts say online misinformation on Facebook and other platforms creates real-world health problems. Most recently, platforms like Facebook have been blamed for harboring anti-vaccination content, which many argue helped fuel an outbreak of measles cases in the U.S.
Driving the news: Facebook said last month it made two updates to the way it ranks content in its News Feed to reduce posts with exaggerated or sensational health claims and posts attempting to sell products or services based on health-related claims.
- It now considers whether a health care post exaggerates or misleads, such as a sensational claim about a miracle cure, and down-ranks it accordingly.
- It also now considers whether a post promotes a product or service based on a health-related claim, such as a medication marketed as helping someone lose weight.
Between the lines: YouTube has also been taking some action against bad health care content by cutting off advertising for bogus cancer-treatment channels, according to the Wall Street Journal.
- Facebook and YouTube told the Journal that they're taking specific action against cancer-related content after the paper presented them with examples of promotional and spammy posts on their platforms promising miracle cures or bogus health services.
Be smart: Many of these scams and hoaxes, like clickbait, are designed to drive clicks that lead to ad farms, or to trick people into buying products or services that regulators have not approved for consumption.
The big picture: Facebook has mostly figured out how to weed out scam posts that have been uploaded by bots, but it's had a much harder time filtering out content uploaded by humans that doesn't explicitly violate its rules.
- Facebook is now drawing clearer boundaries around what is and isn't acceptable on its platform so it can tailor its filtering algorithms to those standards.
Go deeper: Anti-vaccination content haunts Big Tech