Apr 3, 2019

YouTube let extremist content flourish despite warnings

Photo: Carsten Rehder/picture alliance via Getty Images

A damning Bloomberg report published Tuesday revealed that top YouTube executives spent years debating whether viral extremist videos on the platform were really a problem, and often rejected proposed fixes in an effort to maximize growth and profits.

Why it matters: Tech companies have long been criticized for harboring hate, but as the consequences of their inaction unfold more visibly in the real world, companies like YouTube face mounting pressure to answer whether that ignorance was actually malpractice.

Driving the news: The most striking aspect of the Bloomberg report is a narrative similar to what has been reported about Facebook's handling of Russian misinformation: top executives were repeatedly briefed on the problem and chose to downplay it in order to stay focused on business outcomes.

  • The Bloomberg story alleges that the company prioritized platform "engagement" above all other goals, which deterred corporate leadership from acting on internal alarms about the ways hateful content was flourishing on the platform.
  • It details how YouTube's "neural network" AI system acted like an "addiction engine," pushing users to consume ever more videos, no matter how fringe their content.
  • The report says that after the 2016 election, YouTube, under CEO Susan Wojcicki, tried to mitigate the problem by adding a little-known measure of "social responsibility" to its recommendation algorithm.
  • It explains that YouTube discouraged employees from proactively searching for problematic videos, because knowing about them could expose the company to greater legal liability.

The big picture: The report comes as Facebook scrambles to manage hateful content and misinformation on its platforms ahead of elections in India and the next round of U.S. presidential primaries.

  • The tech giant extended its ban on hate speech to cover content that promotes or supports white nationalism and white separatism.
  • It announced Tuesday that it had added a tip line for misinformation on its popular messaging app WhatsApp, where end-to-end encryption makes false content nearly impossible to track.

Be smart: Calls for change have picked up in the wake of real-world harm done by people radicalized by hateful or conspiracy-minded content. As Axios has previously noted:

  • Anti-vaccination content that has long appeared in search results and on social media is now being restricted by the platforms themselves, after the U.S. government attributed recent measles outbreaks in part to reduced vaccination rates in some areas.
  • Terrorist attacks and mass shootings, like the recent New Zealand mosque attack, highlight ways that extremists are using social media channels to inspire hate and spread horrifying footage of mass killings.

Bottom line: More than two years after the 2016 election, it has become increasingly apparent that Google and Facebook, despite warnings that their platforms' algorithms allowed bad content to flourish, shied away from doing much about it for business reasons. Now, facing elections and misinformation crises around the world, they are being forced to reckon with those decisions.

