
The Russian social media disease spread beyond Facebook and Google

Illustration: Rebecca Zisser/Axios

The Russian propaganda campaign to disrupt our elections and divide Americans went far beyond Google and Facebook, infiltrating and infecting everything from Pinterest to PayPal. 

Driving the news: New reports out Monday about Russia’s online disinformation efforts suggest that all of the major social media platforms, ranging from Facebook and Google's empires to Reddit and Tumblr, were weaponized over the past two years. Facebook-owned Instagram was particularly "underestimated."

Why it matters: Policymakers have been public about efforts to understand how big platforms can be weaponized, but they've failed to address how widespread these campaigns are on other platforms, as well as the systemic problems fueling them.

Details:

  • The rise of Instagram: A report from the nonprofit think tank New Knowledge found that in 2017, the Russian troll farm known as the Internet Research Agency (IRA) shifted the bulk of its misinformation efforts to less-policed platforms, primarily Instagram, after Facebook and Twitter stepped up monitoring and enforcement.
  • Offline manipulation: The report also suggests that the Russians used other channels, like fake websites and PayPal accounts, to push users into hyper-political behavior offline, such as attending protests or marches.
  • Luring "assets": The report details how the IRA tried to lure people into performing tasks for it, like recording videos or filing legal requests, by leveraging sensitive personal information, such as struggles around their sexuality.
  • Selling merchandise: They also set up accounts to promote socially divisive merchandise, like "LGBT-positive sex toys" on Instagram and Facebook.

Be smart: These findings demonstrate how quickly and easily bad actors are able to move manipulation efforts to new platforms when one site clamps down on malicious behavior. They also suggest how severely some features on these platforms, like messaging, can be abused by actors with bad intentions.

Between the lines: Some of the most eye-opening findings from the new reports are the ones that show how Russians exploited existing divisions around key moments or movements in the U.S. without being fully noticed at the time.

  • Political events: A study from Oxford University’s Computational Propaganda Project and network analysis firm Graphika shows that Russians exploited existing divisions between Americans by targeting them at key political moments online, like during the 2016 party conventions.
  • Disenfranchised voters: The Oxford study also points out that the IRA campaigned for black voters to boycott elections or follow incorrect voting procedures in 2016. In a statement, the Congressional Black Caucus said this is particularly concerning because "black voter turnout declined in 2016 — for the first time in 20 years."
  • Racial tensions: The New Knowledge report shows that the IRA focused much of its attention on sowing discord among black audiences, particularly around the height of the Black Lives Matter movement in 2016 and the NFL national anthem controversy in 2017.

The big picture: The studies commissioned by the Senate come on the heels of other reports about ways other repressive regimes, in places like Iran and Myanmar, also use social media to exploit existing divisions within vulnerable populations.

  • Political referendums, in particular, tend to be a hot target. Reports over the past year also suggest that Russian actors sought to rile up citizens around referendums in places like Spain, Britain and Macedonia.

The bottom line: One reason social media platforms continue to be exploited is that the opacity of the algorithms they use allows many of these fictitious posts and misleading campaigns to go viral unchecked.

  • There have been calls for greater transparency into social platforms' algorithms after reports found that on platforms like YouTube, viewers who watch one sensationalist video are prompted to watch more, furthering radicalization.
  • But tech giants worry that exposing such information would make their platforms more open to abuse. Instead, they have been urging law enforcement officials to work more closely with them to identify bad actors' campaigns before those campaigns blow up on their platforms.
