Aug 9, 2021 - Technology

Facebook's accountability bind

Illustration: Annelise Capossela/Axios

Facebook's leaders know they have to demonstrate accountability to the world, but they're determined to do so on their own terms and timetable.

Why it matters: Since the 2018 Cambridge Analytica affair, Facebook has moved to provide more transparency and oversight, but its programs are limited, selective and slow, leaving journalists and scholars as the de facto whistleblowers for problems on its platform.

Driving the news: Last week Facebook shut down the accounts of New York University researchers, saying their tools for studying political advertising on the social network violated its rules.

Details: NYU's Ad Observatory offers users a browser extension that supplies researchers with data on the circulation and targeting of political ads on Facebook — information that can be more timely and complete than what Facebook itself provides through its public library of political ads.

  • Facebook says the observatory's tools are "scraping" data, breaking its privacy rules and violating terms of the company's privacy settlement with the Federal Trade Commission. (The FTC has disputed that last point.)
  • Mozilla, the nonprofit behind the Firefox browser, released a statement saying that Facebook's objections to the Observatory's extension "simply do not hold water."

The big picture: Facebook has become a sort of global public square that's owned and operated by a private company whose decisions can shape political conflicts, cultural controversies and public-health outcomes.

  • Facebook itself remains the only entity with comprehensive, real-time insight into the flow of information and money on its platform.
  • It has tried to offer more transparency than most of its competitors through projects like its ad library.
  • Yet many of its efforts at accountability — from its reports on content takedowns to its creation of the independent Oversight Board to review content moderation decisions — are slow and retroactive in nature.

As a result, the typical pattern for airing and solving problems on Facebook today is:

  1. A journalist writes an article or a researcher publishes a report documenting something on Facebook that should not be happening.
  2. Facebook takes some action in response.
  3. Everyone wonders how much else is happening on Facebook that shouldn't be happening, since Facebook is so vastly larger than any arm of the press or the academy can possibly monitor.

Our thought bubble: Agents of accountability are scarce, so when Facebook locks one out, it leaves the impression that the company is more focused on limiting PR damage than on actually stopping misinformation and manipulation on its platform. It's another manifestation of "see no evil" as a corporate reflex.

Between the lines: Laura Edelson, one of the NYU researchers, told The Markup's Julia Angwin she was flummoxed that Facebook shut down the researchers' personal accounts.

  • "If Facebook honestly thought that our browser extension compromised user privacy in any way, they would have taken action to stop our browser extension. They would have sued us. They would have taken technological measures. They would have tried to get our extension kicked out of the Chrome and Firefox store," Edelson said.

What's next: Democratic senators Amy Klobuchar, Mark Warner and Chris Coons sent Facebook a letter Monday grilling the company on its decision to ban the Ad Observatory scholars.
