A phone with a lock symbol is seen with a laptop computer with a Facebook login page. Photo: Jaap Arriens/NurPhoto via Getty Images

We know that Facebook, YouTube, Twitter, Instagram, Snapchat, and all the other social media platforms "moderate" the content users post, typically aiming to remove material that violates either a host country's law or the platform's own standards.

The big picture: Moderation is usually understood to be the onerous and thankless cleanup task that these social media giants have had to shoulder as they scaled up to global ubiquity. But the choices companies make about what to delete and who to boot are actually central to their identities, argues scholar Tarleton Gillespie in a new book on the subject.

Gillespie's title, "Custodians of the Internet," points to the ambivalence about the labor of content moderation that's shared by platforms and users. "Custodians" are workers responsible for maintenance and cleanup; they're also keepers of a trust, protectors of a person, place, or institution.

Somebody's got to do it: Either way, it's a tough job — and one that the platform companies prefer to keep mostly out of view. That serves their desire to be seen as neutral arbiters of what's fit to post.

  • But it's increasingly untenable in a world where the platforms have become essential public forums, and where their choices affect livelihoods, elections, and even life or death.

Edge cases: Gillespie assembles a rogues' gallery of the toughest challenges in the annals of moderation:

  • The celebrated Vietnam War news photo of a crying, naked girl running down a road after a napalm attack, which Facebook moderators repeatedly removed for violating rules against underage nudity.
  • The long-running fight between Facebook and breastfeeding mothers who wished to post photos but ran afoul of a "no nipples" rule.
  • Violent images used in terrorist recruiting pitches, which are taken down — only for similar images in news coverage or scholarship about terrorism to face the same censorship.

Both Twitter and Facebook face a persistent dilemma: they are not strict enough to satisfy users who have been harassed, yet they still trigger outrage from other users who feel they've been censored.

To sort out such dilemmas, social networks typically employ several tiers of labor:

  • a small cadre of company employees who set policy and deal with the toughest calls;
  • larger pools of contractors, overseas workers, and gig-economy laborers who process images and posts at punishing speed;
  • and the entire base of users, who end up on volunteer community patrol as they flag objectionable content with a click.

All this work is an afterthought to the social networks, which devote their resources to building products and selling ads. But marginalizing moderation has only helped mire Facebook, Twitter, and the rest of the social-network business in a swamp of controversy and complaint.

Why it matters: Moderation, the promise that an online space will be managed and made to conform to some set of rules narrower than those that prevail on the open web, isn't a sideshow at all. It's what makes platforms unique, Gillespie argues — different both from publishers who create content and from common-carrier internet service providers who simply transmit it.

The boundaries platforms set on expression are how they distinguish themselves from one another, creating different kinds of spaces with different formal rules and social practices. "Custodians of the Internet" makes a strong case that Facebook and its competitors should start to treat moderation as a defining service rather than a necessary evil.
