Updated Aug 12, 2018 - Technology

Confronting demons of the computer age

Illustration: Sarah Grillo/Axios

In a shift that is roiling typically cocooned computer scientists, some researchers — uneasy in part about the role of technology in the 2016 election — are urging colleagues to assess and mitigate the societal impact of their peer-reviewed work before it's published.

The big picture: The push — meant to shake computer scientists out of their labs and into the public sphere — comes as academics and scientists are suffering the same loss of popular faith as other major institutions.

What's going on: By temperament, computer science researchers prefer to leave it to philosophers and policymakers to interpret the societal repercussions of their work. At most, their papers cite a few potential positive impacts of the findings and avoid mentioning the negatives altogether.

But researchers now need to engage more actively to renew public faith, says Jack Clark, strategy and communications director at OpenAI. "We need to regain that trust by showing we're conscious of the impact of what we do," he tells Axios.

  • The highest-profile push has come from a group of scientists in the Association for Computing Machinery, a major professional organization that publishes dozens of academic journals. The group has proposed that peer reviewers assess whether papers adequately consider their negative implications.
  • The proposal has provoked a lengthy, combative discussion thread on Hacker News.
  • Wrote one commenter: "Engineers are not philosophers and should not be placed in this role. We do not have the tools to do it."

Researchers who oppose greater oversight say it's not possible to predict whether their work will be repurposed for ill, or to prevent it from being misused.

  • "Bad guys always can do bad things by using new technologies," said Zhedong Zheng, a PhD candidate at the University of Technology Sydney. "We cannot control them."
  • Technology that's benign on its face can be adapted for surveillance, propaganda, misinformation, cyberattacks, or any number of other uses.
  • Zheng's own research focuses on re-identification, a surveillance technology that follows people from one camera feed to another. Among its ramifications: it could allow a paranoid government to create a digital panopticon, or help a mother find a lost child at an amusement park.

But the proposal's 12 co-authors say the need to try harder is palpable. There is "a serious and embarrassing intellectual lapse" between researchers' positive outlook on their own work and reality, they wrote.

"Probably like those in the oil and tobacco industries, we can tell our kids and grandkids that throughout our careers, we were just doing our jobs. … Or we can tell our kids and grandkids that ours was the generation in computing to take responsibility for the effects of our innovations."
— Brent Hecht, lead author of the proposal, in a July speech

Why now? Several researchers said the tipping point was a combination of concern over advances in artificial intelligence and the 2016 election.

  • Part of the growing concern is that half-baked or biased AI algorithms could harm society.
  • And computer scientists watching tech take center stage — via email servers and leaks, Twitter and Facebook — underwent a "sea change" regarding their own ethical responsibility, according to Hecht, a professor at Northwestern.

Clark says the reluctance to engage with the ethical repercussions of research is an "abdicating of responsibility that is frankly shocking." And Joseph Redmon, a PhD candidate at the University of Washington, called it a sign of moral laziness.

  • In a recent paper describing updates to a computer-vision algorithm, Redmon wrote, "As researchers, we have a responsibility to at least consider the harm our work might be doing and think of ways to mitigate it. We owe the world that much."
  • Redmon said he decided to include the note after a member of the military approached him at a conference to express excitement about using the algorithm to analyze drone footage.

Go deeper: Computer science's public safety question.