Updated Aug 12, 2018

Confronting demons of the computer age

Illustration: Sarah Grillo/Axios

In a shift that is roiling typically cocooned computer scientists, some researchers — uneasy in part about the role of technology in the 2016 election — are urging colleagues to determine and mitigate the societal impact of their peer-reviewed work before it's published.

The big picture: The push — meant to shake computer scientists out of their labs and into the public sphere — comes as academics and scientists are suffering the same loss of popular faith as other major institutions.

What's going on: By temperament, computer science researchers prefer to leave it to philosophers and policymakers to interpret the societal repercussions of their work. At most, their papers cite a few potential positive impacts of the findings — and avoid mentioning the negatives altogether.

But now they should take a more active role in order to renew public faith, said Jack Clark, strategy and communications director at OpenAI. "We need to regain that trust by showing we're conscious of the impact of what we do," he tells Axios.

  • The highest-profile push has come from a group of scientists in the Association for Computing Machinery, a major professional organization that publishes dozens of academic journals. The group has proposed that peer reviewers assess whether papers adequately consider their negative implications.
  • The proposal has provoked a lengthy, combative discussion thread on Hacker News.
  • Wrote one commenter: "Engineers are not philosophers and should not be placed in this role. We do not have the tools to do it."

Researchers who oppose greater oversight say it's not possible to guess whether their work will be repurposed for ill, or to prevent it from being misused.

  • "Bad guys always can do bad things by using new technologies," said Zhedong Zheng, a PhD candidate at the University of Technology Sydney. "We cannot control them."
  • Technology that's benign on its face can be adapted for surveillance, propaganda, misinformation, cyberattacks, or any number of other uses.
  • Zheng's own research focuses on re-identification, a surveillance technology that follows people from one camera feed to another. Among its ramifications, it could allow a paranoid government to create a digital panopticon — or help a mother find a lost child at an amusement park.

But the proposal's 12 co-authors say the need to try harder is palpable. There is "a serious and embarrassing intellectual lapse" between researchers' positive outlook on their own work and reality, they wrote.

"Probably like those in the oil and tobacco industries, we can tell our kids and grandkids that throughout our careers, we were just doing our jobs. … Or we can tell our kids and grandkids that ours was the generation in computing to take responsibility for the effects of our innovations."
— Brent Hecht, lead author of the proposal, in a July speech

Why now? Several researchers said the tipping point was a combination of concern over advances in artificial intelligence and the 2016 election.

  • Part of the growing concern is that half-baked or biased AI algorithms could harm society.
  • And computer scientists watching tech take center stage — via email servers and leaks, Twitter and Facebook — underwent a "sea change" regarding their own ethical responsibility, according to Hecht, a professor at Northwestern.

Clark says the reluctance to engage with the ethical repercussions of research is an "abdicating of responsibility that is frankly shocking." And Joseph Redmon, a PhD candidate at the University of Washington, called it a sign of moral laziness.

  • In a recent paper describing updates to a computer-vision algorithm, Redmon wrote, "As researchers, we have a responsibility to at least consider the harm our work might be doing and think of ways to mitigate it. We owe the world that much."
  • Redmon said he decided to include the note after a member of the military approached him at a conference to express excitement about using the algorithm to analyze drone footage.

Go deeper: Computer science's public safety question.
