For the better part of a decade, artificial intelligence has been propelled by a rocket fuel in seemingly endless supply. Deep learning, a method that allows machines to identify hidden patterns in data, has powered commercial applications like autonomous vehicles and voice assistants, and it's potentially worth trillions of dollars a year.
The other side: The rosy portrait of unstoppable progress belies a fear among some AI luminaries that things are not on the right path. In a new sort of resource curse, they say that deep learning has sucked energy away from other strains of inquiry without which AI may never approach even a child's intellectual capabilities.
The big picture: For the past five years, Elon Musk and others have warned of a future disaster resulting from unchecked superintelligent AI. But today, much of the field is caught in a rather more elementary tug-of-war over which avenue will imbue AI with even the capacity for basic understanding.
- "This is a struggle that goes back decades and is at the heart of the field," says Pedro Domingos, a University of Washington professor and the head of machine learning at D.E. Shaw, an investment firm. "People on either side of it think the others are crazy."
- "The field has gotten ahead of its skis," argues NYU's Gary Marcus, a dogged critic of the field's obsession with deep learning. "We think that if people don't change course, it's going to be destructive."
Since the field took shape in the 1950s, artificial intelligence has advanced in fits and starts, with various tribes claiming the vanguard at different points. The current period began in the early 2010s, when a trio of researchers in Canada brought AI out of a decadeslong funk by reviving deep learning, aided by new and powerful hardware.
Now, some of those same pioneers are warning against leaning too heavily on their contributions, and researchers with one foot in adjacent fields are sounding an increasingly insistent alarm about AI’s trajectory.
- "We are telling the young generation to not take our words for the gospel — because some do, unfortunately — and explore new things, because this is how science progresses," says the University of Montreal's Yoshua Bengio, who this year shared the Turing Award, the highest accolade in computing, with Yann LeCun and Geoffrey Hinton for reinventing deep learning.
- "The field has become a little too focused on deep learning, which means a lot of other things that should be explored aren't being explored," Domingos says. New students are "very seduced by deep learning," he says. "Everybody wants to do it."
- Next month, Marcus will publish a book with his NYU colleague Ernest Davis called "Rebooting AI," laying out in excruciating detail where deep learning falls short.
What's missing is common sense. Without it, argues Marcus, machines will never be able to actually comprehend a passage of fiction or navigate a cluttered home to tidy up before guests arrive.
- Right now, computers can answer basic questions about a text, but usually stumble when asked about something that isn't explicitly spelled out in the excerpt.
- That's because deep learning systems are built on statistics and patterns, but don't have background knowledge outside the data they've been fed. By contrast, humans bring decades of understanding about the world into every single interaction.
- Marcus and others want to build these crucial foundations, or “priors,” into AI systems. That will likely require borrowing from other approaches to AI — especially symbolic reasoning, now sometimes called “good old-fashioned AI,” which explicitly spells out relationships between things.
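To make the contrast concrete, here is a minimal sketch of what symbolic reasoning looks like in practice. Everything in it — the facts, the rule, the tiny forward-chaining engine — is invented for illustration, not drawn from any particular system. Unlike a statistical model, it derives new conclusions only from relationships that have been explicitly spelled out:

```python
def match(pattern, fact, bindings):
    """Try to unify one pattern with one fact; return extended bindings or None.
    Variables are strings starting with "?"."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if p in b and b[p] != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def satisfy(premises, facts, bindings):
    """Yield every set of bindings that satisfies all premises against the facts."""
    if not premises:
        yield bindings
        return
    first, rest = premises[0], premises[1:]
    for fact in facts:
        b = match(first, fact, bindings)
        if b is not None:
            yield from satisfy(rest, facts, b)

def forward_chain(facts, rules):
    """Apply rules until no new facts appear (a fixed point)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            for b in list(satisfy(premises, facts, {})):
                new = tuple(b.get(term, term) for term in conclusion)
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

# Toy knowledge base: two hand-written facts and one transitivity rule.
facts = {("is_a", "rex", "dog"), ("is_a", "dog", "animal")}
rules = [
    # if ?x is_a ?y and ?y is_a ?z, then ?x is_a ?z
    ([("is_a", "?x", "?y"), ("is_a", "?y", "?z")], ("is_a", "?x", "?z")),
]
closed = forward_chain(facts, rules)
print(("is_a", "rex", "animal") in closed)  # True
```

The system "knows" that Rex is an animal even though no data point ever said so — the background knowledge lives in the rule, not in statistics over examples. That explicitness is exactly what proponents of hybrid approaches want to combine with deep learning's pattern recognition.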
But, but, but: If indeed the field is headed for a dead end, the impasse certainly isn't around the next bend. "The data at this point supports continuing success," says Oren Etzioni, CEO of the Allen Institute for AI. "But it's impossible to tell how far that goes."
- Domingos predicts deep learning alone could drive another 20 years of advances; Frank Chen, a partner at the VC firm Andreessen Horowitz, guessed 40 years in an interview with Axios last year.
- If the progress does peter out, though, it could mean that years of single-minded effort will have crowded out the next crucial breakthrough.