Talk of artificial intelligence turns almost inexorably to the very large. Companies, it is said, must embrace big data, along with future human-like AI, or be lost to history. But some tech leaders are going the other way, urging businesses to start thinking small.
What's happening: This growing mantra speaks of the upside of narrow AI ambitions and small data. But by small-bore, they don't mean small results. AI and data minimalism, they say, could be what revolutionizes business, industry, war and more, writes Axios' Kaveh Waddell.
- The dream of most AI researchers is "general intelligence" — the broad human capacity to cook, do a crossword, work at a computer and carry on conversations with friends, all seamlessly, one after the other.
- But that's a distant dream — researchers are at least decades away from general intelligence.
- And that's not even contemplating super-human intelligence, the holy grail.
Instead, today’s best AI algorithms are one-trick ponies that have each been taught an extremely useful, single trick, Andrew Ng, an AI pioneer at Stanford University, tells Axios.
- Examples are algorithms that drive cars, read chest X-rays and translate languages.
- None can do anything else; each performs only its single task.
But that's not something to sneer at: These ponies have already created more than $1 trillion of business value, according to Gartner — with many trillions more up for grabs for those who can figure out how to apply the young technology.
- We’ve written of some of the problems that confound the best AI systems, but in many areas, AI can outperform humans.
- "Almost anything you can do with less than a second of mental thought, we can probably now automate" with AI, says Ng, who co-founded Google Brain and Baidu's AI program.
The big picture: Driving the optimism is deep learning, the technique that has in recent years been the jet fuel accelerating AI applications, and it shows few signs of slowing.
- "We're in year two or year three of a good, 40-year run," says Frank Chen, a partner at Andreessen Horowitz, a prominent Silicon Valley VC firm.
- Even if, by some dictatorial decree, all deep learning research stopped today, developers would be writing new software using today’s technology for decades, says Chen. "We have a long way to go just harnessing the existing techniques."
Deep learning has its own drawbacks. Among the most constraining is its never-ending hunger for more and more data.
But some problems don’t come with a lot of data. Ng offers some examples:
- A factory that wants to nab defective parts before they roll off the manufacturing line may have just a handful of examples with which to train an algorithm.
- A hospital that wants to detect diseases from medical imaging may only have a small number of scans on hand for a particularly rare condition.
The bottom line: “I think big data is overhyped,” says Ng.
- A lot of problems will be solved using small data. The only problem? It takes "a much more skilled team to do things with small data," says Ng.
- One way to overcome the challenge is to pair deep learning, which uses correlations to make predictions, with explicit rules that can clear up ambiguity, writes Virginia Dignum, a computer science professor at Delft University of Technology.
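The hybrid idea Dignum describes can be sketched in a few lines. This is a toy illustration, not her actual method: the "model" below is a stand-in function returning a made-up confidence score, and the feature names (`surface_irregularity`, `visible_crack`) and thresholds are hypothetical. The point is the shape of the logic: hard rules override the statistical prediction, and ambiguous scores get deferred rather than guessed at.

```python
# Toy sketch of pairing a learned model's probabilistic output with
# explicit rules. All names and thresholds here are illustrative.

def model_confidence(scan_features):
    """Stand-in for a trained classifier's P(defect) score."""
    # Pretend the model keys off a single noisy feature.
    return min(1.0, max(0.0, scan_features.get("surface_irregularity", 0.0)))

def classify_part(scan_features, low=0.3, high=0.7):
    """Combine the statistical prediction with explicit rules."""
    # Explicit rule: a visibly cracked part is always defective,
    # regardless of what the model says.
    if scan_features.get("visible_crack"):
        return "defect"
    p = model_confidence(scan_features)
    # Explicit rule: only trust the model when it is confident;
    # ambiguous scores are escalated instead of decided.
    if p >= high:
        return "defect"
    if p <= low:
        return "ok"
    return "needs human review"

print(classify_part({"surface_irregularity": 0.9}))  # confident model call
print(classify_part({"surface_irregularity": 0.5}))  # ambiguity -> escalate
print(classify_part({"surface_irregularity": 0.1,
                     "visible_crack": True}))        # rule overrides model
```

The appeal for small-data settings is that the rules encode domain knowledge the model never had enough examples to learn, while the thresholds make the model's uncertainty explicit instead of hiding it behind a single yes/no answer.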