Dec 21, 2019 - Technology

Decades-old analog ideas could buoy modern AI

Photo illustration of analog computers. Illustration: Aïda Amer/Axios. Photos: Authenticated News/Getty Staff, GraphicaArtis/Getty Contributor

Returning to a technology largely discarded since the 1960s, scientists are betting on analog computing to wean AI systems off the monstrous amounts of electricity they currently require.

Why it matters: AI is on track to use up a tenth of the world's electricity by 2025, by one estimate. Cutting back on this consumption has huge climate implications — plus it’s essential for mobile devices and autonomous cars to do complex calculations on the fly.

The background: Analog computing was dethroned by today's dominant digital machines in the 1960s. Since then, computing has been about "higher speed, higher precision, higher throughput," says IBM's Jeff Welser. That's where digital tech shines.

  • But as AI becomes omnipresent, some of those core requirements of computers are being reconsidered.
  • A realization is dawning in some corners of the tech world that "maybe we were too quick to dispense with analog 60 years ago," says Eli Yablonovitch, a professor at Berkeley.

What's happening: The neural networks that drive most AI systems rely on multiplying numbers really, really fast. They currently use the precision and power of digital computing for the job. But AI computations may not need to be so precise, a tolerance the rough sketch after this list illustrates.

  • "When you start getting pushed to the limits of what [digital computing] can offer, when you have a new class of problems, then it becomes interesting to revisit analog," says Shahin Farshchi, a computer scientist and VC at Lux Capital.
  • IBM, several startups, academic researchers and others are doing just that.
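
To make the precision point concrete, here is a minimal sketch, not any company's actual code: a toy neural-network-style multiply computed at full 64-bit precision, then again with the weights rounded to 8-bit integers. The layer size, random data, and rounding scheme are all assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    weights = rng.normal(size=(256, 784))   # hypothetical layer weights
    x = rng.normal(size=784)                # hypothetical input vector

    exact = weights @ x                     # full 64-bit digital result

    # Round each weight to one of 256 levels (8 bits), as a cheap chip might.
    scale = np.abs(weights).max() / 127.0
    w8 = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    approx = (w8.astype(np.float64) * scale) @ x

    rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
    print(f"relative error from 8-bit weights: {rel_err:.4%}")

On toy inputs like these the answer typically lands within about a percent of the exact one, which is the kind of slack that makes trading digital precision for analog efficiency plausible.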

How it works: In a digital computer, everything runs on 1s and 0s — a universal, highly exact human-made language.

  • But an analog computer is built on the physical properties of its components. It can perform multiplication, for example, by exploiting the physical behavior of its transistors (a rough simulation of the idea follows this list).
  • “The idea is to let the natural dynamics of the physical system solve the problem,” says Garrett Kenyon of the Los Alamos National Laboratory.
  • These systems come with obstacles: They can be inconsistent and difficult to program, Kenyon says.
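
One common textbook embodiment of this principle — though not necessarily what any lab quoted here builds — is the resistive crossbar: weights are stored as conductances G and inputs applied as voltages V, so by Ohm's law each cell passes current I = G × V, and the wire collecting those currents sums the products "for free." The back-of-the-envelope simulation below also models the inconsistency Kenyon mentions as random device variation; all sizes and noise levels are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    G = rng.uniform(0.0, 1.0, size=(128, 128))   # hypothetical conductances (the "weights")
    V = rng.uniform(0.0, 1.0, size=128)          # hypothetical input voltages

    ideal = G @ V   # what an exact digital multiply-accumulate would give

    # Real devices drift from their programmed values; model ~2% variation.
    G_actual = G * (1.0 + 0.02 * rng.normal(size=G.shape))
    analog = G_actual @ V   # I = G*V per cell; each output wire sums its row's currents

    rel_err = np.linalg.norm(ideal - analog) / np.linalg.norm(ideal)
    print(f"relative error with 2% device variation: {rel_err:.4%}")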

Modern experiments with analog technology likely won't result in a completely analog computer but in a hybrid, with an analog portion that quickly approximates an answer and a digital portion that refines it — a division of labor sketched below.
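
Here is a minimal sketch of that hybrid loop, with the "analog" stage mimicked by a deliberately noisy solver (real hybrids use physical circuits; the system size, noise level, and the fast_fuzzy_solve helper are all hypothetical). The imprecise stage guesses a solution to A·x = b; the digital stage measures the leftover error exactly and hands it back for correction.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 50
    A = rng.normal(size=(n, n)) + n * np.eye(n)   # hypothetical well-conditioned system
    b = rng.normal(size=n)

    def fast_fuzzy_solve(rhs):
        """Stand-in for the analog stage: a solve corrupted by ~1% noise."""
        rough = np.linalg.solve(A, rhs)
        return rough * (1.0 + 0.01 * rng.normal(size=rough.shape))

    x = fast_fuzzy_solve(b)                 # analog part: quick, imprecise guess
    for step in range(5):
        residual = b - A @ x                # digital part: exact bookkeeping
        print(f"step {step}: residual norm = {np.linalg.norm(residual):.2e}")
        x = x + fast_fuzzy_solve(residual)  # correct the guess, imprecisely

Because the digital side measures the error exactly at each pass, the residual shrinks rapidly even though every individual analog-style solve is sloppy.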

The big picture: There’s a broader resurgence of interest in new and forgotten approaches to computing.

  • "Both of the most futuristic areas we're looking at are actually not all digital," Welser says of analog and quantum computing.
  • Researchers at Los Alamos and elsewhere are developing neuromorphic chips, which often use analog circuits to more closely mirror the behavior of neurons in the brain.

"We use ideas regarding the principle of analog computing from the old days, but had to invent completely different ways of implementing them on a modern silicon chip, and had to come up with some completely new ideas as well," says Columbia University's Yannis Tsividis, whose lab is designing hybrid technologies for scientific computing.

What’s next: Analog computing is vying to be a part of the AI explosion. "AI is obviously already a very, very huge thing," says Yablonovitch. "If analog is contributing to that, then it means it has come back after 60 years in the wilderness."

Editor's note: This story has been updated to correct the spelling of Jeff Welser's surname.
