Dec 21, 2019

Decades-old analog ideas could buoy modern AI

Illustration: Aïda Amer/Axios. Photos: Authenticated News/Getty Staff, GraphicaArtis/Getty Contributor

Returning to a technology largely discarded since the 1960s, scientists are betting on analog computing to wean AI systems off the monstrous amounts of electricity they currently require.

Why it matters: AI is on track to use up a tenth of the world's electricity by 2025, by one estimate. Cutting back on this consumption has huge climate implications — plus it’s essential for mobile devices and autonomous cars to do complex calculations on the fly.

The background: Analog computing was dethroned by today's dominant digital machines in the 1960s. Since then, computing has been about "higher speed, higher precision, higher throughput," says IBM's Jeff Welser. That's where digital tech shines.

  • But as AI becomes omnipresent, some of those core requirements of computers are being reconsidered.
  • A realization is dawning in some corners of the tech world that "maybe we were too quick to dispense with analog 60 years ago," says Eli Yablonovitch, a professor at Berkeley.

What's happening: The neural networks that drive most AI systems rely on multiplying numbers really, really fast. They currently use the precision and power of digital computing for the job. But AI computations may not need to be so precise.

  • "When you start getting pushed to the limits of what [digital computing] can offer, when you have a new class of problems, then it becomes interesting to revisit analog," says Shahin Farshchi, a computer scientist and VC at Lux Capital.
  • IBM, several startups, academic researchers and others are doing just that.
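The precision point above can be illustrated with a toy calculation. The sketch below (purely illustrative — the sizes, ranges, and quantization scheme are hypothetical, not any vendor's design) computes a neural-network-style dot product at full precision and again with every value crudely rounded to one of 256 levels, mimicking the limited precision an analog stage might offer:

```python
import random

# A neural-network layer is dominated by multiply-accumulate
# operations like this dot product.
def dot(weights, activations):
    return sum(w * a for w, a in zip(weights, activations))

def quantize(x, levels=256, lo=-1.0, hi=1.0):
    """Round a value to one of `levels` evenly spaced steps in [lo, hi],
    a crude stand-in for a low-precision analog representation."""
    step = (hi - lo) / (levels - 1)
    return lo + round((x - lo) / step) * step

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(1000)]
activations = [random.uniform(-1, 1) for _ in range(1000)]

exact = dot(weights, activations)
approx = dot([quantize(w) for w in weights],
             [quantize(a) for a in activations])

print(f"exact={exact:.4f}  approx={approx:.4f}  diff={abs(exact - approx):.4f}")
```

Because the rounding errors are small and tend to cancel across a thousand terms, the approximate sum lands very close to the exact one — a hint of why imprecise analog hardware can still be good enough for this workload.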

How it works: In a digital computer, everything runs on 1s and 0s — a universal, highly exact human-made language.

  • But an analog computer is built on the physical properties of its components. It can perform multiplication, for example, by utilizing the properties of transistors.
  • “The idea is to let the natural dynamics of the physical system solve the problem,” says Garrett Kenyon of the Los Alamos National Laboratory.
  • These systems come with obstacles: They can be inconsistent and difficult to program, Kenyon says.

Modern experiments with analog technology likely won't result in a completely analog computer but in a hybrid: an analog portion produces an approximate answer, which is then fed into a digital part for refinement.
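That approximate-then-refine loop can be sketched in a few lines. In the toy below (a hypothetical illustration, not any lab's actual design), the "analog" stage is simulated by solving a small linear system exactly and then corrupting the answer with ~5% noise; a "digital" stage repeatedly computes exact residuals and asks the noisy stage for corrections — classic iterative refinement:

```python
import random

def solve_exact(A, b):
    """Gaussian elimination with partial pivoting (the 'digital' reference)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / M[r][r]
    return x

def solve_analog(A, b, noise=0.05):
    """Stand-in for an analog solver: the right answer, corrupted by noise."""
    return [xi * (1 + random.uniform(-noise, noise)) for xi in solve_exact(A, b)]

def residual(A, b, x):
    return [bi - sum(aij * xj for aij, xj in zip(row, x))
            for row, bi in zip(A, b)]

random.seed(1)
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]

x = solve_analog(A, b)          # fast, imprecise first guess
for _ in range(10):             # digital refinement loop
    r = residual(A, b, x)       # exact residual, computed digitally
    dx = solve_analog(A, r)     # correction, again from the noisy stage
    x = [xi + di for xi, di in zip(x, dx)]

err = max(abs(ri) for ri in residual(A, b, x))
print("max residual after refinement:", err)
```

Each pass shrinks the remaining error by roughly the noise level, so even a 5%-accurate "analog" stage converges to a near-exact answer after a handful of cheap iterations.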

The big picture: There’s a broader resurgence of interest in new and forgotten approaches to computing.

  • "Both of the most futuristic areas we're looking at are actually not all digital," Welser says of analog and quantum computing.
  • Researchers at Los Alamos and elsewhere are developing neuromorphic chips, a subset of analog computing that more closely mirrors neurons in the brain.

"We use ideas regarding the principle of analog computing from the old days, but had to invent completely different ways of implementing them on a modern silicon chip, and had to come up with some completely new ideas as well," says Columbia University's Yannis Tsividis, whose lab is designing hybrid technologies for scientific computing.

What’s next: Analog computing is vying to be a part of the AI explosion. "AI is obviously already a very, very huge thing," says Yablonovitch. "If analog is contributing to that, then it means it has come back after 60 years in the wilderness."

Editor's note: This story has been updated to correct the spelling of Jeff Welser's surname.
