Updated Dec 2, 2023 - Science

An AI boost for developing new materials


Robots synthesize materials in the A-Lab at Lawrence Berkeley National Laboratory. Photo: Marilyn Sargent/Berkeley Lab

AI tools and sophisticated robotics are speeding the quest to engineer urgently needed new materials.

Driving the news: Google DeepMind researchers reported this week that a new AI model discovered more than 2.2 million hypothetical materials.

  • Of the millions of structures the AI predicted, 381,000 were stable new materials, making them top candidates for scientists to attempt to make and test in a lab.

What's happening: The batteries, solar cells, semiconductor chips and other devices required for the next generations of the electrical grid, computing and other technologies hinge on the development of novel materials.

  • That's prompted countries worldwide, including the U.S., China and India, to invest heavily in materials science and engineering.
  • AI and materials science top the list of U.S. federal grants to industry over the past six years, according to a report published this week by Georgetown's Center for Security and Emerging Technology.
  • China now dominates the field of materials engineering by several key metrics, including publications, employment and degrees awarded in the field.

Background: For the last century, new materials were discovered by taking existing ones known to be stable and substituting elements in the molecular structure to try to create a novel material.

  • That approach predicted about 48,000 stable inorganic crystal structures, more than half of them in the past decade.
  • But the process is expensive, time-consuming and, because it builds on known materials, is less likely to come up with radically different structures.
  • DeepMind achieved its feat by leveraging existing data from the Materials Project at Lawrence Berkeley National Laboratory (LBNL) and other databases to train the AI, which then expanded the dataset as it learned.

How it works: DeepMind's tool — called Graph Networks for Materials Exploration (GNoME) — uses two deep learning models that represent the atoms and bonds in a molecule as a graph.

  • One starts with known crystal structures and substitutes elements in place of others to create candidate structures. To try to move beyond what's known and come up with more diverse materials, the other model uses only a chemical formula or composition of a candidate material to predict if it is stable.
  • The pool of candidates is filtered and the energy of each structure, a measure of its stability, is determined.
  • The most promising structures are evaluated with quantum mechanics simulations, and that data is fed back into the model, a training loop known as active learning (sketched below).
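As a rough illustration of that loop, here is a minimal Python sketch. It is not DeepMind's code: generate_candidates, predict_energy and run_dft are toy stand-ins for the real substitution pipeline, graph-network model and quantum-mechanics simulations, and all numbers are arbitrary.

```python
import random

# Toy candidate pool: each entry stands in for a proposed crystal structure,
# with a hidden "true" formation energy that the expensive simulation reveals.
def generate_candidates(n=1000, seed=0):
    rng = random.Random(seed)
    return [{"id": i, "true_energy": rng.uniform(-1.0, 1.0)} for i in range(n)]

# Stand-in for the learned model: its predictions get less noisy as more
# verified examples are available, mimicking retraining on new data.
def predict_energy(candidate, n_labeled, rng):
    noise = 0.5 / (1 + n_labeled / 50)
    return candidate["true_energy"] + rng.gauss(0, noise)

# Stand-in for the expensive quantum-mechanics (DFT) calculation.
def run_dft(candidate):
    return candidate["true_energy"]

def active_learning_loop(rounds=5, batch_size=20, seed=1):
    rng = random.Random(seed)
    pool = generate_candidates()
    labeled = []  # (candidate, verified energy) pairs used as training data
    for r in range(rounds):
        # 1. Score every remaining candidate with the current model.
        scored = [(predict_energy(c, len(labeled), rng), c) for c in pool]
        # 2. Filter: keep the lowest-energy (most promising) candidates.
        scored.sort(key=lambda pair: pair[0])
        batch = [c for _, c in scored[:batch_size]]
        # 3. Verify the batch with the expensive simulation and feed the
        #    results back in, so the next round's predictions improve.
        for c in batch:
            labeled.append((c, run_dft(c)))
            pool.remove(c)
        print(f"round {r}: {len(labeled)} verified structures")
    return labeled

if __name__ == "__main__":
    active_learning_loop()
```

The point of the sketch is only the shape of the loop: cheap predictions narrow the field, expensive verification confirms the best candidates, and each verification becomes training data for the next round.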

The intrigue: GNoME appeared to learn some aspects of quantum mechanics and make predictions about structures it hadn't seen.

  • For example, the crystals the AI system was trained on consisted of only up to four elements, but it could discover five- and six-element materials, "which have proven to be difficult for human scientists," Ekin Dogus Cubuk, who leads the materials discovery team at DeepMind, said in a press briefing.
  • An AI being able to generalize beyond what it is trained on is important. "If we really want to use these models for discovery, we want to look where we haven't looked before," said Keith Butler, a professor of computational materials chemistry at University College London who wasn't involved in the research.

What's next: Predicting that a possible structure is stable doesn't guarantee it can actually be made.

  • In another paper published this week, researchers at LBNL report results from a lab outfitted with robotics guided by AI to autonomously synthesize crystals.
  • The recipes for the materials were suggested by AI models that use natural language processing to analyze existing scientific papers, and were then optimized as the system learned from its failed attempts (a simplified sketch of that loop follows this list).
  • Over 17 days of 24/7 operation, the A-Lab synthesized 41 of the 58 materials it attempted to make, or more than two materials per day.
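A toy Python sketch of that fail-and-adjust idea is below. It is not the A-Lab's software: attempt_synthesis, adjust_recipe and the recipe values are hypothetical, and the only failure mode modeled is an insufficient firing temperature.

```python
# Illustrative closed-loop synthesis sketch (not the A-Lab's actual software).

# Hidden "ground truth" the loop does not know in advance: the firing
# temperature this target actually needs (hypothetical value).
REQUIRED_TEMP_C = 950

def attempt_synthesis(recipe):
    """Pretend robotic run: succeeds only if the firing temperature is high enough."""
    return recipe["temperature_c"] >= REQUIRED_TEMP_C

def adjust_recipe(recipe, step_c=100):
    """Simple response to a failure: raise the firing temperature and retry."""
    updated = dict(recipe)
    updated["temperature_c"] += step_c
    return updated

def synthesize(target, recipe, max_attempts=5):
    for attempt in range(1, max_attempts + 1):
        if attempt_synthesis(recipe):
            return f"{target}: success on attempt {attempt} at {recipe['temperature_c']} C"
        recipe = adjust_recipe(recipe)  # learn from the failure, then try again
    return f"{target}: failed after {max_attempts} attempts"

if __name__ == "__main__":
    # Hypothetical starting recipe, as if suggested by literature mining.
    print(synthesize("example target", {"temperature_c": 700}))
```

The sketch also hints at the limitation described below: the loop can only adjust the knobs the hardware exposes.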

Yes, but: Many routes of synthesis may be difficult or expensive to automate because they involve intricate glassware, moving across a lab or other more complicated steps, Butler said.

  • The A-Lab failed to make 17 materials, in some cases because the samples needed to be heated to higher temperatures or ground more finely, standard lab steps that are outside the AI's current purview.

Ultimately, a material has to perform — conduct heat, be electronically insulating or carry out other functions.

  • But the synthesis of a material and testing of its properties are also costly and slow steps in the process of developing a new material.

The models, like many other AI systems, also don't explain how they arrived at their decisions, Butler said.

  • "If you can ask the model, 'What is it that made you make this decision?' That is where you can get new knowledge," he said.

The big picture: Competitions to predict new structures and the large language models (LLMs) that power ChatGPT and other generative AI are starting to make an impact in the field, Butler said.

  • The Materials Innovation Factory at the University of Liverpool, the University of Toronto's Acceleration Consortium and others are developing self-driving labs.
  • Some types of scientific experimentation — not everything — are "really amenable for automation with machine learning and AI," said Olexandr Isayev, a professor of chemistry at Carnegie Mellon University who works on the university's automated Cloud Lab.
  • "Software plus hardware" is where "the next advances in the sciences are going to be," he added.