
Google’s AI brainstormed thousands of potential new materials

Could some of them make for better batteries or chips?


Google’s DeepMind unit has unveiled an AI that makes it far easier to conjure up new crystal structures, in what researchers say could be a major stride toward discovering new materials for uses like batteries, superconductors, and solar panels.

DeepMind’s scientists published a paper this week claiming that the system has already predicted 2.2 million potential new materials and that 380,000 of them are stable enough to be promising candidates for actually being created in a lab. Researchers say the discovery is the equivalent of “nearly 800 years’ worth of knowledge” and greatly expands the number of stable materials known to humanity.

It’s the latest big scientific breakthrough from Google’s DeepMind, which is dedicated to putting neural networks to work on some of the world’s biggest problems. The division has previously used deep learning in a similar way to predict millions of protein structures for a database that scientists hope will fuel drug discovery. More recently, DeepMind’s scientists used AI to predict the weather more accurately than leading forecasting systems.

DeepMind said the goal of its new system, called Graph Networks for Materials Exploration, or GNoME, is to revamp the way scientists discover new crystal structures that might make for more efficient batteries or better superconductors.

Historically, that’s been a painstaking process of experimentation, altering known materials or combining different elements, and it has yielded limited results, the researchers wrote in a blog post. Various existing computational approaches have helped researchers expand the list of known materials by 28,000. But GNoME can now supercharge that progress, the researchers said.

The system works through two pipelines of neural networks: one tweaks the compositions of known stable crystal structures, while the other takes a more randomized approach. Candidates from both are then evaluated for stability through a series of calculations, and the stable ones are added to the database in a feedback loop that informs the next round of training.
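To make that feedback loop concrete, here is a minimal, hypothetical Python sketch of an active-learning cycle like the one described above. The function names, stability threshold, and toy scoring below are placeholders for illustration only; they are not DeepMind’s GNoME code, its models, or its actual stability calculations.

```python
# Hypothetical sketch of a GNoME-style active-learning loop.
# All names and numbers here are illustrative placeholders.

import random

STABILITY_THRESHOLD = 0.0   # stand-in for an energy-above-hull cutoff (eV/atom)

def generate_by_substitution(known_structures):
    """Pipeline 1: tweak the compositions of known stable structures (placeholder)."""
    return [s + "_mod" + str(random.randint(0, 9)) for s in known_structures]

def generate_random(n):
    """Pipeline 2: a more randomized search over candidate compositions (placeholder)."""
    return ["random_candidate_" + str(i) for i in range(n)]

def predict_energy(candidate):
    """Stand-in for the stability evaluation (a learned energy model plus follow-up calculations)."""
    return random.uniform(-0.5, 0.5)

def active_learning_round(known_structures):
    # 1. Generate candidates from both pipelines.
    candidates = generate_by_substitution(known_structures) + generate_random(100)
    # 2. Keep only candidates predicted to be stable.
    stable = [c for c in candidates if predict_energy(c) <= STABILITY_THRESHOLD]
    # 3. Feed the newly found stable structures back into the database,
    #    which informs the next round of generation and training.
    return known_structures + stable

database = ["known_structure_A", "known_structure_B"]
for _ in range(3):
    database = active_learning_round(database)
print(f"Database now holds {len(database)} structures (toy numbers only).")
```

The point of the sketch is the loop itself: each round’s newly validated structures enlarge the training pool, so the generators get better at proposing plausible candidates over time.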

Google DeepMind also collaborated with scientists at UC Berkeley and Lawrence Berkeley National Laboratory, who published their own paper on how an autonomous lab could use DeepMind’s findings to help create new compounds.

Google DeepMind will now share GNoME’s extensive results with the scientific community through the Materials Project, an open-access database of known materials founded by Lawrence Berkeley that was also used to train GNoME.

