Quantum machine learning for chemistry and physics
Manas Sajjan, Junxu Li, Raja Selvarajan

et al.

Chemical Society Reviews, Journal Year: 2022, Volume and Issue: 51(15), P. 6475 - 6573

Published: Jan. 1, 2022

Machine learning (ML) has emerged as a formidable force for identifying hidden but pertinent patterns within a given data set, with the objective of subsequently generating automated predictive behavior. In recent years, it is safe to conclude that ML and its close cousin, deep learning (DL), have ushered in unprecedented developments in all areas of the physical sciences, especially chemistry. Not only classical variants of ML, but even algorithms trainable on near-term quantum hardware have been developed with promising outcomes. Such algorithms have revolutionized materials design and the performance of photovoltaics, electronic structure calculations of ground and excited states of correlated matter, the computation of force fields and potential energy surfaces informing chemical reaction dynamics, reactivity-inspired rational strategies for drug design, and the classification of phases of matter with accurate identification of emergent criticality. In this review we explicate a subset of such topics and delineate the contributions made by both classical and quantum computing enhanced machine learning over the past few years. We not only present a brief overview of the well-known techniques but also highlight their learning strategies using statistical insight. The objective is to foster exposition of the aforesaid techniques and to empower and promote cross-pollination among future research in all areas of chemistry that can benefit from ML, which can in turn potentially accelerate the growth of such algorithms.

Language: English

Machine Learning Force Fields
Oliver T. Unke, Stefan Chmiela, Huziel E. Sauceda

et al.

Chemical Reviews, Journal Year: 2021, Volume and Issue: 121(16), P. 10142 - 10186

Published: March 11, 2021

In recent years, the use of machine learning (ML) in computational chemistry has enabled numerous advances previously out of reach due to the computational complexity of traditional electronic-structure methods. One of the most promising applications is the construction of ML-based force fields (FFs), with the aim of narrowing the gap between the accuracy of ab initio methods and the efficiency of classical FFs. The key idea is to learn the statistical relation between chemical structure and potential energy without relying on a preconceived notion of fixed chemical bonds or knowledge about the relevant interactions. Such universal ML approximations are in principle only limited by the quality and quantity of the reference data used to train them. This review gives an overview of applications of ML-FFs and the chemical insights that can be obtained from them. The core concepts underlying ML-FFs are described in detail, and a step-by-step guide for constructing and testing them from scratch is given. The text concludes with a discussion of the challenges that remain to be overcome by the next generation of ML-FFs.
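The key idea above, a statistical model of the structure-energy relation whose forces come out as analytic negative gradients and therefore conserve energy by construction, can be illustrated with a deliberately minimal kernel ridge regression on a one-dimensional toy diatomic. Everything here (the Morse reference curve, the kernel, and all parameters) is an illustrative sketch, not code from the review:

```python
import numpy as np

# Toy "ab initio" reference data: bond length r -> energy E(r), Morse form.
def morse(r, D=1.0, a=1.5, r0=1.0):
    return D * (1.0 - np.exp(-a * (r - r0)))**2

r_train = np.linspace(0.7, 2.5, 30)
E_train = morse(r_train)

# Gaussian kernel ridge regression: E(r) = sum_i alpha_i * k(r, r_i).
gamma, lam = 10.0, 1e-8
K = np.exp(-gamma * (r_train[:, None] - r_train[None, :])**2)
alpha = np.linalg.solve(K + lam * np.eye(len(r_train)), E_train)

def predict_energy(r):
    k = np.exp(-gamma * (r - r_train)**2)
    return k @ alpha

def predict_force(r):
    # Force = -dE/dr, from the analytic derivative of the kernel,
    # so it is exactly consistent with the predicted energy surface.
    k = np.exp(-gamma * (r - r_train)**2)
    dk = -2.0 * gamma * (r - r_train) * k
    return -(dk @ alpha)

r_test = 1.3
print(predict_energy(r_test), morse(r_test), predict_force(r_test))
```

Because the force is the exact derivative of the predicted energy, such a model can drive a (toy) molecular dynamics loop without the energy drift that independently fitted forces would cause.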

Language: English

Citations

946

E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials
Simon Batzner, Albert Musaelian, Lixin Sun

et al.

Nature Communications, Journal Year: 2022, Volume and Issue: 13(1)

Published: May 4, 2022

This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and only act on scalars, NequIP employs E(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging and diverse set of molecules and materials while exhibiting remarkable data efficiency. NequIP outperforms existing models with up to three orders of magnitude fewer training data, challenging the widely held belief that deep neural networks require massive training sets. The high data efficiency of the method allows for the construction of accurate potentials using high-order quantum chemical levels of theory as reference and enables high-fidelity molecular dynamics simulations over long time scales.
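The symmetry requirement NequIP builds in can be stated as a concrete numerical check valid for any interatomic model: the predicted energy, a scalar, must be invariant under rotations, while the forces, being vectors, must co-rotate with the frame (equivariance). A hypothetical sketch, using a Lennard-Jones pair potential as a stand-in for a learned model (all parameters illustrative):

```python
import numpy as np

def lj_energy_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones energy and analytic forces for positions of shape (N, 3)."""
    n = len(pos)
    E = 0.0
    F = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            r = np.linalg.norm(d)
            sr6 = (sigma / r)**6
            E += 4 * eps * (sr6**2 - sr6)
            fmag = 24 * eps * (2 * sr6**2 - sr6) / r**2  # -dE/dr along d/r
            F[i] += fmag * d
            F[j] -= fmag * d
    return E, F

rng = np.random.default_rng(1)
pos = rng.normal(size=(5, 3))

# A random rotation matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

E, F = lj_energy_forces(pos)
E_rot, F_rot = lj_energy_forces(pos @ Q.T)

print(np.isclose(E, E_rot))          # energy: a scalar, invariant under rotation
print(np.allclose(F_rot, F @ Q.T))   # forces: vectors, rotate with the frame
```

Equivariant architectures like NequIP guarantee both properties exactly by construction rather than asking the network to learn them from data.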

Language: English

Citations

909

Four Generations of High-Dimensional Neural Network Potentials
Jörg Behler

Chemical Reviews, Journal Year: 2021, Volume and Issue: 121(16), P. 10037 - 10072

Published: March 29, 2021

Since their introduction about 25 years ago, machine learning (ML) potentials have become an important tool in the field of atomistic simulations. After an initial decade in which neural networks were successfully used to construct potentials for rather small molecular systems, the development of high-dimensional neural network potentials (HDNNPs) in 2007 opened the way for the application of ML potentials in simulations of large systems containing thousands of atoms. To date, many other types of ML potentials have been proposed, continuously increasing the range of problems that can be studied. In this review, the methodology of the family of HDNNPs, including recent developments, will be discussed using a classification scheme into four generations of potentials, which is also applicable to many other types of ML potentials. The first generation is formed by early neural network potentials designed for low-dimensional systems. High-dimensional neural network potentials established the second generation and are based on three key steps: first, the expression of the total energy as a sum of environment-dependent atomic energy contributions; second, the description of the atomic environments by atom-centered symmetry functions as descriptors fulfilling the requirements of rotational, translational, and permutational invariance; and third, the iterative construction of the reference electronic structure data sets by active learning. In third-generation HDNNPs, long-range interactions are included in addition, employing environment-dependent partial charges expressed by atomic neural networks. In fourth-generation HDNNPs, which are just emerging, nonlocal phenomena such as long-range charge transfer can also be included. The applicability and remaining limitations of HDNNPs are discussed along with an outlook at possible future developments.
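The three second-generation ingredients listed above can be sketched in a few lines: per-atom descriptors built from atom-centered radial symmetry functions, a small per-atom network, and a total energy that is a sum of atomic contributions and is therefore permutation invariant by construction. All functional forms and parameters below are illustrative placeholders, not a real HDNNP implementation:

```python
import numpy as np

def cutoff(r, rc=4.0):
    # Smooth cosine cutoff used with Behler-Parrinello-style symmetry functions.
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def radial_G(pos, eta=0.5, rs=1.0, rc=4.0):
    """One radial symmetry function per atom: G_i = sum_j exp(-eta (r_ij - rs)^2) fc(r_ij)."""
    n = len(pos)
    G = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = np.linalg.norm(pos[i] - pos[j])
                G[i] += np.exp(-eta * (r - rs)**2) * cutoff(r, rc)
    return G

def atomic_nn(G, W1, b1, w2, b2):
    # A tiny per-atom feed-forward network mapping descriptors to atomic energies.
    h = np.tanh(np.outer(G, W1) + b1)
    return h @ w2 + b2

def total_energy(pos, params):
    # HDNNP ansatz: E_total = sum_i E_i(G_i), same network for every atom of an element.
    return atomic_nn(radial_G(pos), *params).sum()

rng = np.random.default_rng(0)
params = (rng.normal(size=5), rng.normal(size=5), rng.normal(size=5), 0.0)
pos = rng.normal(size=(6, 3)) * 2.0

E = total_energy(pos, params)
E_perm = total_energy(pos[::-1], params)   # relabel the atoms
print(np.isclose(E, E_perm))               # permutation invariance by construction
```

Because the descriptors depend only on interatomic distances, translation and rotation invariance hold automatically as well; the sum over identical atomic networks supplies the permutation invariance.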

Language: English

Citations

599

Physics-Inspired Structural Representations for Molecules and Materials
Félix Musil, Andrea Grisafi, Albert P. Bartók

et al.

Chemical Reviews, Journal Year: 2021, Volume and Issue: 121(16), P. 9759 - 9815

Published: July 26, 2021

The first step in the construction of a regression model or a data-driven analysis, aiming to predict or elucidate the relationship between the atomic-scale structure of matter and its properties, involves transforming the Cartesian coordinates of the atoms into a suitable representation. The development of such representations has played, and continues to play, a central role in the success of machine-learning methods for chemistry and materials science. This review summarizes the current understanding of the nature and characteristics of the most commonly used structural and chemical descriptions of atomistic structures, highlighting the deep underlying connections between different frameworks and the ideas that lead to computationally efficient and universally applicable models. It emphasizes the link between representations, their physical chemistry, and their mathematical description, provides examples of recent applications to a diverse set of chemistry and materials science problems, and outlines open questions and promising research directions in the field.
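A minimal example of such a representation is the eigenvalue spectrum of the Coulomb matrix, which maps Cartesian coordinates and atomic numbers to a fingerprint that is invariant under the rotations, translations, and atom relabelings that leave the physics unchanged. This is a standard introductory descriptor, used here purely as an illustration of the transformation the review discusses:

```python
import numpy as np

def coulomb_matrix_eigs(Z, pos):
    """Sorted eigenvalues of the Coulomb matrix: a rotation-, translation-,
    and permutation-invariant fingerprint of a molecular geometry."""
    n = len(Z)
    M = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                M[i, j] = 0.5 * Z[i]**2.4        # conventional self-interaction term
            else:
                M[i, j] = Z[i] * Z[j] / np.linalg.norm(pos[i] - pos[j])
    return np.sort(np.linalg.eigvalsh(M))[::-1]

# Water-like toy geometry (atomic numbers O, H, H; coordinates in arbitrary units).
Z = np.array([8, 1, 1])
pos = np.array([[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]])

d = coulomb_matrix_eigs(Z, pos)
# Invariance checks: translate, rotate, and permute the atoms.
Q, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))
print(np.allclose(d, coulomb_matrix_eigs(Z, pos + 5.0)))
print(np.allclose(d, coulomb_matrix_eigs(Z, pos @ Q.T)))
print(np.allclose(d, coulomb_matrix_eigs(Z[[1, 2, 0]], pos[[1, 2, 0]])))
```

More refined descriptors covered by the review (symmetry functions, SOAP, ACE, and others) follow the same contract, invariance to physically irrelevant transformations, while retaining far more structural detail than this global fingerprint.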

Language: English

Citations

439

The MLIP package: moment tensor potentials with MPI and active learning

Ivan S. Novikov, Konstantin Gubaev, Evgeny V. Podryabinkin

et al.

Machine Learning Science and Technology, Journal Year: 2020, Volume and Issue: 2(2), P. 025002 - 025002

Published: Nov. 12, 2020

The subject of this paper is the technology (the 'how') of constructing machine-learning interatomic potentials, rather than the science (the 'what' and 'why') of atomistic simulations using machine-learning potentials. Namely, we illustrate how to construct moment tensor potentials using active learning as implemented in the MLIP package, focusing on efficient ways to automatically sample configurations for the training set, how expanding the training set changes the error of predictions, how to set up ab initio calculations in a cost-effective manner, etc. The MLIP package (short for Machine-Learning Interatomic Potentials) is available at https://mlip.skoltech.ru/download/.
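MLIP selects configurations through a MaxVol-based extrapolation grade; the generic idea of active learning, querying the expensive reference method only where the current model is uncertain, can be sketched with a simple query-by-committee loop on a one-dimensional stand-in problem. Everything below is illustrative and is not the MLIP algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
true_f = lambda x: np.sin(2 * x)            # stand-in for an ab initio oracle

def fit_committee(x, y, n_models=8, deg=4):
    """Bootstrap-resampled polynomial fits playing the role of a model ensemble."""
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(x), len(x))
        models.append(np.polyfit(x[idx], y[idx], deg))
    return models

def disagreement(models, x):
    preds = np.array([np.polyval(c, x) for c in models])
    return preds.std(axis=0)                # ensemble spread as an uncertainty proxy

x_train = np.linspace(-1.0, 1.0, 8)         # small initial training set
pool = np.linspace(-3.0, 3.0, 200)          # unlabeled candidate configurations
for _ in range(10):
    models = fit_committee(x_train, true_f(x_train))
    pick = np.argmax(disagreement(models, pool))
    x_train = np.append(x_train, pool[pick])  # "run ab initio" only where uncertain
    pool = np.delete(pool, pick)

print(x_train[8:])                           # the actively selected configurations
```

The committee disagrees most strongly outside the initially sampled interval, so the loop spends its reference-calculation budget on extrapolative configurations, which is the same economy of effort that active learning buys in potential fitting.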

Language: English

Citations

429

Machine Learning for Electronically Excited States of Molecules
Julia Westermayr, Philipp Marquetand

Chemical Reviews, Journal Year: 2020, Volume and Issue: 121(16), P. 9873 - 9926

Published: Nov. 19, 2020

Electronically excited states of molecules are at the heart of photochemistry and photophysics, as well as photobiology, and also play a role in materials science. Their theoretical description requires highly accurate quantum chemical calculations, which are computationally expensive. In this review, we focus not only on how machine learning is employed to speed up such excited-state simulations but also on how this branch of artificial intelligence can be used to advance this exciting research field in all its aspects. Discussed applications of ML for excited states include excited-state dynamics simulations, static calculations of absorption spectra, and many others. In order to put these studies into context, we discuss the promises and pitfalls of the involved ML techniques. Since the latter are mostly based on quantum chemistry calculations, we also provide a short introduction to excited-state electronic structure methods and approaches for nonadiabatic dynamics, and describe the tricks and problems involved when using them in ML for excited states of molecules.

Language: English

Citations

343

Graph neural networks for materials science and chemistry
Patrick Reiser, Marlen Neubert, André Eberhard

et al.

Communications Materials, Journal Year: 2022, Volume and Issue: 3(1)

Published: Nov. 26, 2022

Machine learning plays an increasingly important role in many areas of chemistry and materials science, being used to predict materials properties, accelerate simulations, design new structures, and predict synthesis routes of new materials. Graph neural networks (GNNs) are one of the fastest growing classes of machine learning models. They are of particular relevance for chemistry and materials science, as they directly work on a graph or structural representation of molecules and materials and therefore have full access to all relevant information required to characterize them. In this Review, we provide an overview of the basic principles of GNNs, widely used datasets, and state-of-the-art architectures, followed by a discussion of a wide range of recent applications of GNNs in chemistry and materials science, and concluding with a roadmap for the further development and application of GNNs.
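The core GNN operation the Review surveys, nodes exchanging messages along the edges of a molecular graph and updating their feature vectors, can be written down in a few lines of linear algebra. A hypothetical minimal layer, with random weights standing in for trained ones:

```python
import numpy as np

def message_passing_step(H, A, W_msg, W_upd):
    """One GNN layer: each node aggregates messages from its neighbours
    (rows of the adjacency matrix A) and updates its own feature vector."""
    M = A @ (H @ W_msg)                 # sum of transformed neighbour features
    return np.tanh(H @ W_upd + M)       # combine with the node's own state

# Toy molecular graph: 3 atoms in a chain (0-1-2), 4 features per node.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
W_msg, W_upd = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))

H1 = message_passing_step(H, A, W_msg, W_upd)
# A graph-level readout, e.g. for predicting a molecular property:
prediction = H1.sum(axis=0)
print(H1.shape, prediction.shape)
```

Because the aggregation runs over the rows of the adjacency matrix, relabeling the atoms permutes the output rows in exactly the same way; that permutation equivariance is what makes GNNs a natural fit for molecules and crystals.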

Language: English

Citations

339

Learning local equivariant representations for large-scale atomistic dynamics
Albert Musaelian, Simon Batzner, Anders Johansson

et al.

Nature Communications, Journal Year: 2023, Volume and Issue: 14(1)

Published: Feb. 3, 2023

A simultaneously accurate and computationally efficient parametrization of the potential energy surface of molecules and materials is a long-standing goal in the natural sciences. While atom-centered message passing neural networks (MPNNs) have shown remarkable accuracy, their information propagation has limited the length scales accessible to them. Local methods, conversely, scale to large simulations but have suffered from inferior accuracy. This work introduces Allegro, a strictly local equivariant deep neural network interatomic potential architecture that exhibits excellent accuracy and scalability. Allegro represents a many-body potential using iterated tensor products of learned equivariant representations without atom-centered message passing. Allegro obtains improvements over state-of-the-art methods on the QM9 and revMD17 benchmarks. A single tensor product layer is shown to outperform existing deep MPNNs and transformers on QM9. Furthermore, Allegro displays remarkable generalization to out-of-distribution data. Molecular simulations using Allegro recover structural and kinetic properties of an amorphous electrolyte in agreement with ab-initio simulations. Finally, we demonstrate the parallel scaling of Allegro with a dynamics simulation of 100 million atoms.

Language: English

Citations

319

OrbNet: Deep learning for quantum chemistry using symmetry-adapted atomic-orbital features
Zhuoran Qiao, Matthew Welborn, Animashree Anandkumar

et al.

The Journal of Chemical Physics, Journal Year: 2020, Volume and Issue: 153(12)

Published: Sept. 25, 2020

We introduce a machine learning method in which energy solutions from the Schrödinger equation are predicted using symmetry-adapted atomic-orbital features and a graph neural-network architecture. OrbNet is shown to outperform existing methods in terms of learning efficiency and transferability for the prediction of density functional theory results while employing low-cost features that are obtained from semi-empirical electronic structure calculations. For applications to datasets of drug-like molecules, including QM7b-T, QM9, GDB-13-T, DrugBank, and the conformer benchmark dataset of Folmsbee and Hutchison [Int. J. Quantum Chem. (published online) (2020)], OrbNet predicts energies within chemical accuracy of density functional theory at a computational cost that is reduced 1000-fold or more.

Language: English

Citations

231

An accurate and transferable machine learning potential for carbon
P.N. Rowe, Volker L. Deringer, Piero Gasparotto

et al.

The Journal of Chemical Physics, Journal Year: 2020, Volume and Issue: 153(3)

Published: July 15, 2020

We present an accurate machine learning (ML) model for atomistic simulations of carbon, constructed using the Gaussian approximation potential (GAP) methodology. The potential, named GAP-20, describes the properties of the bulk crystalline and amorphous phases, crystal surfaces, and defect structures with an accuracy approaching that of direct ab initio simulation, but at a significantly reduced cost. We combine structural databases for amorphous carbon and graphene, which we extend substantially by adding suitable configurations, for example, for defects in graphene and other nanostructures. The final potential is fitted to reference data computed using the optB88-vdW density functional theory (DFT) functional. Dispersion interactions, which are crucial to describe multilayer carbonaceous materials, are therefore implicitly included. We additionally account for long-range dispersion interactions using a semianalytical two-body term and show that improved accuracy can be obtained through an optimization of the many-body smooth overlap of atomic positions descriptor. We rigorously test the potential on lattice parameters, bond lengths, formation energies, and phonon dispersions of numerous carbon allotropes. We compare the formation energies of an extensive set of defect structures, surfaces, and surface reconstructions to DFT reference calculations. This work demonstrates the ability to combine, in the same ML model, the flexibility previously attained [V. L. Deringer and G. Csányi, Phys. Rev. B 95, 094203 (2017)] with the high numerical accuracy necessary [Rowe et al., Phys. Rev. B 97, 054303 (2018)], thereby providing an interatomic potential that will be applicable to a wide range of applications concerning diverse forms of nanostructured carbon.
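The semianalytical two-body dispersion term mentioned above is, in essence, a damped -C6/r^6 sum over atom pairs added on top of the short-range ML part. A generic sketch of such a term; the Fermi-type damping form and all parameters here are placeholders, not the GAP-20 values:

```python
import numpy as np

def dispersion_energy(pos, C6=1.0, r0=3.0, d=20.0):
    """Semianalytical two-body dispersion: a damped -C6/r^6 sum over pairs.
    The damping switches the term off at short range, where the ML part of
    the potential already captures the interaction."""
    E = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            damp = 1.0 / (1.0 + np.exp(-d * (r / r0 - 1.0)))
            E += -damp * C6 / r**6
    return E

# Two "layers" pulled apart: dispersion binding decays smoothly with separation.
layer = np.array([[x, y, 0.0] for x in range(3) for y in range(3)], float)
for h in (3.0, 4.0, 6.0):
    stacked = np.vstack([layer, layer + [0, 0, h]])
    print(h, dispersion_energy(stacked))
```

Because the term is a fixed analytic pair sum, it extends the potential's reach to interlayer binding at essentially no extra cost, while the many-body ML part remains responsible for the short-range chemistry.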

Language: English

Citations

230