Global optimization with first-principles energy expressions (GOFEE) is an efficient method for identifying low-energy structures in computationally expensive landscapes such as the ones described by density functional theory (DFT), van der Waals-enabled DFT, or even methods beyond DFT.
GOFEE relies on an evolutionary algorithm that, in order to explore configuration space, creates several candidates in parallel. These are treated approximately using a machine-learned surrogate model of energies and forces, trained on the fly, eliminating the need for relaxations with the first-principles method. Eventually, guided by Bayesian statistics, the method chooses one candidate and treats it at the full first-principles level. In this paper we elaborate on the importance of using a Gaussian kernel with two length scales in the Gaussian process regression model. We further discuss the role of the lower confidence bound in the relaxation and selection of candidate structures. In addition, we present details of the sampling scheme for obtaining parents for the evolution. Using machine-learning clustering of the entire pool of structures ever calculated, and choosing the most stable member from each cluster, ensures a highly diverse sample that plays the role of the population. The versatility of the method is demonstrated by applying it to identify gas-phase fullerene-type 24-atom carbon clusters and dome-shaped 18-atom carbon clusters supported on Ir(111).
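The two-length-scale kernel and the lower-confidence-bound acquisition mentioned in this abstract can be illustrated with a minimal sketch. The function names, weights, length scales, and the value of kappa below are illustrative assumptions rather than the settings used in the paper; the surrogate relaxations in a GOFEE-style search descend an acquisition of this form instead of the predicted energy alone.

```python
import numpy as np

def two_scale_gaussian_kernel(D, l1=10.0, l2=2.0, w=0.99):
    """Sum of two squared-exponential terms acting on a matrix of
    feature-space distances D: the long length scale l1 captures global
    trends of the energy landscape while the short length scale l2 resolves
    local detail. Weights and length scales here are illustrative only."""
    return w * np.exp(-D**2 / (2 * l1**2)) + (1 - w) * np.exp(-D**2 / (2 * l2**2))

def lower_confidence_bound(energy_mean, energy_std, kappa=2.0):
    """Acquisition used for surrogate relaxation and candidate selection:
    low values favor structures predicted to be stable (low mean) and/or
    poorly explored (high uncertainty). kappa is a hypothetical
    exploration/exploitation trade-off parameter."""
    return energy_mean - kappa * energy_std
```

Relaxing and ranking candidates on this acquisition surface trades predicted stability against model uncertainty, which is what allows the search to visit regions of configuration space the surrogate has not yet learned.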
Chemical Reviews, Journal Year: 2021, Volume and Issue: 121(16), P. 10073 - 10141, Published: Aug. 16, 2021
We provide an introduction to Gaussian process regression (GPR) machine-learning methods in computational materials science and chemistry. The focus of the present review is on the regression of atomistic properties: in particular, on the construction of interatomic potentials, or force fields, in the Gaussian Approximation Potential (GAP) framework; beyond this, we also discuss the fitting of arbitrary scalar, vectorial, and tensorial quantities. Methodological aspects of reference data generation, representation, and regression, as well as the question of how a data-driven model may be validated, are reviewed and critically discussed. A survey of applications to a variety of research questions in chemistry and materials science illustrates the rapid growth of the field. A vision is outlined for the development of the methodology in the years to come.
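For readers new to the topic, the following is a minimal, self-contained sketch of the textbook GPR equations on a one-dimensional toy problem. It is not the GAP framework itself (which operates on atomic-environment kernels with sparse solvers); the kernel hyperparameters and the toy data are arbitrary.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gpr_predict(X_train, y_train, X_query, noise=1e-6, **kw):
    """Posterior mean and standard deviation of a GP conditioned on the
    training data (textbook equations, dense linear algebra only)."""
    K = rbf_kernel(X_train, X_train, **kw) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_query, X_train, **kw)
    mean = K_s @ np.linalg.solve(K, y_train)
    cov = rbf_kernel(X_query, X_query, **kw) - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# toy usage: learn a 1D "energy curve" from five samples
X = np.array([0.9, 1.0, 1.2, 1.5, 2.0])
y = (1.0 / X) ** 12 - 2 * (1.0 / X) ** 6   # Lennard-Jones-like toy energies
mu, sigma = gpr_predict(X, y, np.linspace(0.9, 2.0, 50), length_scale=0.3)
```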
Advanced Materials, Journal Year: 2019, Volume and Issue: 31(46), Published: Sept. 5, 2019
Atomic-scale modeling and understanding of materials have made remarkable progress, but they are still fundamentally limited by the large computational cost of explicit electronic-structure methods such as density-functional theory. This Progress Report shows how machine learning (ML) is currently enabling a new degree of realism in materials modeling: by "learning" electronic-structure data, ML-based interatomic potentials give access to atomistic simulations that reach similar accuracy levels but are orders of magnitude faster. A brief introduction to the new tools is given, and then applications to some select problems in materials science are highlighted: phase-change materials for memory devices; nanoparticle catalysts; and carbon-based electrodes for chemical sensing, supercapacitors, and batteries. It is hoped that the present work will inspire the development and wider use of these methods in diverse areas of materials research.
Chemical Reviews, Journal Year: 2021, Volume and Issue: 121(16), P. 9759 - 9815, Published: July 26, 2021
The first step in the construction of a regression model or a data-driven analysis, aiming to predict or elucidate the relationship between the atomic-scale structure of matter and its properties, involves transforming the Cartesian coordinates of the atoms into a suitable representation. The development of such representations has played, and continues to play, a central role in the success of machine-learning methods for chemistry and materials science. This review summarizes the current understanding of the nature and characteristics of the most commonly used structural and chemical descriptions of atomistic structures, highlighting the deep underlying connections between different frameworks and the ideas that lead to computationally efficient and universally applicable models. It emphasizes the link between these descriptions, the underlying physical chemistry, and their mathematical description, provides examples of recent applications to a diverse set of chemistry and materials science problems, and outlines open questions and promising research directions in the field.
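To make the idea of a representation concrete, the sketch below builds one deliberately simple descriptor, a Gaussian-smeared histogram of interatomic distances, which already satisfies the basic invariances (translation, rotation, permutation) required of such descriptions. The binning parameters are arbitrary and the example is not one of the specific frameworks surveyed in the review.

```python
import numpy as np

def distance_histogram(positions, r_max=6.0, n_bins=30, width=0.2):
    """A deliberately simple structural representation: a Gaussian-smeared
    histogram of all interatomic distances. It is invariant to translation,
    rotation, and atom permutation; bin settings here are arbitrary."""
    pos = np.asarray(positions, dtype=float)
    i, j = np.triu_indices(len(pos), k=1)
    d = np.linalg.norm(pos[i] - pos[j], axis=1)
    centers = np.linspace(0.0, r_max, n_bins)
    return np.exp(-0.5 * ((d[:, None] - centers[None, :]) / width) ** 2).sum(axis=0)

# toy usage: fingerprint of a water-like geometry (coordinates in Angstrom)
x = distance_histogram([[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]])
```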
The Journal of Physical Chemistry Letters, Journal Year: 2020, Volume and Issue: 11(6), P. 2336 - 2347, Published: March 3, 2020
As the quantum chemistry (QC) community embraces machine learning (ML), the number of new methods and applications based on the combination of QC and ML is surging. In this Perspective, a view of the current state of affairs in this exciting research field is offered, the challenges of using ML in QC research are described, and potential future developments are outlined. Specifically, examples of how ML is used to improve the accuracy of and to accelerate quantum chemical research are shown. A generalization and classification of existing techniques are provided to ease navigation in the sea of literature and to guide researchers entering the field. The emphasis of this Perspective is on supervised machine learning.
The Journal of Chemical Physics, Journal Year: 2020, Volume and Issue: 152(5), Published: Feb. 5, 2020
The use of supervised machine learning to develop fast and accurate interatomic potential models is transforming molecular and materials research by greatly accelerating atomic-scale simulations with little loss of accuracy. Three years ago, Jörg Behler published a perspective in this journal providing an overview of some of the leading methods in this field. In this perspective, we provide an updated discussion of recent developments, emerging trends, and promising areas for future research. We include a discussion of three emerging approaches to developing machine-learned interatomic potentials that have not been extensively discussed in existing reviews: moment tensor potentials, message-passing networks, and symbolic regression.
Chemical Reviews, Journal Year: 2022, Volume and Issue: 122(12), P. 10970 - 11021, Published: May 16, 2022
Rechargeable batteries have become indispensable implements in our daily life and are considered a promising technology for constructing sustainable energy systems in the future. The liquid electrolyte is one of the most important parts of a battery and is extremely critical in stabilizing the electrode–electrolyte interfaces and constructing safe and long-life-span batteries. Tremendous efforts have been devoted to developing new solvents, salts, additives, and recipes, where molecular dynamics (MD) simulations play an increasingly important role in exploring electrolyte structures, physicochemical properties such as ionic conductivity, and interfacial reaction mechanisms. This review affords an overview of applying MD simulations to the study of electrolytes for rechargeable batteries. First, the fundamentals and recent theoretical progress of three classes of MD simulations are summarized, including classical, ab initio, and machine-learning MD (section 2). Next, the application of MD simulations to the exploration of electrolytes is presented sequentially, covering the probing of bulk structures (section 3), the derivation of macroscopic properties such as ionic conductivity and the dielectric constant (section 4), and the revealing of interfacial reaction mechanisms (section 5). Finally, a general conclusion and an insightful perspective on current challenges and future directions are provided. Machine-learning technologies are highlighted as a way to figure out these challenging issues facing the field and to promote the rational design of advanced electrolytes for next-generation batteries.
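As one concrete example of deriving a macroscopic property from an MD trajectory (the topic of section 4 of this review), the sketch below implements the common Nernst-Einstein estimate of ionic conductivity from mean-squared displacements. It is a generic textbook route, not necessarily the formulation detailed in the review, and it neglects ion-ion correlations.

```python
import numpy as np

# Physical constants (SI)
E_CHARGE = 1.602176634e-19   # C
K_B = 1.380649e-23           # J/K

def nernst_einstein_conductivity(msd, times, n_ions, volume, z=1, T=300.0):
    """Nernst-Einstein estimate of ionic conductivity from an MD trajectory:
    the self-diffusion coefficient D is taken from the slope of the
    mean-squared displacement (MSD = 6*D*t in 3D), and
    sigma = n * (z*e)^2 * D / (k_B * T) with number density n = n_ions/volume.
    Inputs are assumed to be in SI units (m^2, s, m^3); ion-ion correlations
    are neglected, which is the main approximation of this estimate."""
    slope = np.polyfit(times, msd, 1)[0]              # m^2 / s
    D = slope / 6.0                                   # self-diffusion coefficient
    n = n_ions / volume                               # number density, 1/m^3
    return n * (z * E_CHARGE) ** 2 * D / (K_B * T)    # S/m
```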
The Journal of Chemical Physics, Journal Year: 2020, Volume and Issue: 152(4), Published: Jan. 27, 2020
We introduce the FCHL19 representation for atomic environments in molecules or condensed-phase systems. Machine learning models based on FCHL19 are able to yield predictions of the forces and energies of query compounds with chemical accuracy on the scale of milliseconds. FCHL19 is a revision of our previous work [F. A. Faber et al., J. Chem. Phys. 148, 241717 (2018)] where the representation is discretized and the individual features are rigorously optimized using Monte Carlo optimization. Combined with a Gaussian kernel function that incorporates elemental screening, chemical accuracy is reached for energy learning on the QM7b and QM9 datasets after training for minutes and hours, respectively. The model also shows good performance for non-bonded interactions in the condensed phase for a set of water clusters, with a mean absolute error (MAE) in the binding energy of less than 0.1 kcal/mol/molecule after training on 3200 samples. For force learning on the MD17 dataset, the model similarly displays state-of-the-art accuracy with a regressor based on Gaussian process regression. When the revised FCHL19 representation is combined with the operator quantum machine learning regressor, both forces and energies can be predicted in only a few milliseconds per atom. The model presented herein is fast and lightweight enough for use in general chemistry problems as well as molecular dynamics simulations.
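The energy-learning pipeline described above can be sketched generically as kernel ridge regression on atomic-environment vectors, with the molecular kernel taken as a sum of local Gaussian kernels. This is only an illustration of the general approach; it omits FCHL19's elemental screening and the operator-based force learning, and the hyperparameters are placeholders.

```python
import numpy as np

def local_gaussian_kernel(X_a, X_b, sigma=4.0):
    """Gaussian kernel between two sets of atomic-environment vectors."""
    d2 = ((X_a[:, None, :] - X_b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def molecular_kernel(mol_a, mol_b, sigma=4.0):
    """Kernel between two molecules, taken as the sum of all pairwise
    atomic-environment kernels, so that the learned energy decomposes into
    atomic contributions. (The actual FCHL19 kernel additionally screens by
    element; that refinement is omitted here.)"""
    return local_gaussian_kernel(mol_a, mol_b, sigma).sum()

def train_krr(mols, energies, lam=1e-8, sigma=4.0):
    """Kernel ridge regression coefficients for molecular energies; each
    entry of mols is an (n_atoms, n_features) array of environment vectors."""
    K = np.array([[molecular_kernel(a, b, sigma) for b in mols] for a in mols])
    return np.linalg.solve(K + lam * np.eye(len(mols)), energies)
```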
We develop a neuroevolution-potential (NEP) framework for generating neural network-based machine-learning potentials. They are trained using an evolutionary strategy for performing large-scale molecular dynamics (MD) simulations. A descriptor of the atomic environment is constructed based on Chebyshev and Legendre polynomials. The method is implemented on graphics processing units within the open-source gpumd package, which can attain a computational speed of over $10^{7}$ atom-step per second on one Nvidia Tesla V100. Furthermore, the per-atom heat current is available in NEP, which paves the way for efficient and accurate MD simulations of heat transport in materials with strong phonon anharmonicity or spatial disorder, which usually cannot be accurately treated by either traditional empirical potentials or perturbative methods.
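A rough sketch of how Chebyshev and Legendre polynomials enter such a descriptor is given below, using numpy's polynomial classes. The scaling of the radial variable, the cutoff function, and the basis sizes are illustrative assumptions and do not reproduce the exact NEP definitions.

```python
import numpy as np
from numpy.polynomial import chebyshev, legendre

def radial_functions(r, r_cut=5.0, n_max=8):
    """Chebyshev radial basis evaluated at a neighbor distance r, multiplied
    by a smooth cutoff; the mapping to x in [-1, 1] and the cutoff form are
    illustrative choices, not the exact NEP definitions."""
    x = 2.0 * r / r_cut - 1.0
    fc = 0.5 * (np.cos(np.pi * r / r_cut) + 1.0) if r < r_cut else 0.0
    return np.array([chebyshev.Chebyshev.basis(n)(x) for n in range(n_max)]) * fc

def angular_functions(cos_theta, l_max=4):
    """Legendre polynomials of the cosine of the angle between two neighbor
    bonds, the building block of the angular descriptor components."""
    return np.array([legendre.Legendre.basis(l)(cos_theta) for l in range(l_max + 1)])
```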
The Journal of Chemical Physics, Journal Year: 2022, Volume and Issue: 157(11), Published: Aug. 24, 2022
We present our latest advancements of machine-learned potentials (MLPs) based on the neuroevolution potential (NEP) framework introduced in [Fan et al., Phys. Rev. B 104, 104309 (2021)] and their implementation in the open-source package GPUMD. We increase the accuracy of NEP models both by improving the radial functions in the atomic-environment descriptor using a linear combination of Chebyshev basis functions and by extending the angular descriptor with some four-body and five-body contributions as in the atomic cluster expansion approach. We also detail our efficient implementation of the approach on graphics processing units, as well as our workflow for the construction of NEP models, and we demonstrate their application in large-scale atomistic simulations. By comparing to state-of-the-art MLPs, we show that the NEP approach not only achieves above-average accuracy but is also far more computationally efficient. These results demonstrate that GPUMD is a promising tool for solving challenging problems requiring highly accurate, large-scale atomistic simulations. To enable the construction of MLPs using a minimal training set, we propose an active-learning scheme based on the latent space of a pre-trained NEP model. Finally, we introduce three separate Python packages, GPYUMD, CALORINE, and PYNEP, which enable the integration of GPUMD into Python workflows.
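A generic illustration of latent-space active learning is diversity-based selection, for instance farthest-point sampling over the latent vectors of candidate structures, as sketched below. This is only one possible realization and is not claimed to be the exact criterion proposed in the paper or implemented in GPYUMD, CALORINE, or PYNEP.

```python
import numpy as np

def farthest_point_selection(latent, n_select, seed_index=0):
    """Generic diversity-based selection in a latent space: repeatedly pick
    the candidate farthest from everything already selected. This is only
    one possible realization of latent-space active learning, not
    necessarily the criterion used in the NEP/GPUMD workflow."""
    latent = np.asarray(latent, dtype=float)
    selected = [seed_index]
    min_dist = np.linalg.norm(latent - latent[seed_index], axis=1)
    for _ in range(n_select - 1):
        idx = int(np.argmax(min_dist))   # farthest remaining candidate
        selected.append(idx)
        min_dist = np.minimum(min_dist, np.linalg.norm(latent - latent[idx], axis=1))
    return selected

# toy usage: pick 5 maximally spread structures out of 200 latent vectors
picks = farthest_point_selection(np.random.rand(200, 16), n_select=5)
```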