Journal of Materials Research,
Journal Year: 2023, Volume and Issue: 38(24), P. 5136 - 5150
Published: Sept. 18, 2023
Abstract
Machine learning (ML) enables the development of interatomic potentials with the accuracy of first principles methods while retaining the speed and parallel efficiency of empirical potentials. While ML potentials traditionally use atom-centered descriptors as inputs, different models such as linear regression and neural networks can map these descriptors to atomic energies and forces. This begs the question: what is the improvement in accuracy due to model complexity, irrespective of the descriptors? We curate three datasets to investigate this question in terms of ab initio energy and force errors: (1) solid and liquid silicon, (2) gallium nitride, and (3) the superionic conductor Li$$_{10}$$Ge(PS$$_{6}$$)$$_{2}$$ (LGPS). We further investigate how these errors affect simulated properties and verify whether improvement in fitting corresponds to a measurable improvement in property prediction. By assessing several models, we observe correlations between the error in a given quantity (e.g., the force) and the error in simulated properties with respect to ab initio values.
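The comparison described here is between models of different complexity fitted to the same atom-centered descriptors. As a rough, hedged illustration only (not the authors' code), the sketch below fits a ridge regression and a small neural network to one shared, synthetic descriptor matrix and compares their test errors; every array, size, and target in it is a placeholder assumption.

# Hedged sketch: same descriptors, two models of different complexity.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 30))                              # hypothetical per-atom descriptors
y = X @ rng.normal(size=30) + 0.1 * np.sin(X).sum(axis=1)    # toy "atomic energies"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

linear = Ridge(alpha=1e-3).fit(X_tr, y_tr)
nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                  random_state=0).fit(X_tr, y_tr)

for name, model in [("linear regression", linear), ("neural network", nn)]:
    mae = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name}: test MAE = {mae:.4f} (same descriptors, different model complexity)")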
The Journal of Chemical Physics,
Journal Year: 2023, Volume and Issue: 158(20)
Published: May 22, 2023
A reliable uncertainty estimator is a key ingredient in the successful use of machine-learning force fields for predictive calculations. Important considerations are correlation with the error, overhead during training and inference, and efficient workflows to systematically improve the force field. However, in the case of neural-network force fields, simple committees are often the only option considered due to their easy implementation. Here, we present a generalization of the deep-ensemble design based on multiheaded neural networks and a heteroscedastic loss. It can efficiently deal with uncertainties in both energies and forces and take sources of aleatoric uncertainty affecting the training data into account. We compare uncertainty metrics for deep ensembles, committees, and bootstrap-aggregation ensembles using data for an ionic liquid and a perovskite surface. We also demonstrate an adversarial approach to active learning that progressively refines the force fields. That workflow is made realistically possible thanks to exceptionally fast training enabled by residual learning and a nonlinear learned optimizer.
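The two named ingredients, a multiheaded network and a heteroscedastic loss, can be sketched generically. The snippet below is a hedged illustration (not the authors' implementation) of a two-output energy head and the standard heteroscedastic Gaussian negative log-likelihood in PyTorch; the head sizes, feature shapes, and log-variance parameterization are illustrative assumptions.

import torch
import torch.nn as nn

class EnergyHead(nn.Module):
    """Illustrative two-output head: predicted energy mean and log-variance."""
    def __init__(self, n_features: int):
        super().__init__()
        self.mean = nn.Linear(n_features, 1)
        self.log_var = nn.Linear(n_features, 1)

    def forward(self, h):
        return self.mean(h), self.log_var(h)

def heteroscedastic_nll(mean, log_var, target):
    """Gaussian negative log-likelihood with input-dependent (aleatoric) variance."""
    return 0.5 * (log_var + (target - mean) ** 2 / log_var.exp()).mean()

# Toy usage: random features stand in for learned atomic representations.
head = EnergyHead(n_features=16)
h = torch.randn(8, 16)       # hypothetical per-structure features
E_ref = torch.randn(8, 1)    # hypothetical reference energies
mean, log_var = head(h)
loss = heteroscedastic_nll(mean, log_var, E_ref)
loss.backward()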
Chemical Reviews,
Journal Year: 2024, Volume and Issue: 124(24), P. 13681 - 13714
Published: Nov. 21, 2024
The field of data-driven chemistry is undergoing an evolution, driven by innovations in machine learning models for predicting molecular properties and behavior. Recent strides in ML-based interatomic potentials (MLIPs) have paved the way for accurate modeling of diverse chemical and structural properties at the atomic level. A key determinant defining MLIP reliability remains the quality of the training data. A paramount challenge lies in constructing training sets that capture specific domains of a vast chemical space. This Review navigates the intricate landscape of the essential components and integrity of training data that ensure the extensibility and transferability of the resulting models. We delve into the details of active learning, discussing its various facets and implementations. We outline different types of uncertainty quantification applied to atomistic data acquisition and the correlations between estimated and true errors. The role of samplers in generating informative structures is highlighted. Furthermore, we discuss data generation via modified and surrogate potential energy surfaces as an innovative approach to diversify training data. The Review also provides a list of publicly available datasets that cover …
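The acquisition loop the Review discusses (estimate uncertainty on candidate structures, acquire the most uncertain ones, label them, retrain) can be illustrated generically. The sketch below is a hedged, minimal query-by-committee toy in Python: committee disagreement stands in for uncertainty, and a stub function stands in for an ab initio labeling step. None of the names, sizes, or thresholds come from the Review.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def dft_label(X):
    """Stub standing in for an expensive ab initio calculation."""
    return np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

# Candidate pool (hypothetical descriptors) and a small initial labeled set.
pool = rng.uniform(-3, 3, size=(5000, 2))
X_train = rng.uniform(-3, 3, size=(20, 2))
y_train = dft_label(X_train)

for iteration in range(5):
    # Committee of models trained on bootstrap resamples of the current data.
    committee = []
    for seed in range(5):
        idx = rng.integers(0, len(X_train), len(X_train))
        committee.append(RandomForestRegressor(random_state=seed).fit(X_train[idx], y_train[idx]))

    # Uncertainty proxy: disagreement (std) of committee predictions on the pool.
    preds = np.stack([m.predict(pool) for m in committee])
    uncertainty = preds.std(axis=0)

    # Acquire the most uncertain candidates, label them, grow the training set.
    pick = np.argsort(uncertainty)[-10:]
    X_train = np.vstack([X_train, pool[pick]])
    y_train = np.concatenate([y_train, dft_label(pool[pick])])
    print(f"iteration {iteration}: max committee std = {uncertainty.max():.3f}")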
The Journal of Physical Chemistry Letters,
Journal Year: 2024, Volume and Issue: 15(30), P. 7539 - 7547
Published: July 18, 2024
Ionic liquids (ILs) are an exciting class of electrolytes finding applications in many areas, from energy storage to solvents, where they have been touted as "designer solvents" that can be mixed to precisely tailor the physicochemical properties. As using machine learning interatomic potentials (MLIPs) to simulate ILs is still relatively unexplored, several questions need to be answered to see if MLIPs can be transformative for ILs. Since ILs are often not used pure, but are either mixed together or contain additives, we first demonstrate that an MLIP can be trained to be compositionally transferable; i.e., it can be applied to mixtures of ions it was not directly trained on, while only being trained on a few mixtures of the same ions. We also investigated the accuracy of an MLIP for a novel IL, which we experimentally synthesize and characterize. Our MLIP, trained on ∼200 DFT frames, shows reasonable agreement with both our experiments and DFT.
npj Computational Materials,
Journal Year: 2024, Volume and Issue: 10(1)
Published: April 29, 2024
Abstract
Efficiently creating a concise but comprehensive data set for training machine-learned interatomic potentials (MLIPs) is an under-explored problem. Active learning, which uses biased or unbiased molecular dynamics (MD) to generate candidate pools, aims to address this objective. Existing biased and unbiased MD-simulation methods, however, are prone to miss either rare events or extrapolative regions (areas of the configurational space where unreliable predictions are made). This work demonstrates that MD, when biased by the MLIP's energy uncertainty, simultaneously captures extrapolative regions and rare events, which is crucial for developing uniformly accurate MLIPs. Furthermore, exploiting automatic differentiation, we enhance bias-forces-driven MD with the concept of bias stress. We employ calibrated gradient-based uncertainties to yield MLIPs with similar or, sometimes, better accuracy than ensemble-based methods at a lower computational cost. Finally, we apply uncertainty-biased MD to alanine dipeptide and MIL-53(Al), generating data sets that represent both configurational spaces more accurately than models trained with conventional MD.
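The bias-forces idea in this abstract, running MD on an energy surface shifted by the model's predicted uncertainty and obtaining the bias forces by automatic differentiation, can be sketched generically. The snippet below is a hedged PyTorch illustration with toy stand-ins for the MLIP energy and uncertainty; the biased energy E - k·sigma and the bias strength k are illustrative assumptions, not the paper's exact formulation.

import torch

def model_energy_and_uncertainty(pos):
    """Toy stand-ins for an MLIP energy and its predicted uncertainty.

    Both are differentiable functions of the atomic positions, which is all
    the bias-force construction needs.
    """
    energy = (pos ** 2).sum()                      # placeholder potential
    sigma = torch.exp(-((pos - 1.0) ** 2).sum())   # placeholder uncertainty
    return energy, sigma

pos = torch.randn(8, 3, requires_grad=True)        # 8 atoms, Cartesian coordinates
k = 0.5                                            # illustrative bias strength

energy, sigma = model_energy_and_uncertainty(pos)
biased_energy = energy - k * sigma                 # drives MD toward uncertain regions

# Bias forces follow from automatic differentiation of the biased energy.
forces = -torch.autograd.grad(biased_energy, pos)[0]
print(forces.shape)  # (8, 3): one force vector per atom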
The Journal of Physical Chemistry Letters,
Journal Year: 2024, Volume and Issue: 15(14), P. 3740 - 3747
Published: March 28, 2024
Machine learning interatomic potentials (MLIPs) have emerged as a technique that promises quantum theory accuracy for reduced cost. It has been proposed [J. Chem. Phys. 2023, 158, 084111] that MLIPs trained solely on liquid water data cannot accurately transfer to the vapor-liquid equilibrium while recovering the many-body decomposition (MBD) analysis of gas-phase clusters. This suggests that MLIPs do not directly learn the physically correct interactions between molecules, limiting their transferability. In this work, we show that an MLIP using an equivariant architecture and trained on 3200 liquid water structures reproduces liquid-phase properties (e.g., density within 0.003 g/cm3 between 230 and 365 K), vapor-liquid equilibrium up to 550 K, the MBD analysis of gas-phase clusters up to six-body interactions, and the relative energies and vibrational states of ice phases. The potentials we developed allow transferability to arbitrary phases and remain stable in nanosecond-long simulations.
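For reference, the many-body decomposition (MBD) referred to here is the standard many-body expansion of a cluster energy into monomer, pair, triple, and higher contributions; in generic notation (not this paper's own), for N monomers

$$E_{1\ldots N}=\sum_i E_i^{(1)}+\sum_{i<j}\Delta E_{ij}^{(2)}+\sum_{i<j<k}\Delta E_{ijk}^{(3)}+\cdots,\qquad \Delta E_{ij}^{(2)}=E_{ij}-E_i-E_j,$$

so "up to six-body interactions" means the expansion is resolved through the $$\Delta E^{(6)}$$ terms.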
Machine Learning Science and Technology,
Journal Year: 2024, Volume and Issue: 5(3), P. 035006 - 035006
Published: June 17, 2024
Abstract
Statistical learning algorithms provide a generally applicable framework to sidestep time-consuming experiments, or accurate physics-based modeling, but they introduce a further source of error on top of the intrinsic limitations of the experimental or theoretical setup. Uncertainty estimation is essential to quantify this error and to make the application of data-centric approaches more trustworthy. To ensure that uncertainty quantification is used widely, one should aim for schemes that are accurate, but also easy to implement and apply. In particular, including uncertainty quantification in an existing architecture should be straightforward and add minimal computational overhead. Furthermore, it should be easy to manipulate or combine multiple machine-learning predictions, propagating uncertainties over successive modeling steps. We compare several well-established uncertainty-quantification frameworks against these requirements and propose a practical approach, which we dub direct propagation of shallow ensembles, that provides a good compromise between ease of use and accuracy. We present benchmarks for generic datasets and an in-depth study of applications to the field of atomistic machine learning for chemistry and materials. These examples underscore the importance of using a formulation that allows propagating errors without making strong assumptions on the correlations between different predictions of the model.
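The "direct propagation" idea, keeping every ensemble member's prediction and pushing each one through subsequent modeling steps rather than propagating only a mean and a standard deviation, can be illustrated generically. The sketch below is a hedged NumPy toy (a last-layer "shallow" ensemble is mimicked by random last-layer weights on shared features) and is not the paper's implementation; the derived quantity and all sizes are illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Shared learned features for a batch of inputs (stand-in for the network body).
features = rng.normal(size=(100, 32))

# A "shallow" ensemble: several last-layer weight vectors on top of the same features.
n_members = 16
last_layers = rng.normal(size=(n_members, 32))

# One prediction per ensemble member and per input: shape (n_members, n_inputs).
member_preds = last_layers @ features.T

# Direct propagation: push EVERY member through the downstream modeling step
# (here an arbitrary nonlinear derived quantity) and only then take statistics.
derived = np.exp(-0.1 * member_preds).mean(axis=1)   # one value per member

mean = derived.mean()
std = derived.std(ddof=1)
print(f"derived quantity: {mean:.4f} +/- {std:.4f}")

# Propagating only the mean prediction instead would discard the correlations
# between members and misestimate the uncertainty of the derived quantity.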
Journal of the American Ceramic Society,
Journal Year: 2024, Volume and Issue: unknown
Published: June 9, 2024
Abstract
The emergence of artificial intelligence has provided efficient methodologies to pursue innovative findings in materials science. Over the past two decades, the machine-learning potential (MLP) has emerged as an alternative technology to density functional theory (DFT) and classical molecular dynamics (CMD) simulations for the computational modeling of materials and the estimation of their properties. An MLP offers more efficient computation compared to DFT, while providing higher accuracy than CMD. This enables us to conduct more realistic simulations using models with more atoms and longer simulation times. Indeed, the number of research studies utilizing MLPs has significantly increased since 2015, covering a broad range of structures, ranging from simple to complex, as well as various chemical and physical phenomena. As a result, there are high expectations for further applications in the field of materials science and industrial development. This review aims to summarize the applications, particularly in ceramics and glass science, and the fundamental theories of MLPs to facilitate future progress in their utilization. Finally, we provide a summary and discuss perspectives on the next challenges in the development and application of MLPs.
This work brings the leading accuracy, sample efficiency, and robustness of deep equivariant neural networks to extreme computational scale. This is achieved through a combination of innovative model architecture, massive parallelization, and models and implementations optimized for efficient GPU utilization. The resulting Allegro architecture bridges the accuracy-speed tradeoff of atomistic simulations and enables the description of dynamics in structures of unprecedented complexity at quantum fidelity. To illustrate the scalability of Allegro, we perform nanoseconds-long stable simulations of protein dynamics and scale up to a 44-million atom structure of a complete, all-atom, explicitly solvated HIV capsid on the Perlmutter supercomputer. We demonstrate excellent strong scaling up to 100 million atoms and 70% weak scaling to 5120 A100 GPUs.