arXiv (Cornell University), Journal Year: 2023, Volume and Issue: unknown. Published: Jan. 1, 2023.
We present a novel probabilistic deep learning approach, the 'Stochastic Latent Transformer' (SLT), designed for efficient reduced-order modelling of stochastic partial differential equations. Stochastically driven flow models are pertinent to a diverse range of natural phenomena, including jets on giant planets, ocean circulation, and the variability of midlatitude weather. However, much recent progress in deep learning has predominantly focused on deterministic systems. The SLT comprises a stochastically-forced transformer paired with a translation-equivariant autoencoder, trained towards the Continuous Ranked Probability Score. We showcase its effectiveness by applying it to a well-researched zonal jet system, where the interaction between stochastically forced eddies and the mean flow results in rich low-frequency variability. The SLT accurately reproduces system dynamics across various integration periods, validated through quantitative diagnostics that include spectral properties and the rate of transitions between distinct states. It achieves a five-order-of-magnitude speedup in emulating the zonally-averaged flow compared to direct numerical simulations. This acceleration facilitates the cost-effective generation of large ensembles, enabling the exploration of statistical questions concerning the probabilities of spontaneous transition events.
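The Continuous Ranked Probability Score named above is a standard ensemble-forecast objective. As a minimal illustration (not the authors' implementation), the usual ensemble estimator of the CRPS can be computed as follows; the ensemble values are arbitrary.

```python
import numpy as np

def ensemble_crps(ensemble: np.ndarray, observation: float) -> float:
    """Ensemble estimator of the Continuous Ranked Probability Score.

    CRPS = E|X - y| - 0.5 * E|X - X'|, estimated from ensemble members.
    Lower is better; it reduces to absolute error for a single member.
    """
    x = np.asarray(ensemble, dtype=float)
    term_obs = np.mean(np.abs(x - observation))           # spread around the observation
    term_pair = np.mean(np.abs(x[:, None] - x[None, :]))  # pairwise member spread
    return term_obs - 0.5 * term_pair

# Example: a 5-member ensemble forecast of a scalar quantity
print(ensemble_crps(np.array([0.9, 1.1, 1.0, 1.3, 0.8]), observation=1.0))
```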
Physics of Fluids, Journal Year: 2024, Volume and Issue: 36(2). Published: Feb. 1, 2024.
Traditional fluid–structure interaction (FSI) simulation is computationally demanding, especially for bi-directional FSI problems. To address this, a masked deep neural network (MDNN) is developed to quickly and accurately predict the unsteady flow field. By integrating the MDNN with a structural dynamic solver, an FSI system is proposed to perform simulations of a flexible vertical plate oscillating in a fluid with large deformation. The results show that both the flow field prediction and the structural response are consistent with those of the traditional FSI system. Furthermore, the method is highly effective in mitigating error accumulation during temporal predictions, making it applicable to various deformation cases. Notably, the model reduces the computational time to the millisecond scale for each step of the fluid part, resulting in an increase of nearly two orders of magnitude in computational speed, which greatly enhances the overall simulation speed.
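For context, a partitioned coupling loop of the kind described here, where a learned flow surrogate supplies the fluid load and a structural solver advances the state, can be sketched as below; the `flow_surrogate` function, the single-degree-of-freedom structure, and all coefficients are hypothetical stand-ins, not the paper's MDNN or solver.

```python
# Minimal sketch of a surrogate-coupled partitioned FSI loop, under the
# assumption of a single-degree-of-freedom structure standing in for the plate.

def flow_surrogate(displacement: float, velocity: float) -> float:
    """Placeholder for a trained network predicting the fluid force."""
    return -0.5 * displacement - 0.1 * velocity  # hypothetical response

def step_structure(u, v, force, dt=1e-3, m=1.0, c=0.05, k=4.0):
    """Semi-implicit Euler step of m*u'' + c*u' + k*u = force."""
    a = (force - c * v - k * u) / m
    v = v + dt * a
    u = u + dt * v
    return u, v

u, v = 0.1, 0.0                      # initial displacement and velocity
for _ in range(1000):
    f = flow_surrogate(u, v)         # fluid load from the surrogate
    u, v = step_structure(u, v, f)   # structural solver advances the state
print(u, v)
```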
Physics of Fluids, Journal Year: 2023, Volume and Issue: 35(7). Published: July 1, 2023.
Data-driven prediction of laminar and turbulent flow in marine and aerospace engineering has received extensive research attention and has recently demonstrated its potential for real-time prediction. However, large amounts of high-fidelity data are usually required to describe and accurately predict the complex physical information, while in reality only limited high-fidelity data are available due to the high experimental/computational cost. Therefore, this work proposes a novel multi-fidelity learning method based on the Fourier neural operator that joins abundant low-fidelity data with scarce high-fidelity data under a transfer learning paradigm. First, as a resolution-invariant operator, the Fourier neural operator is gainfully applied to integrate multi-fidelity data directly, which allows both fidelity levels to be utilized simultaneously. Then, a transfer learning framework is developed for the current task, extracting the rich knowledge contained in the low-fidelity data to assist the high-fidelity modeling and training and further improve the data-driven prediction accuracy. Finally, three application problems are chosen to validate the accuracy of the proposed model. The results demonstrate its effectiveness when compared with other models, with an accuracy of 99% on all selected field problems; the model trained without the low-fidelity data reaches 86%. Significantly, the proposed model combines a simple structure with high precision on fluid prediction problems and can provide a reference for the construction of subsequent models.
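The transfer-learning idea described here (pre-train on abundant low-fidelity data, then adapt to scarce high-fidelity data) can be sketched generically as below; the plain fully connected network, layer sizes, and synthetic tensors are illustrative assumptions and not the paper's Fourier neural operator.

```python
import torch
from torch import nn

# Generic low-to-high-fidelity transfer-learning sketch (not the paper's FNO).
model = nn.Sequential(
    nn.Linear(64, 128), nn.GELU(),   # "feature" layers reused across fidelities
    nn.Linear(128, 128), nn.GELU(),
    nn.Linear(128, 64),              # "head" layer adapted to high-fidelity data
)

def fit(model, x, y, params, epochs=200, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()

# Stage 1: pre-train every parameter on (synthetic) low-fidelity data.
x_lo, y_lo = torch.randn(512, 64), torch.randn(512, 64)
fit(model, x_lo, y_lo, model.parameters())

# Stage 2: freeze the feature layers and fine-tune the head on scarce high-fidelity data.
for layer in list(model)[:-1]:
    for p in layer.parameters():
        p.requires_grad_(False)
x_hi, y_hi = torch.randn(16, 64), torch.randn(16, 64)
fit(model, x_hi, y_hi, [p for p in model.parameters() if p.requires_grad])
```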
Physics of Fluids, Journal Year: 2024, Volume and Issue: 36(2). Published: Feb. 1, 2024.
Porosity, as a key parameter to describe the properties of rock reservoirs, is essential for evaluating the permeability and fluid migration performance of underground rocks. In order to overcome the limitations of traditional logging porosity interpretation methods in the face of geological complexity and nonlinear relationships, this study introduces a CNN (convolutional neural network)-transformer model, which aims to improve the accuracy and generalization ability of porosity prediction. CNNs have excellent spatial feature capture capabilities: the convolution operation can effectively learn the mapping relationship between local features and thus better capture the correlation within the well log. Transformer models are able to capture complex sequence relationships between different depths or time points, which enables the model to integrate information from different depths and times and improves prediction accuracy. We trained the model on a well-log dataset to ensure that it has good generalization ability. In addition, we comprehensively compare the CNN-transformer model with other machine learning models to verify its superiority. The analysis of the experimental results shows that the model performs well on the porosity prediction task. Its introduction will bring a new perspective to the development of logging interpretation technology and provide a more efficient and accurate tool for the field of geoscience.
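A generic CNN-plus-transformer regressor of the kind described can be sketched as follows; the layer sizes, number of log curves, and synthetic input are assumptions, not the study's actual configuration.

```python
import torch
from torch import nn

# Illustrative CNN-transformer regressor over a depth-ordered log sequence.
class CNNTransformerPorosity(nn.Module):
    def __init__(self, n_logs=5, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        # 1D convolutions extract local features along the depth axis.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_logs, d_model, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2), nn.ReLU(),
        )
        # A transformer encoder relates features across different depths.
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)        # porosity per depth sample

    def forward(self, logs):                     # logs: (batch, n_logs, depth)
        feats = self.cnn(logs)                   # (batch, d_model, depth)
        feats = feats.transpose(1, 2)            # (batch, depth, d_model)
        feats = self.encoder(feats)
        return self.head(feats).squeeze(-1)      # (batch, depth)

# Example: 8 wells, 5 log curves, 256 depth samples each.
model = CNNTransformerPorosity()
print(model(torch.randn(8, 5, 256)).shape)       # torch.Size([8, 256])
```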
Physics of Fluids, Journal Year: 2025, Volume and Issue: 37(2). Published: Feb. 1, 2025.
Reduced-order modeling of fluid flows has been an active area of research. It approximates the evolution of physical systems in time in terms of coherent patterns and structures, with models that generally consist of a dimensionality reduction mechanism and a dynamical model in the reduced state space. This paper proposes a deep learning-based reduced-order model composed of β-variational autoencoder, multilayer perceptron, and transformer architectures for problems governed by parameterized convection-dominated partial differential equations. In our approach, the autoencoder is utilized as the dimensionality reduction mechanism, the transformer is trained to predict the future states of the system, and the perceptron is applied to learn the relationship between different parameter values and the latent space representations. Therefore, the evolution of the system can be obtained efficiently in the online phase. The proposed method is tested on several benchmark equations, such as the Burgers' equation, a traffic flow problem, the shallow water equations, and the Navier–Stokes equations. The results demonstrate the applicability and effectiveness of the proposed method.
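As a minimal sketch of the β-variational-autoencoder ingredient, the loss below balances reconstruction error against a KL term weighted by β; the linear encoder/decoder, latent size, and β value are illustrative assumptions, not the paper's architecture.

```python
import torch
from torch import nn

def beta_vae_loss(x, x_hat, mu, logvar, beta=4.0):
    """Reconstruction error plus beta-weighted KL divergence to a unit Gaussian."""
    recon = nn.functional.mse_loss(x_hat, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + beta * kl

encoder = nn.Linear(128, 2 * 8)   # outputs mean and log-variance of an 8-d latent
decoder = nn.Linear(8, 128)

x = torch.randn(32, 128)          # snapshots of a discretized field
mu, logvar = encoder(x).chunk(2, dim=-1)
z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
print(beta_vae_loss(x, decoder(z), mu, logvar))
```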
Computer Methods in Applied Mechanics and Engineering, Journal Year: 2024, Volume and Issue: 426, P. 116983. Published: April 13, 2024.
Developing fast surrogates for Partial Differential Equations (PDEs) will accelerate design and optimization in almost all scientific and engineering applications. Neural networks have been receiving ever-increasing attention and have demonstrated remarkable success in the computational modeling of PDEs; however, their prediction accuracy is not yet at the level required for full deployment. In this work, we utilize the transformer architecture, the backbone of numerous state-of-the-art AI models, to learn the dynamics of physical systems as the mixing of spatial patterns learned by a convolutional autoencoder. Moreover, we incorporate the idea of multi-scale hierarchical time-stepping to increase the prediction speed and decrease the accumulated error over time. Our model achieves similar or better results in predicting the time-evolution of the Navier–Stokes equations compared to the powerful Fourier Neural Operator (FNO) and two transformer-based neural operators, OFormer and the Galerkin Transformer. The code and data are available on https://github.com/BaratiLab/MST_PDE.
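The multi-scale hierarchical time-stepping idea can be illustrated with a toy rollout that composes coarse and fine steps so that fewer model calls, and hence less accumulated error, are needed over a long horizon; the stand-in step functions below are assumptions, not the released MST_PDE models.

```python
# Toy hierarchical time-stepping: coarse and fine "models" advance a scalar
# latent state, and the rollout uses the largest admissible step first.

def step_coarse(z):   # stand-in for a model trained to jump 8 time units
    return z + 8.0

def step_fine(z):     # stand-in for a model trained to jump 1 time unit
    return z + 1.0

def rollout(z, horizon):
    """Advance z by `horizon` steps using as few model calls as possible."""
    calls = 0
    while horizon >= 8:
        z, horizon, calls = step_coarse(z), horizon - 8, calls + 1
    while horizon >= 1:
        z, horizon, calls = step_fine(z), horizon - 1, calls + 1
    return z, calls

state, n_calls = rollout(0.0, horizon=27)
print(state, n_calls)   # reaches 27.0 with 3 coarse + 3 fine calls instead of 27
```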
Machine Learning Science and Technology, Journal Year: 2024, Volume and Issue: 5(1), P. 015032. Published: Feb. 9, 2024.
Abstract
Solving partial differential equations (PDEs) is the core of many fields of science and engineering. While classical approaches are often prohibitively slow, machine learning models often fail to incorporate complete system information. Over the past few years, transformers have had a significant impact on the field of Artificial Intelligence and have seen increased usage in PDE applications. However, despite their success, they currently lack integration with physics and reasoning. This study aims to address this issue by introducing the Physics Informed Token Transformer (PITT). The purpose of PITT is to incorporate the knowledge of physics by embedding the governing PDEs into the learning process. PITT uses an equation tokenization method to learn an analytically-driven numerical update operator. By tokenizing equations and embedding partial derivatives, the transformer models become aware of the underlying knowledge behind physical processes. To demonstrate this, PITT is tested on challenging 1D and 2D neural operator prediction tasks. The results show that PITT outperforms popular neural operator models and has the ability to extract physically relevant information from governing equations.
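A toy version of equation tokenization, assuming a hand-made vocabulary rather than PITT's actual scheme, might look like this: the governing-equation string is split into symbol tokens (including derivative operators) and mapped to integer ids that a transformer could embed.

```python
# Hypothetical equation tokenizer for illustration only.
VOCAB = ["<pad>", "u", "t", "x", "nu", "d/dt", "d/dx", "d2/dx2",
         "(", ")", "+", "-", "*", "="]
TOKEN_TO_ID = {tok: i for i, tok in enumerate(VOCAB)}

def tokenize(equation: str) -> list[int]:
    """Turn a whitespace-separated equation string into integer token ids."""
    return [TOKEN_TO_ID[tok] for tok in equation.split()]

# Example: viscous Burgers' equation written with the toy vocabulary.
burgers = "( d/dt u ) + u * ( d/dx u ) = nu * ( d2/dx2 u )"
print(tokenize(burgers))
```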
Data-Centric Engineering, Journal Year: 2025, Volume and Issue: 6. Published: Jan. 1, 2025.
Abstract
Surrogate models of turbulent diffusive flames could play a strategic role in the design of liquid rocket engine combustion chambers. The present article introduces a method to obtain data-driven surrogate models for coaxial injectors by leveraging an inductive transfer learning strategy over U-Nets with the available multifidelity Large Eddy Simulation (LES) data. The resulting models preserve reasonable accuracy while reducing the offline computational cost of data generation. First, a database of about 100 low-fidelity LES simulations of a shear-coaxial injector operating with gaseous oxygen and methane as propellants has been created. The design of experiments explores three variables: chamber radius, recess length of the oxidizer post, and mixture ratio. Subsequently, U-Nets were trained upon this dataset to provide approximations of the temporal-averaged two-dimensional flow field. Despite the fact that neural networks are efficient non-linear data emulators, in purely data-driven approaches their quality is directly impacted by the precision of the data they are trained upon. Thus, a high-fidelity (HF) database was also created, made of 10 simulations with a much greater computational cost per sample. The amalgamation of low-fidelity and HF data during the transfer-learning process enables an improvement of the model's fidelity without excessive additional cost.
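A minimal two-level U-Net, with illustrative channel counts and input size rather than the article's architecture, shows the encoder-decoder-with-skip structure such field surrogates rely on.

```python
import torch
from torch import nn

# Tiny two-level U-Net sketch mapping an input field (e.g., rasterized design
# parameters) to a predicted time-averaged 2D field. All sizes are assumptions.
class TinyUNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1, width=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU())
        self.down = nn.Sequential(nn.MaxPool2d(2),
                                  nn.Conv2d(width, 2 * width, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(2 * width, width, 2, stride=2)
        self.dec = nn.Sequential(nn.Conv2d(2 * width, width, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(width, out_ch, 3, padding=1))

    def forward(self, x):
        e = self.enc(x)                             # full-resolution features
        b = self.down(e)                            # bottleneck at half resolution
        u = self.up(b)                              # back to full resolution
        return self.dec(torch.cat([u, e], dim=1))   # skip connection

model = TinyUNet()
print(model(torch.randn(2, 1, 64, 64)).shape)       # torch.Size([2, 1, 64, 64])
```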