IET Networks,
Journal year: 2024,
Issue: 13(4), pp. 301–312
Published: March 12, 2024
Abstract
Industrial IoT (IIoT) applications are widely used in multiple use cases to automate the industrial environment. Industry 4.0 presents challenges in numerous areas, including heterogeneous data, efficient data sensing and collection, real-time processing, and higher request arrival rates, due to the massive amount of data. Building a time-sensitive network that supports voluminous dynamic traffic from IIoT devices is complex. Therefore, the authors provide insights into such networks and propose a strategy for enhanced network management. A multivariate forecasting model that adapts Multivariate Singular Spectrum Analysis is employed in an SDN-based IIoT network. The proposed method considers flow parameters, such as packets sent and received, bytes, source rate, round trip time, jitter, rate and duration, to predict future flows. The experimental results show that the model can forecast effectively by contemplating every possible variation in the observed samples of average load, delay, and inter-packet sending time with improved accuracy. The forecast shows reduced error estimation when compared with existing methods, with a Mean Absolute Percentage Error of 1.64%, a Mean Squared Error of 11.99, and Root Mean Squared Error values of 3.46 and 2.63.
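The abstract does not include the authors' implementation, so the following is only a minimal sketch of how a multivariate singular-spectrum-style forecast of flow parameters could be set up with NumPy. The window length, the number of retained components, the AR-style one-step forecast, and the synthetic flow metrics are illustrative assumptions, not values or formulas from the paper.

# Minimal multivariate SSA-style smoothing and forecast sketch (NumPy only).
# Window length L, rank, and the simple AR one-step forecast are assumptions
# made for illustration; the paper's exact MSSA formulation may differ.
import numpy as np

def mssa_reconstruct(series, L=24, rank=3):
    """series: (T, d) array of flow metrics (e.g. RTT, jitter, source rate).
    Returns a low-rank (smoothed) reconstruction of each series."""
    T, d = series.shape
    K = T - L + 1
    # Stack the trajectory (Hankel) matrices of all d series vertically.
    traj = np.vstack([
        np.column_stack([series[k:k + L, j] for k in range(K)])
        for j in range(d)
    ])                                               # shape (d*L, K)
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # rank-r approximation
    # Diagonal averaging (Hankelization) back to one smoothed series per metric.
    recon = np.zeros_like(series, dtype=float)
    for j in range(d):
        block = low_rank[j * L:(j + 1) * L]          # (L, K)
        counts = np.zeros(T)
        for i in range(L):
            for k in range(K):
                recon[i + k, j] += block[i, k]
                counts[i + k] += 1
        recon[:, j] /= counts
    return recon

def one_step_forecast(series, L=24, rank=3, ar_order=8):
    """Forecast the next sample of every metric from the smoothed signal with
    a least-squares AR fit (an illustrative stand-in for the SSA recurrent
    forecasting formula)."""
    smooth = mssa_reconstruct(series, L, rank)
    preds = []
    for j in range(series.shape[1]):
        x = smooth[:, j]
        X = np.column_stack([x[i:len(x) - ar_order + i] for i in range(ar_order)])
        y = x[ar_order:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        preds.append(x[-ar_order:] @ coef)
    return np.array(preds)

# Example: synthetic flow metrics (rows = time steps; columns = RTT, jitter, rate).
rng = np.random.default_rng(0)
t = np.arange(200)
data = np.column_stack([
    10 + np.sin(t / 12) + 0.1 * rng.standard_normal(200),
    2 + 0.5 * np.sin(t / 6) + 0.05 * rng.standard_normal(200),
    100 + 5 * np.cos(t / 24) + rng.standard_normal(200),
])
print(one_step_forecast(data))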
Neurocomputing,
Journal year: 2023,
Issue: 545, pp. 126327–126327
Published: May 15, 2023
Deep neural networks (DNNs) are currently being deployed as machine learning technology in a wide range of important real-world applications. DNNs consist of a huge number of parameters that require millions of floating-point operations (FLOPs) to be executed in both training and prediction modes. A more effective method is to implement a cloud computing system equipped with centralized servers, data storage sub-systems, and high-speed, high-performance computing capabilities. This paper presents an up-to-date survey of the current state of the art in DNN cloud computing. Various DNN complexities associated with different architectures are presented and discussed alongside the necessities of using cloud computing. We also present an extensive overview of cloud platforms for DNN deployment and discuss them in detail. Moreover, applications already deployed in cloud systems are reviewed to demonstrate the advantages of cloud-based DNNs. The paper emphasizes the challenges of deploying DNNs in the cloud and provides guidance for enhancing new deployments.

The use of sophisticated algorithms has radically altered predictive auto-scaling and upkeep approaches in cloud computing. Recurrent Neural Networks (RNNs), the Prophet Algorithm, K-Means Clustering, and Seasonal Autoregressive Integrated Moving-Average (SARIMA) models all play a role in improving cloud infrastructures, and their interactions are studied here. By capitalizing on their superiority in processing sequential data, RNNs can deduce accurate workload forecasts from past patterns. Concurrently, the Prophet Algorithm records seasonal and annual patterns, which adds depth to the forecasts. By grouping servers into clusters with similar consumption patterns, K-Means Clustering improves resource allocation efficiency and paves the way for precise auto-scaling. SARIMA models capture nuanced seasonal fluctuations, leading to reliable demand forecasts. This work explores the state of the art and future directions of these techniques, illuminating their potential to revolutionize current approaches to cloud management. When these methods are combined, service providers are better able to proactively scale resources, hence reducing the likelihood of bottlenecks and outages. It foresees their subsequent development and widespread adoption in a variety of fields outside cloud computing, such as Internet of Things (IoT) networks and edge infrastructures.
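The abstract names SARIMA as one of the forecasting tools but gives no model settings; the sketch below only illustrates the kind of seasonal workload forecast it describes, using statsmodels. The hourly seasonal period of 24, the (1,1,1)(1,1,1,24) orders, the synthetic CPU-load series, and the 65% scale-out threshold are assumptions for the example.

# Illustrative SARIMA forecast of an hourly CPU-load series (statsmodels).
# Orders, seasonal period and the scale-out threshold are example choices,
# not settings from the surveyed work.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic hourly CPU utilisation with a daily cycle plus noise.
rng = np.random.default_rng(1)
idx = pd.date_range("2024-01-01", periods=24 * 14, freq="h")
load = (50 + 20 * np.sin(2 * np.pi * np.arange(len(idx)) / 24)
        + 3 * rng.standard_normal(len(idx)))
series = pd.Series(load, index=idx)

# Fit SARIMA(1,1,1)(1,1,1,24) and forecast the next 24 hours.
model = SARIMAX(series, order=(1, 1, 1), seasonal_order=(1, 1, 1, 24))
fitted = model.fit(disp=False)
forecast = fitted.forecast(steps=24)
print(forecast.head())

# A provider could scale out when the forecast exceeds a utilisation threshold.
scale_out = forecast > 65
print(int(scale_out.sum()), "of the next 24 hours are predicted above 65% load")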
IEEE Transactions on Parallel and Distributed Systems,
Journal year: 2024,
Issue: 35(3), pp. 499–516
Published: Jan. 23, 2024
Workload prediction plays a crucial role in resource management of large-scale cloud datacenters. Although quite a number of methods/algorithms have been proposed, long-term workload changes are not explicitly identified and considered. Due to shifty user demands, workload re-locations, or other reasons, the "resource usage pattern" of a workload, which is usually stable in a short-term view, may change dynamically over a long-term range. Such dynamic changes cause significant accuracy degradation for existing prediction algorithms. How to handle such changes remains an open and challenging issue. In this paper, we propose Evolution Graph Workload Prediction (EvoGWP), a novel method that can predict long-term changes using a delicately designed graph-based evolution learning algorithm. EvoGWP automatically extracts shapelets to identify resource usage patterns of workloads at a fine-grained level, and predicts changes by considering factors in both temporal and spatial dimensions. We design a two-level importance-based shapelet extraction mechanism to mine new patterns in the temporal dimension, and a graph model to fuse the interference among different workloads in the spatial dimension. By combining the patterns extracted from each single workload, it then predicts workloads with a spatio-temporal GNN-based encoder-decoder model. Experiments on real trace data from Alibaba, Tencent and Google show that EvoGWP improves prediction accuracy by up to 58.6% over state-of-the-art methods. Moreover, it outperforms those methods in terms of convergence. To the best of our knowledge, this is the first work that identifies and accurately predicts long-term workload pattern changes.
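The abstract gives no implementation details for the shapelet step; the sketch below only shows the generic idea of extracting candidate shapelets from a workload trace and describing traces by their minimum distance to each shapelet. The candidate length, stride, and variance-based ranking are illustrative assumptions, not EvoGWP's two-level importance-based mechanism.

# Generic shapelet-style feature extraction for workload traces (NumPy only).
# Candidate length, stride and variance ranking are illustrative choices;
# EvoGWP's actual extraction mechanism is more elaborate.
import numpy as np

def candidate_shapelets(trace, length=16, stride=8):
    """Slide a window over one workload trace and return the subsequences."""
    return np.array([trace[i:i + length]
                     for i in range(0, len(trace) - length + 1, stride)])

def min_distance(trace, shapelet):
    """Smallest z-normalised Euclidean distance between a shapelet and any
    window of the trace."""
    L = len(shapelet)
    s = (shapelet - shapelet.mean()) / (shapelet.std() + 1e-8)
    best = np.inf
    for i in range(len(trace) - L + 1):
        w = trace[i:i + L]
        w = (w - w.mean()) / (w.std() + 1e-8)
        best = min(best, float(np.linalg.norm(w - s)))
    return best

def shapelet_features(traces, n_shapelets=4):
    """Pick the highest-variance candidates from the first trace and describe
    every trace by its minimum distance to each of them."""
    cands = candidate_shapelets(traces[0])
    order = np.argsort(cands.var(axis=1))[::-1][:n_shapelets]
    shapelets = cands[order]
    return np.array([[min_distance(tr, s) for s in shapelets] for tr in traces])

# Example: three synthetic CPU-usage traces with different usage patterns.
rng = np.random.default_rng(2)
t = np.arange(256)
traces = [
    50 + 20 * np.sin(t / 16) + rng.standard_normal(256),        # periodic
    np.clip(30 + 0.2 * t + rng.standard_normal(256), 0, 100),   # growing
    50 + 5 * rng.standard_normal(256),                           # flat / noisy
]
print(shapelet_features(traces))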
Computers & Electrical Engineering,
Journal year: 2024,
Issue: 119, pp. 109506–109506
Published: July 26, 2024
Cloud computing has revolutionized the way businesses and organizations manage their computational workloads. However, the massive data centers that support cloud services consume a lot of energy, making energy sustainability a critical concern. To address this challenge, this article introduces an innovative approach to optimize energy consumption in cloud environments through knowledge acquisition. The proposed method uses the Knowledge Acquisition version of the Gray Wolf Optimizer (KAGWO) algorithm to collect knowledge on the availability and use of renewable energy within data centers, contributing to improved sustainability in cloud computing. KAGWO is introduced to provide a systematic framework for addressing complex optimization problems by integrating global optimization principles, enhancing decision-making processes with fewer configuration parameters. This work conducts a comparative analysis between the Knowledge Acquisition Swarm Intelligence Approach (KASIA) and a Genetic Algorithm (Pittsburgh) to highlight the benefits and advantages of the former. By comparing the performance of KAGWO, Pittsburgh and KASIA in terms of energy sustainability, the study offers valuable insights into the effectiveness of knowledge-acquisition-based algorithms for optimizing energy usage in cloud environments. The results demonstrate that KAGWO outperforms the alternatives, offering more accurate knowledge acquisition capabilities and resulting in enhanced energy sustainability. Overall, KAGWO demonstrates substantial improvements ranging from 0.53% to 5.23% over previous baselines, with particular significance found in the new approach slightly outperforming them in small, medium and large scenarios.
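The article's KAGWO variant is not specified in this abstract; for orientation, the sketch below implements only the standard Gray Wolf Optimizer loop that such a knowledge-acquisition variant would extend, minimising a toy energy-cost function. The population size, iteration budget, bounds, and objective are assumptions for illustration.

# Standard Gray Wolf Optimizer loop (NumPy), shown as the base algorithm that
# a knowledge-acquisition variant such as KAGWO would extend. The objective,
# bounds, population size and iteration budget are illustrative assumptions.
import numpy as np

def gwo(objective, dim, bounds, n_wolves=20, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))

    for t in range(n_iters):
        fitness = np.apply_along_axis(objective, 1, wolves)
        order = np.argsort(fitness)
        alpha, beta, delta = wolves[order[:3]]       # three best wolves
        a = 2 - 2 * t / n_iters                      # decreases from 2 to 0

        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a
                C = 2 * r2
                D = np.abs(C * leader - wolves[i])
                new_pos += leader - A * D            # move toward each leader
            wolves[i] = np.clip(new_pos / 3.0, lo, hi)

    fitness = np.apply_along_axis(objective, 1, wolves)
    best = wolves[np.argmin(fitness)]
    return best, float(objective(best))

# Toy "energy cost" of a resource-allocation vector: quadratic penalty around
# a target utilisation of 0.6 per dimension (purely illustrative).
def energy_cost(x):
    return float(np.sum((x - 0.6) ** 2))

best, cost = gwo(energy_cost, dim=5, bounds=(0.0, 1.0))
print("best allocation:", np.round(best, 3), "cost:", round(cost, 6))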
Transactions on Emerging Telecommunications Technologies,
Journal year: 2025,
Issue: 36(3)
Published: March 1, 2025
ABSTRACT
Workload prediction is a necessary factor in cloud data centers for maintaining the elasticity and scalability of resources. However, the accuracy of workload prediction is very low because of redundancy, noise, and low-quality data in the data center. In this manuscript, Workload Prediction for Cloud Data Centers using a Complex‐Valued Spatio‐Temporal Graph Convolutional Neural Network Optimized with the Gazelle Optimization Algorithm (CVSTGCN‐WLP‐CDC) is proposed. Initially, the input data are collected from two standard datasets, the NASA and Saskatchewan HTTP traces datasets. Then, in preprocessing, a Multi‐Window Savitzky–Golay Filter (MWSGF) is used to remove noise and redundant data. The preprocessed data are fed to the CVSTGCN to predict workload in a dynamic cloud environment. In this work, the Gazelle Optimization Algorithm (GOA) is proposed to enhance the weight and bias parameters. The CVSTGCN‐WLP‐CDC technique is executed and its efficacy is evaluated using several performance metrics, such as accuracy, recall, precision, energy consumption, correlation coefficient, sum error index (SEI), root mean square error (RMSE), mean squared percentage error (MPE), and percentage error (PER). The proposed method provides 23.32%, 28.53% and 24.65% higher accuracy and 22.34%, 25.62% and 22.84% lower error when compared with existing methods, namely the Artificial Intelligence augmented evolutionary approach espoused for workload prediction in cloud data centres architecture (TCNN‐CDC‐WLP), Performance analysis of machine learning centered workload prediction techniques (PA‐BPNN‐CWPC), and Machine learning based workload prediction for effectual utilization of cloud data centers (ARNN‐EU‐CDC), respectively.
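The exact multi-window scheme of the MWSGF is not described in this abstract; the sketch below only illustrates the underlying idea of combining Savitzky–Golay smoothing at several window lengths with SciPy, applied to a synthetic request-rate trace. The window set, polynomial order, and plain averaging of the filtered outputs are assumptions for illustration.

# Illustrative multi-window Savitzky-Golay smoothing of a workload trace
# (SciPy). The window lengths, polynomial order and simple averaging are
# example choices; the paper's MWSGF may differ.
import numpy as np
from scipy.signal import savgol_filter

def multi_window_savgol(x, windows=(7, 15, 31), polyorder=3):
    """Smooth the series with several Savitzky-Golay windows and average the
    results, so both short spikes and slower trends are attenuated."""
    filtered = [savgol_filter(x, window_length=w, polyorder=polyorder)
                for w in windows]
    return np.mean(filtered, axis=0)

# Synthetic requests-per-second trace: daily cycle plus bursty noise.
rng = np.random.default_rng(3)
t = np.arange(24 * 7)                       # one week of hourly samples
rps = 200 + 80 * np.sin(2 * np.pi * t / 24) + 15 * rng.standard_normal(len(t))
smooth = multi_window_savgol(rps)

print("raw std:   ", round(float(rps.std()), 2))
print("smooth std:", round(float(smooth.std()), 2))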