IEEE Open Journal of the Communications Society, 2024, Volume 5, P. 2039 - 2057. Published: Jan. 1, 2024.
The advent of the Internet of Everything and new Ultra-Reliable Low-Latency Communication (URLLC) services has resulted in an exponential growth of data demands at the network's edge. To meet the stringent performance requirements of evolving 5G (and beyond) applications, deploying dedicated resources closer to mobile users is essential. Multi-Access Edge Computing (MEC) is a promising technology for bringing computational resources closer to users. However, the distributed and limited resources of MEC must be effectively optimized to maximize the number of users benefiting from low-latency services in each time slot in highly congested, large-scale, and dynamic wireless network scenarios. In this research, we propose and evaluate a novel Artificial Intelligence-Defined Wireless Networking (AIDWN) approach that builds on conventional Software-Defined Networking (SDN), implementing an AI-defined application plane for offloading and resource allocation in MEC-enabled networks. AIDWN implements a deep reinforcement learning framework with neural networks to dynamically adapt optimal decisions while considering handover, mobility, and coordination challenges in multi-MEC server environments. Compared with recent state-of-the-art proposals, the proposed approach demonstrates a substantial improvement, utilizing more than 90% of resources per time slot across all servers. It also accommodates significantly more users in congested scenarios. We identified various future research directions, highlighting the potential of AIDWN in simplifying the management of next-generation networks.
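The abstract describes a deep reinforcement learning agent that adapts offloading decisions across MEC servers. A minimal tabular Q-learning sketch of that decision loop, assuming a hypothetical three-server setup with illustrative latencies (the paper's agent uses deep neural networks and a far richer state):

```python
import random

# Minimal tabular Q-learning sketch of an MEC offloading decision.
# Server count, latencies, and the latency-based reward are illustrative
# assumptions, not the paper's actual environment.
N_SERVERS = 3
LATENCY = [5.0, 12.0, 30.0]   # hypothetical mean latency (ms) per MEC server

q = [0.0] * N_SERVERS          # one state; one Q-value per server (action)
alpha, eps = 0.1, 0.2          # learning rate, exploration rate

random.seed(0)
for step in range(2000):
    # epsilon-greedy action selection: which server receives the task
    if random.random() < eps:
        a = random.randrange(N_SERVERS)
    else:
        a = max(range(N_SERVERS), key=lambda s: q[s])
    # reward: lower observed latency is better (noisy observation)
    observed = LATENCY[a] + random.gauss(0, 1)
    q[a] += alpha * (-observed - q[a])

best = max(range(N_SERVERS), key=lambda s: q[s])
print(best)  # the agent learns to prefer the lowest-latency server
```

The same epsilon-greedy structure carries over when the Q-table is replaced by a neural network and the state encodes congestion, mobility, and handover context.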
ACM Computing Surveys, 2023, Volume 55(13s), P. 1 - 37. Published: Jan. 11, 2023.
Wireless Powered Mobile Edge Computing (WPMEC) is an integration of Mobile Edge Computing (MEC) and Wireless Power Transfer (WPT) technologies, intended both to improve the computing capabilities of mobile devices and to provide energy compensation for their limited battery capabilities. Generally, energy transmitters, mobile devices, and edge servers form a WPMEC system that realizes a closed loop of sending and collecting energy, as well as offloading and receiving task data. Due to the constraints of time-varying network environments, time-coupled battery energy levels, and the half-duplex character of wireless devices, the joint design of computation offloading and resource allocation solutions in WPMEC systems has become extremely challenging, and a great number of studies have been devoted to it in recent years. In this article, we first introduce the basic model of a WPMEC system. Then, we present the key issues and techniques related to WPMEC. In addition, we summarize the solutions proposed to solve the critical problems in WPMEC networks. Finally, we discuss some research challenges and open issues.
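The half-duplex character mentioned above creates the core trade-off in WPMEC: time spent harvesting energy cannot be spent offloading. A sketch of that "harvest-then-offload" time split under illustrative, assumed parameter values (real WPMEC designs solve this jointly with battery dynamics and multiple devices):

```python
import math

# Harvest-then-offload trade-off in a half-duplex WPMEC slot of length T:
# a longer harvesting phase tau stores more energy, but leaves less time
# T - tau for offloading. All numbers are illustrative assumptions.
T = 1.0                  # slot length (s)
eta = 0.7                # energy-harvesting efficiency
P_tx = 3.0               # energy transmitter power (W)
h_dl, h_ul = 0.5, 0.4    # downlink / uplink channel gains
W, N0 = 1e6, 1e-6        # bandwidth (Hz) and noise power (W)

def offloaded_bits(tau):
    """Bits offloaded when tau seconds of the slot are spent harvesting."""
    energy = eta * P_tx * h_dl * tau        # harvested energy (J)
    p_ul = energy / (T - tau)               # spend all of it while offloading
    return (T - tau) * W * math.log2(1 + p_ul * h_ul / N0)

# Grid search over the time split; the underlying problem is convex and
# solvable in closed form, but a grid keeps the sketch simple.
taus = [i / 1000 for i in range(1, 1000)]
best_tau = max(taus, key=offloaded_bits)
print(round(best_tau, 3))
```

The interior optimum illustrates why naive extremes fail: harvesting the whole slot leaves no time to transmit, while harvesting nothing leaves no energy to transmit with.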
IEEE Communications Surveys & Tutorials, 2023, Volume 25(4), P. 2088 - 2132. Published: Jan. 1, 2023.
The sixth generation (6G) wireless systems are envisioned to enable the paradigm shift from "connected things" to "connected intelligence", featured by ultra-high density, large-scale and dynamic heterogeneity, diversified functional requirements, and machine learning capabilities, which leads to a growing need for highly efficient intelligent algorithms. The classic optimization-based algorithms usually require a precise mathematical model of data links and suffer from poor performance with high computational cost in realistic 6G applications. Based on domain knowledge (e.g., optimization models and theoretical tools), machine learning (ML) stands out as a promising and viable methodology for many complex large-scale optimization problems in 6G, due to its superior performance, computational efficiency, scalability, and generalizability. In this paper, we systematically review the most representative "learning to optimize" techniques in diverse domains of 6G wireless networks by identifying the inherent feature of the underlying optimization problem and investigating the specifically designed ML frameworks from the perspective of optimization. In particular, we will cover algorithm unrolling, learning to branch-and-bound, graph neural networks for structured optimization, deep reinforcement learning for stochastic optimization, end-to-end learning for semantic optimization, as well as federated learning for distributed optimization, which are capable of addressing challenging large-scale problems arising from a variety of crucial scenarios. Through in-depth discussion, we shed light on the excellent performance of ML-based optimization algorithms with respect to classical methods, and provide insightful guidance to develop advanced ML techniques in 6G networks. Neural network design, theoretical tools of different ML methods, implementation issues, as well as challenges and future research directions are also discussed to support the practical use of ML models in 6G networks.
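Algorithm unrolling, the first technique the survey lists, treats a fixed number of iterations of a classic solver as layers of a network whose per-layer parameters are learned from data. A toy sketch on a least-squares problem, using a greedy grid search as a stand-in for backpropagation-based training (problem, grid, and tuning loop are all illustrative assumptions):

```python
# Algorithm unrolling sketch: K gradient-descent iterations become a
# K-layer "network" whose per-layer step sizes are tuned, instead of one
# hand-picked constant step. Real L2O work trains richer layers with
# backpropagation; a greedy grid search stands in for training here.

def unrolled_gd(A, b, steps):
    """Run len(steps) GD layers on f(x) = 0.5*||Ax - b||^2 with diagonal A."""
    x = [0.0] * len(b)
    for t in steps:
        # gradient for diagonal A is a_i * (a_i * x_i - b_i), elementwise
        x = [xi - t * ai * (ai * xi - bi) for xi, ai, bi in zip(x, A, b)]
    return x

def loss(A, b, x):
    return 0.5 * sum((ai * xi - bi) ** 2 for ai, xi, bi in zip(A, x, b))

A = [1.0, 2.0, 10.0]      # ill-conditioned diagonal "channel"
b = [1.0, 1.0, 1.0]
K = 5

# Baseline: classic GD with the safe constant step 1/L, L = max(a_i^2) = 100
baseline = loss(A, b, unrolled_gd(A, b, [0.01] * K))

# "Training": greedy coordinate search over per-layer step sizes
steps = [0.01] * K
grid = [i / 200.0 for i in range(1, 40)]
for _ in range(3):        # a few sweeps over the layers
    for k in range(K):
        steps[k] = min(
            grid,
            key=lambda s: loss(A, b, unrolled_gd(A, b, steps[:k] + [s] + steps[k + 1:])),
        )

tuned = loss(A, b, unrolled_gd(A, b, steps))
print(tuned < baseline)   # tuned per-layer steps beat the fixed 1/L schedule
```

The point mirrors the survey's argument: exploiting the structure of the underlying optimization problem inside a learned, fixed-depth architecture outperforms the one-size-fits-all classical schedule at the same iteration budget.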
IEEE Access, 2023, Volume 11, P. 25329 - 25350. Published: Jan. 1, 2023.
An inflection point in the computing industry is occurring with the implementation of the Internet of Things and 5G communications, which has pushed centralized cloud computing toward the edge, resulting in a paradigm shift in computing. The purpose is to provide computing, network control, and storage that accommodate computationally intense and latency-critical applications at resource-limited endpoints. Edge computing allows devices to offload their overflowing tasks to edge servers. This procedure may completely exploit the server's computational capabilities and efficiently execute operations. However, transferring all tasks to an edge server leads to long processing delays and surprisingly high energy consumption for numerous tasks. Aside from this, unused powerful cloud centers lead to resource waste. Thus, adopting a collaborative scheduling approach based on task properties, optimization targets, and the system status of end devices, edge servers, and cloud centers is critical to successful operation. This paper briefly summarizes the edge computing architecture and its information processing. Meanwhile, typical application scenarios are examined. Resource scheduling techniques are then discussed and compared across four collaboration modes. As part of our survey, we present a thorough overview of the various computation offloading schemes proposed by researchers. Additionally, according to the literature surveyed, we looked into fairness and load balancing indicators in resource scheduling. Finally, open issues, challenges, and future directions have been discussed.
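The per-task decision underlying the collaborative scheduling described above can be sketched as a weighted delay-plus-energy comparison between local execution and offloading. The device, channel, and server numbers below are illustrative assumptions, and the CMOS energy model is one common choice from the offloading literature:

```python
# Per-task "local vs. offload" comparison: offload only when it lowers the
# weighted delay/energy cost. All parameter values are illustrative.

def local_cost(cycles, f_local, kappa=1e-27, w_delay=0.5):
    """Weighted delay + energy for executing on the device CPU."""
    delay = cycles / f_local
    energy = kappa * cycles * f_local ** 2   # common CMOS energy model
    return w_delay * delay + (1 - w_delay) * energy

def offload_cost(bits, cycles, rate, p_tx, f_edge, w_delay=0.5):
    """Weighted delay + energy for uploading the task and running it at the edge."""
    delay = bits / rate + cycles / f_edge    # upload time + edge execution time
    energy = p_tx * bits / rate              # device only pays for transmission
    return w_delay * delay + (1 - w_delay) * energy

# Hypothetical task: 2 Mb of input data, 1e9 CPU cycles of work
bits, cycles = 2e6, 1e9
lc = local_cost(cycles, f_local=1e9)                                # 1 GHz device
oc = offload_cost(bits, cycles, rate=20e6, p_tx=0.5, f_edge=10e9)   # 10 GHz edge server
decision = "offload" if oc < lc else "local"
print(decision)
```

Under these numbers the fast edge server and decent uplink make offloading cheaper; shrink the uplink rate or the task's cycle count and the comparison flips, which is exactly why the paper argues the decision must track task properties and system status.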
IEEE Communications Surveys & Tutorials, 2024, Volume 26(3), P. 2146 - 2175. Published: Jan. 1, 2024.
Deep learning shows immense potential for strengthening the cyber-resilience of renewable energy supply chains. However, research gaps persist in comprehensive benchmarks, real-world model evaluations, and data generation tailored to the domain. This study explores applying state-of-the-art deep learning techniques to secure renewable energy supply chains, drawing insights from over 300 publications. We aim to provide an updated, rigorous analysis of deep learning applications in this field and to guide future research. We systematically review the literature spanning 2020-2023, retrieving relevant articles from major databases. We examine deep learning's role in intrusion/anomaly detection, supply chain cyberattack detection frameworks, security standards, historical attack analysis, risk management strategies, model architectures, and cyber datasets. Our analysis demonstrates that deep learning enables anomaly detection by processing massively distributed data. We highlight crucial design factors, including accuracy, adaptation capability, communication security, and resilience to adversarial threats. Comparing 18 historical attacks informs risk analysis. We also showcase model architectures, evaluating their relative strengths and limitations for these applications. Moreover, our review emphasizes best practices for dataset curation, considering quality, labeling, access efficiency, and governance. Effective integration necessitates tuning guidance and data generation. This multi-dimensional analysis motivates focused efforts on enhancing model explanations, securing communications, continually retraining models, and establishing standardized assessment protocols. Overall, we provide a roadmap for progress in leveraging deep learning's potential.
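The anomaly-detection pipeline the survey highlights follows a fixed shape: fit a model of normal behavior, score new telemetry, and threshold the score. A sketch of that pipeline with a simple statistical score standing in for the deep model (the surveyed systems train autoencoders or sequence models; the sensor data and threshold here are illustrative assumptions):

```python
import math
import random

# Fit-score-threshold anomaly detection pipeline. A z-score stands in for
# a deep model's reconstruction error; data and threshold are illustrative.
random.seed(1)

# "Training": fit on normal telemetry, e.g. inverter power readings (kW)
normal = [random.gauss(50.0, 2.0) for _ in range(500)]
mu = sum(normal) / len(normal)
sigma = math.sqrt(sum((x - mu) ** 2 for x in normal) / len(normal))

def is_anomalous(x, k=4.0):
    """Flag readings more than k standard deviations from the fitted mean."""
    return abs(x - mu) / sigma > k

# "Inference": a typical reading passes, a spoofed reading is flagged
print(is_anomalous(49.5), is_anomalous(90.0))
```

Swapping the z-score for a learned reconstruction error changes the scoring function but not the pipeline, which is why the survey's concerns (continual retraining, adversarial resilience, dataset quality) attach to every stage of this loop.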