Biomedical Engineering and Computational Biology,
Journal Year:
2025,
Volume and Issue:
16
Published: Feb. 1, 2025
This work presents an enhanced identification procedure utilising bioinformatics data, employing optimisation techniques to tackle crucial difficulties in healthcare operations. An essential system model is designed by analysing major contributions, including risk factors, data integration and interpretation, error rates, and wastage gain. Furthermore, all aspects are integrated with deep learning optimisation, encompassing normalisation and hybrid methodologies to efficiently manage large-scale data, resulting in personalised solutions. The implementation of the suggested technology in real time addresses a significant disparity between data-driven applications, hence facilitating seamless genetic insights. The contributions are illustrated over time, with results presented through simulation experiments covering 4 scenarios and 2 case studies. Consequently, the comparison research reveals that the efficacy for enhancing routes stands at 7%, while complexity diminishes by 1%, thereby indicating that healthcare operations can be transformed through computational biology.
Scientific Reports,
Journal Year:
2025,
Volume and Issue:
15(1)
Published: Jan. 28, 2025
The Internet of Things (IoT) is one of the most important emerging technologies supporting the Metaverse integration process, by enabling smooth data transfer between physical and virtual domains. Integrating sensor devices, wearables, and smart gadgets into the environment enables the IoT to deepen interactions and enhance immersion, both crucial for a completely integrated, data-driven Metaverse. Nevertheless, because these devices are often built with minimal hardware and connected to the Internet, they are highly susceptible to different types of cyberattacks, presenting a significant security problem for maintaining a secure infrastructure. Conventional techniques have difficulty countering these evolving threats, highlighting the need for adaptive solutions powered by artificial intelligence (AI). This work seeks to improve trust in edge-integrated environments; the study revolves around a hybrid framework that combines convolutional neural networks (CNN) with machine learning (ML) classification models, such as categorical boosting (CatBoost) and light gradient-boosting machine (LightGBM), further optimized through metaheuristic optimizers to leverage performance. A two-level architecture was designed to manage intricate data, covering both detection and classification of attacks within IoT networks.
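As a rough illustration of such a two-stage hybrid, the sketch below (a minimal Python example assuming a small 1D CNN as feature extractor feeding a LightGBM classifier; the synthetic data, layer sizes, and hyperparameters are placeholders, not those of the paper) shows how CNN embeddings can be handed to a gradient-boosted second stage:

```python
# Minimal sketch: CNN feature extractor followed by a gradient-boosted classifier.
# Hypothetical shapes and hyperparameters; the paper's exact architecture is not reproduced.
import numpy as np
import torch
import torch.nn as nn
from lightgbm import LGBMClassifier

n_samples, n_features, n_classes = 2000, 64, 5
X = np.random.rand(n_samples, n_features).astype(np.float32)   # stand-in for network flow features
y = np.random.randint(0, n_classes, size=n_samples)            # stand-in attack-class labels

class CNNExtractor(nn.Module):
    """1D CNN that maps raw feature vectors to a compact embedding."""
    def __init__(self, n_features, emb_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, emb_dim),
        )

    def forward(self, x):                 # x: (batch, n_features)
        return self.net(x.unsqueeze(1))   # add a channel dimension for Conv1d

extractor = CNNExtractor(n_features)
with torch.no_grad():                     # in practice the CNN would be trained first
    embeddings = extractor(torch.from_numpy(X)).numpy()

# Second stage: LightGBM on the CNN embeddings (CatBoost could be swapped in the same way).
clf = LGBMClassifier(n_estimators=200, learning_rate=0.05)
clf.fit(embeddings, y)
print("train accuracy:", clf.score(embeddings, y))
```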
A thorough analysis utilizing a real-world network dataset validates the proposed architecture's efficacy in identifying specific variants of malevolent assaults, a classic multi-class challenge. Three experiments were executed on open public data, where the top models attained a supreme accuracy of 99.83% in classification. Additionally, explainable AI methods offered valuable supplementary insights into the model's decision-making, supporting future data collection efforts and enhancing detection systems.
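As one hedged illustration of such explainability analysis (using permutation importance from scikit-learn on synthetic data as a simple stand-in for the XAI methods mentioned in the abstract), feature contributions to a trained classifier can be ranked like this:

```python
# Rank features by how much shuffling each one degrades held-out accuracy.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((1000, 20))                       # stand-in network features
y = (X[:, 3] + 0.5 * X[:, 7] > 0.9).astype(int)  # synthetic label with two informative features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for idx in np.argsort(result.importances_mean)[::-1][:5]:
    print(f"feature {idx}: importance {result.importances_mean[idx]:.3f}")
```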
Scientific Reports,
Journal Year:
2025,
Volume and Issue:
15(1)
Published: Feb. 3, 2025
In the present scenario, Internet of Things (IoT) and edge-computing technologies have been developing rapidly, leading to the development of new challenges in security and privacy. Personal information and privacy leakage have become main concerns in IoT surroundings. The rapidly growing IoT-connected devices under an integrated Machine Learning (ML) method might threaten data confidentiality. Standard centralized ML-assisted methods are challenging because they require vast amounts of data to be gathered at a central unit. Due to the rising distribution of many systems and linked devices, decentralized ML solutions are required. Federated learning (FL) was proposed as an optimal solution to address these issues. Still, the heterogeneity of environments poses an essential challenge when executing FL. Therefore, this paper develops an Intelligent Deep Federated Learning Model for Enhancing Security (IDFLM-ES) approach in an IoT-enabled edge-computing environment. The presented IDFLM-ES aims to identify unwanted intrusions and certify safety. To accomplish this, the technique introduces a federated hybrid deep belief network (FHDBN) model using FL on time-series data produced by IoT devices. Besides, it uses normalization and golden jackal optimization (GJO) based feature selection as a pre-processing step. The FHDBN learns an individual distributed representation over the databases to enhance convergence and quick learning. Finally, the dung beetle optimizer (DBO) is utilized to choose effectual hyperparameters for the FHDBN model.
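For orientation, the federated-averaging idea behind such FL training can be sketched as below (pure NumPy with logistic-regression clients on synthetic, non-identical data; the FHDBN, GJO, and DBO components of the paper are not reproduced here):

```python
# Minimal FedAvg sketch: each client trains locally, the server averages weights.
# Logistic regression stands in for the local model; all data is synthetic.
import numpy as np

rng = np.random.default_rng(42)
n_clients, n_features, rounds, local_steps, lr = 5, 10, 20, 25, 0.5

# Synthetic, non-identical client datasets (heterogeneity is the FL challenge noted above).
true_w = rng.normal(size=n_features)
clients = []
for _ in range(n_clients):
    X = rng.normal(loc=rng.normal(), size=(200, n_features))
    y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)
    clients.append((X, y))

def local_update(w, X, y):
    """A few steps of gradient descent on one client's local data."""
    for _ in range(local_steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))        # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)       # logistic-loss gradient step
    return w

w_global = np.zeros(n_features)
for _ in range(rounds):
    # Server broadcasts w_global; clients train locally; server averages the results.
    w_global = np.mean([local_update(w_global.copy(), X, y) for X, y in clients], axis=0)

acc = np.mean([
    (((1 / (1 + np.exp(-(X @ w_global)))) > 0.5) == y).mean() for X, y in clients
])
print(f"average client accuracy after {rounds} rounds: {acc:.3f}")
```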
The simulation value of the methodology is verified on a benchmark database. The experimental validation portrayed a superior accuracy of 98.24% compared with other models.
Engineering Reports,
Journal Year:
2024,
Volume and Issue:
unknown
Published: Nov. 13, 2024
ABSTRACT
In the realm of astrophysical numerical calculations, the demand for enhanced computing power is imperative. The time-consuming nature of these calculations, particularly in the domain of solar convection, poses a significant challenge to astrophysicists seeking to analyze new data efficiently. Because they let different kinds of computations be worked on separately, parallel algorithms are a good way to speed up this kind of work. A large part of this study is about how to use both multi-core computers and GPUs to do the mathematical work on energy at the same time. Cutting down the time it takes is the main goal. This way, data can be looked at more quickly without having to wait for long runs. It works well when things are run in parallel, especially for 3D tasks, which speeds up the work a lot. The results provide proof that it is important to adjust parallelization methods based on the size of the numbers being processed.
But for 2D math, using no more than one core works better. The results not only help fix bugs in the models but also show how the gains change with the gear used for processing.
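The trade-off described above, where parallelism pays off for large 3D workloads but not necessarily for small 2D ones, can be illustrated with a toy multiprocessing sketch (the smoothing kernel and grid sizes are illustrative, not the paper's solar-convection code):

```python
# Toy illustration of the parallelization trade-off: splitting work across
# processes helps for a large 3D grid, while overhead can dominate for small 2D work.
import time
import numpy as np
from multiprocessing import Pool

def smooth_chunk(chunk):
    """Cheap stand-in for a numerical kernel applied to one slab of the grid."""
    out = chunk.copy()
    for _ in range(20):                      # repeat to make the work non-trivial
        out = (out + np.roll(out, 1, axis=-1) + np.roll(out, -1, axis=-1)) / 3.0
    return out

def run(grid, n_workers):
    chunks = np.array_split(grid, n_workers, axis=0)   # split along the first axis
    start = time.perf_counter()
    if n_workers == 1:
        [smooth_chunk(c) for c in chunks]
    else:
        with Pool(n_workers) as pool:
            pool.map(smooth_chunk, chunks)
    return time.perf_counter() - start

if __name__ == "__main__":
    grid_3d = np.random.rand(128, 128, 128)   # "large 3D" case
    grid_2d = np.random.rand(64, 64)          # "small 2D" case
    for name, grid in [("3D", grid_3d), ("2D", grid_2d)]:
        t1, t4 = run(grid, 1), run(grid, 4)
        print(f"{name}: 1 worker {t1:.3f}s, 4 workers {t4:.3f}s")
```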
Future Healthcare Journal,
Journal Year:
2024,
Volume and Issue:
11(3), P. 100182 - 100182
Published: Sept. 1, 2024
The presence of artificial intelligence (AI) in healthcare is a powerful and game-changing force that is completely transforming the industry as a whole. Using sophisticated algorithms and data analytics, AI offers unparalleled prospects for improving patient care, streamlining operational efficiency, and fostering innovation across the healthcare ecosystem. This study conducts a comprehensive bibliometric analysis of research on AI in healthcare, utilising the SCOPUS database as the primary source.