A Smart Irrigation System Using the IoT and Advanced Machine Learning Model
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2024, Volume and Issue: 10(4), Published: Nov. 26, 2024
The rapid advancement of IoT (Internet of Things) technologies and sophisticated machine learning models is driving innovation in irrigation systems, laying the foundation for more effective and eco-friendly smart agricultural procedures. This systematic literature review strives to uncover the advancements and challenges in the implementation of IoT-based irrigation systems integrated with advanced machine learning techniques. By analyzing 43 relevant studies published between 2017 and 2024, the research focuses on how these systems have evolved to meet the demands of modern agriculture through capabilities such as predictive analytics, anomaly detection, and adaptive control that enhance precision and decision-making processes. Employing the PRISMA methodology, the review uncovers the strengths and limitations of current systems, highlighting significant achievements in real-time data utilization and system responsiveness. However, it also brings attention to unresolved issues, including the complexities of integration, network reliability, and the scalability of frameworks. Additionally, the study identifies crucial gaps in standardization and the need for flexible solutions that can adapt to diverse environmental conditions. By offering a comprehensive analysis, the review provides key insights for advancing smart irrigation technologies, emphasizing the importance of continued research in overcoming existing barriers to wider adoption and effectiveness in various settings.
Language: English
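To make the reviewed capabilities concrete, the following minimal sketch pairs a rolling z-score anomaly check with a simple threshold valve policy for soil-moisture telemetry; the window size, thresholds, and simulated readings are assumed values, not drawn from any of the 43 reviewed studies:

```python
# Minimal sketch (assumed values throughout): a rolling z-score anomaly check
# plus a simple moisture threshold for an irrigation valve.
from collections import deque
import statistics

WINDOW = 12           # number of recent readings kept for statistics (assumed)
Z_LIMIT = 3.0         # z-score beyond which a reading is flagged as anomalous
DRY_THRESHOLD = 30.0  # soil-moisture % below which irrigation opens (assumed)

history = deque(maxlen=WINDOW)

def process_reading(moisture_pct: float) -> str:
    """Classify one soil-moisture reading and decide a valve action."""
    if len(history) >= 3:
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1e-9
        if abs(moisture_pct - mean) / stdev > Z_LIMIT:
            return "anomaly: ignore reading, keep valve state"
    history.append(moisture_pct)
    return "open valve" if moisture_pct < DRY_THRESHOLD else "close valve"

# Simulated telemetry: a dry spell, then a sensor glitch (200%), then recovery.
for value in [45, 42, 38, 33, 29, 27, 200, 31, 36]:
    print(f"moisture={value:5.1f}% -> {process_reading(value)}")
```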
CoralMatrix: A Scalable and Robust Secure Framework for Enhancing IoT Cybersecurity
Srikanth Reddy Vutukuru, Srinivasa Chakravarthi Lade
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2025, Volume and Issue: 11(1), Published: Jan. 7, 2025
In the current age of digital transformation, the Internet of Things (IoT) has revolutionized everyday objects, and IoT gateways play a critical role in managing data flow within these networks. However, the dynamic and extensive nature of IoT networks presents significant cybersecurity challenges that necessitate the development of adaptive security systems to protect against evolving threats. This paper proposes the CoralMatrix Security framework, a novel approach that employs advanced machine learning algorithms. The framework incorporates the AdaptiNet Intelligence Model, which integrates deep reinforcement learning for effective real-time threat detection and response. To comprehensively evaluate the performance of the framework, this study utilized the N-BaIoT dataset, facilitating a quantitative analysis that provided valuable insights into the model's capabilities. The results demonstrate the framework's robustness across various dimensions of cybersecurity. Notably, it achieved a high accuracy rate of approximately 83.33%, highlighting its effectiveness in identifying and responding to threats in real time. Additionally, the research examined the framework's scalability, adaptability, and resource efficiency against diverse cyber-attack types, all of which were quantitatively assessed to provide a comprehensive understanding of its behavior. The study suggests future work to optimize the framework for larger networks, adapt it to continuously emerging threats, and expand its application scenarios. With the proposed algorithms, CoralMatrix emerged as a promising, efficient, effective, and scalable solution for IoT cybersecurity.
Language: English
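The AdaptiNet model itself is not published in this abstract, so the sketch below is only a toy tabular stand-in for the deep reinforcement-learning component: a Q-learning loop that learns a gateway response policy. The states, actions, and reward scheme are invented for illustration.

```python
# Toy sketch (not the AdaptiNet model): tabular Q-learning for choosing a
# gateway response to a classified traffic state. States, actions, and the
# reward scheme are illustrative assumptions, not from the paper.
import random

STATES = ["benign", "suspicious", "attack"]
ACTIONS = ["allow", "throttle", "block"]
CORRECT = {"benign": "allow", "suspicious": "throttle", "attack": "block"}

q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, epsilon = 0.1, 0.2

for _ in range(5000):
    state = random.choice(STATES)                      # simulated detector output
    if random.random() < epsilon:                      # explore
        action = random.choice(ACTIONS)
    else:                                              # exploit
        action = max(ACTIONS, key=lambda a: q[(state, a)])
    reward = 1.0 if action == CORRECT[state] else -1.0
    # One-step (bandit-style) Q update; no next-state term, for simplicity.
    q[(state, action)] += alpha * (reward - q[(state, action)])

for s in STATES:
    best = max(ACTIONS, key=lambda a: q[(s, a)])
    print(f"{s:10s} -> learned response: {best}")
```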
Adaptive Computational Intelligence Algorithms for Efficient Resource Management in Smart Systems
R. Logesh Babu, K. Tamilselvan, N. Purandhar et al.
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2025, Volume and Issue: 11(1), Published: Jan. 9, 2025
The rapid evolution of smart systems, including Internet of Things (IoT) devices, smart grids, and autonomous vehicles, has led to the need for efficient resource management to optimize performance, reduce energy consumption, and enhance system reliability. This paper presents adaptive computational intelligence (CI) algorithms as an effective solution for addressing the dynamic resource-management challenges in such systems. Specifically, we explore the application of techniques such as fuzzy logic, genetic algorithms, particle swarm optimization, and neural networks to adaptively manage resources like energy, bandwidth, processing power, and storage in real time. These CI algorithms offer robust decision-making capabilities, enabling systems to efficiently allocate resources based on environmental changes, workload demands, and user preferences. The paper discusses the integration of these algorithms with real-time data acquisition, providing a framework for scalable resource management. Additionally, we evaluate their performance in various environments, highlighting their ability to increase efficiency, reduce operational costs, and improve the overall user experience. The proposed approach demonstrates significant improvements over traditional resource-management techniques, making it a promising solution for next-generation smart systems.
Language: English
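Of the CI techniques the paper names, particle swarm optimization is the most compact to demonstrate. The sketch below is a minimal PSO that splits a fixed bandwidth budget across three services; the demands, budget, and cost function are assumed values, not the paper's formulation.

```python
# Minimal PSO sketch (illustrative assumptions, not the paper's algorithm):
# allocate a fixed bandwidth budget across three services so that a quadratic
# shortfall cost against their demands is minimized.
import random

DEMAND = [30.0, 50.0, 20.0]   # desired bandwidth per service (Mbps, assumed)
BUDGET = 80.0                 # total bandwidth available (Mbps, assumed)

def cost(x):
    """Penalize shortfall against demand; allocations are rescaled to the budget."""
    total = sum(x) or 1e-9
    alloc = [BUDGET * xi / total for xi in x]
    return sum((d - a) ** 2 for d, a in zip(DEMAND, alloc))

N, STEPS, W, C1, C2 = 20, 200, 0.7, 1.5, 1.5
pos = [[random.uniform(0, 1) for _ in DEMAND] for _ in range(N)]
vel = [[0.0] * len(DEMAND) for _ in range(N)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=cost)

for _ in range(STEPS):
    for i in range(N):
        for d in range(len(DEMAND)):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=cost)

total = sum(gbest)
print([round(BUDGET * x / total, 1) for x in gbest], "Mbps per service")
```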
AI-Driven Cybersecurity: Enhancing Threat Detection and Mitigation with Deep Learning
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2025, Volume and Issue: 11(2), Published: March 23, 2025
AI-driven cybersecurity has emerged as a transformative solution for combating increasingly sophisticated cyber threats. This research proposes an advanced deep learning-based framework aimed at enhancing threat detection and mitigation performance. Leveraging Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) architectures, the proposed model effectively identifies anomalies and classifies potential threats with high accuracy and minimal false positives. The model was rigorously evaluated using real-time network traffic datasets, demonstrating a notable increase in detection accuracy of 18.5%, reaching 97.4% compared to traditional machine learning methods (78.6%). Additionally, response time was significantly reduced by 25%, while computational overhead decreased by 30%, improving overall system responsiveness. Experimental results further show a 40% reduction in downtime incidents due to faster threat identification. The proposed proactive approach thus provides substantial improvements across security performance metrics, underscoring its robustness in dynamic threat landscapes.
Language: English
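The abstract names the CNN and LSTM building blocks but not the exact architecture. The following is a minimal Keras sketch of one plausible CNN + LSTM hybrid over windows of network-traffic features; the sequence length, feature count, and layer sizes are assumptions.

```python
# Minimal sketch of a CNN + LSTM hybrid of the kind the abstract describes,
# written with Keras. Input shape and layer sizes are assumed for illustration;
# the paper's exact architecture is not given.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN, N_FEATURES, N_CLASSES = 50, 20, 2   # assumed input shape / labels

model = keras.Sequential([
    layers.Input(shape=(SEQ_LEN, N_FEATURES)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),  # local traffic patterns
    layers.MaxPooling1D(2),
    layers.LSTM(64),                                      # temporal dependencies
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),        # benign vs. malicious
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data, just to show the expected shapes.
x = np.random.rand(128, SEQ_LEN, N_FEATURES).astype("float32")
y = np.random.randint(0, N_CLASSES, size=(128,))
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(x[:1]).shape)   # -> (1, 2) class probabilities
```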
Innovative Computational Intelligence Frameworks for Complex Problem Solving and Optimization
N. Ramesh Babu, Vidya Kamma, R. Logesh Babu et al.
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2025, Volume and Issue: 11(1), Published: Jan. 9, 2025
The rapid advancement of computational intelligence (CI) techniques has enabled the development of highly efficient frameworks for solving complex optimization problems across various domains, including engineering, healthcare, and industrial systems. This paper presents innovative frameworks that integrate advanced algorithms such as Quantum-Inspired Evolutionary Algorithms (QIEA), Hybrid Metaheuristics, and Deep Learning-based models. These frameworks aim to address optimization challenges by improving convergence rates, solution accuracy, and computational efficiency. In the healthcare context, a framework was successfully used to predict optimal treatment plans for cancer patients, achieving a 92% accuracy rate in classification tasks. The proposed frameworks demonstrate their potential in addressing a broad spectrum of problems, from resource allocation in smart grids to dynamic scheduling in manufacturing. The integration of cutting-edge CI methods offers a promising future for optimizing performance in real-world applications across a wide range of industries.
Language: English
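As a worked illustration of the quantum-inspired evolutionary idea the paper cites, the sketch below runs a minimal QIEA on the OneMax toy problem: each gene is a qubit-like probability amplitude that is "observed" into a bit and then rotated toward the best solution found. The rotation step size and the problem choice are illustrative assumptions.

```python
# Minimal QIEA sketch on OneMax (maximize the number of 1-bits). The rotation
# step and problem are illustrative assumptions, not the paper's setup.
import math, random

N_BITS, POP, GENS, DELTA = 16, 10, 60, 0.05 * math.pi

# Each gene holds an angle encoding p(1) = sin(theta)^2; start unbiased at 0.5.
theta = [[math.pi / 4] * N_BITS for _ in range(POP)]

def observe(angles):
    """Collapse qubit-like genes into a concrete bit string."""
    return [1 if random.random() < math.sin(t) ** 2 else 0 for t in angles]

def fitness(bits):
    return sum(bits)

best = None
for _ in range(GENS):
    pop = [observe(t) for t in theta]
    gen_best = max(pop, key=fitness)
    if best is None or fitness(gen_best) > fitness(best):
        best = gen_best
    # Rotate each gene's angle toward the best solution's bit value.
    for t in theta:
        for i in range(N_BITS):
            target = math.pi / 2 if best[i] == 1 else 0.0
            t[i] += DELTA if target > t[i] else -DELTA

print("best fitness:", fitness(best), "of", N_BITS)
```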
Metaheuristic-Driven Optimization for Efficient Resource Allocation in Cloud Environments
M. Revathi, K. Manju, B. Chitradevi et al.
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2025, Volume and Issue: 11(1), Published: Jan. 7, 2025
Intrusion Detection Systems (IDS) play a pivotal role in safeguarding networks against evolving cyber threats. This research focuses on enhancing the performance of IDS using deep learning models, specifically XAI, LSTM, CNN, and GRU, evaluated on the NSL-KDD dataset. The dataset addresses the limitations of earlier benchmarks by eliminating redundancies and balancing classes. A robust preprocessing pipeline, including normalization, one-hot encoding, and feature selection, was employed to optimize model inputs. Performance metrics such as Precision, Recall, F1-Score, and Accuracy were used to evaluate the models across five attack categories: DoS, Probe, R2L, U2R, and Normal. Results indicate that the XAI model consistently outperformed the other models, achieving the highest accuracy (91.2%) and Precision (91.5%) post-BAT optimization. Comparative analyses of confusion matrices and protocol distributions revealed the dominance of DoS attacks and highlighted specific challenges with the R2L and U2R categories. The study demonstrates the effectiveness of optimized deep learning models in detecting complex attacks, paving the way for adaptive IDS solutions.
Language: English
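The preprocessing pipeline the abstract lists (normalization, one-hot encoding, feature selection) maps naturally onto scikit-learn. The sketch below shows that pipeline on a toy stand-in for NSL-KDD rows; the column names and data are assumptions, since the real dataset has 41 features.

```python
# Sketch of an NSL-KDD-style preprocessing pipeline with scikit-learn.
# The columns and rows here are toy stand-ins, not the real dataset.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import MinMaxScaler, OneHotEncoder
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import Pipeline

# Tiny stand-in for NSL-KDD rows: two numeric and one categorical feature.
df = pd.DataFrame({
    "duration":  [0, 12, 3, 45, 1, 0],
    "src_bytes": [181, 239, 5, 10_000, 44, 0],
    "protocol":  ["tcp", "udp", "tcp", "tcp", "icmp", "udp"],
    "label":     [0, 0, 0, 1, 1, 1],   # 0 = normal, 1 = attack
})
X, y = df.drop(columns="label"), df["label"]

pre = ColumnTransformer([
    ("num", MinMaxScaler(), ["duration", "src_bytes"]),       # normalization
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["protocol"]),
])
pipe = Pipeline([
    ("prep", pre),
    ("select", SelectKBest(chi2, k=3)),                       # feature selection
])
X_ready = pipe.fit_transform(X, y)
print(X_ready.shape)   # rows x 3 selected features, ready for an LSTM/CNN/GRU
```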
A novel optimized deep learning based intrusion detection framework for an IoT networks
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2024, Volume and Issue: 10(4), Published: Nov. 26, 2024
The burgeoning importance of the Internet of Things (IoT) and its diverse applications have sparked significant interest in research circles. The inherent diversity within IoT networks renders them suitable for a myriad of real-time applications, firmly embedding them into the fabric of daily life. While IoT devices streamline various activities, their susceptibility to security threats is a glaring concern. Current inadequacies in security measures render these networks vulnerable, presenting an enticing target for attackers. This study suggests a novel approach to address this challenge through the execution of Intrusion Detection Systems (IDS) leveraging superior deep learning models. Inspired by the benefits of Long Short-Term Memory (LSTM), we introduce the Genetic Bee LSTM (GBLSTM) for the development of an intelligent IDS capable of detecting a wide range of cyber-attacks targeting this area. The methodology comprises four key units of execution: (i) a collection unit for profiling normal device behavior, (ii) identification of malicious behavior during an attack, and (iii) prediction of the attack types implemented in the network. Intensive experimentations of the suggested models are conducted using validation methods and prominent metrics across different threat scenarios. Moreover, comprehensive experiments evaluate the models alongside existing approaches. The results demonstrate that GBLSTM models outperform other intelligent models in terms of accuracy, precision, and recall, underscoring their efficacy in securing IoT networks.
Language: English
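The Genetic Bee LSTM itself is not specified in this abstract, so the sketch below only illustrates the general pattern of evolving LSTM hyperparameters with a genetic loop; the search space, selection scheme, and the stand-in fitness function are all assumptions, and a real run would train the IDS and return validation accuracy.

```python
# Illustrative sketch only (not the paper's GBLSTM): a genetic loop evolving
# LSTM hyperparameters. The fitness function is a stand-in; in practice it
# would train an LSTM IDS and return validation accuracy.
import random

UNITS = [16, 32, 64, 128]        # candidate LSTM layer widths (assumed)
LRS = [1e-4, 1e-3, 1e-2]         # candidate learning rates (assumed)

def fitness(genome):
    """Stand-in for 'train LSTM with these settings, return val accuracy'."""
    units, lr = genome
    return -abs(units - 64) / 64 - abs(lr - 1e-3) * 100   # peak at (64, 1e-3)

def random_genome():
    return (random.choice(UNITS), random.choice(LRS))

pop = [random_genome() for _ in range(8)]
for _ in range(20):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:4]                      # truncation selection
    children = []
    while len(children) < 4:
        a, b = random.sample(parents, 2)
        child = (a[0], b[1])               # one-point crossover
        if random.random() < 0.2:          # mutation
            child = random_genome()
        children.append(child)
    pop = parents + children

print("best hyperparameters:", max(pop, key=fitness))
```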
Remote Monitoring and Early Detection of Labor Progress Using IoT-Enabled Smart Health Systems for Rural Healthcare Accessibility
D. Jayasutha
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2024, Volume and Issue: 10(4), Published: Nov. 25, 2024
Delayed detection of labor pain in pregnant women, especially during their first delivery, often leads to delays in reaching healthcare facilities, potentially resulting in complications. This research proposes an innovative IoT-enabled system for remote monitoring of labor progress and fetal health, designed specifically to address the needs of women in areas within a 100 km radius of healthcare facilities. The system includes a wearable device integrated with sensors to detect the onset of labor and continuously monitor the fetal heartbeat. Upon detecting labor pain, the system automatically sends an alert to the medical team, allowing timely intervention. Experimental results demonstrate the system's efficacy, with 99.2% accuracy in labor-pain detection and 98.5% reliability in fetal heartbeat monitoring. The latency of data transmission was measured at an average of 3.2 seconds, ensuring prompt notification of healthcare providers. The proposed solution enhances accessibility to maternal care, reduces complications due to delayed hospital admission, and provides continuous monitoring, even in resource-constrained environments. This innovation bridges the gap in healthcare delivery for underserved regions, offering a practical, cost-effective, and scalable solution.
Language: English
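The abstract reports a 3.2-second average alert latency but does not name a transport. Below is a minimal sketch of the alerting step, assuming a plain HTTPS webhook; the endpoint URL and the thresholds are hypothetical placeholders.

```python
# Illustrative sketch: the abstract does not specify a transport protocol,
# so this assumes a plain HTTPS webhook. URL and thresholds are hypothetical.
import json
import time
import urllib.request

ALERT_URL = "https://hospital.example.org/alerts"   # hypothetical endpoint
PAIN_THRESHOLD = 0.8          # normalized contraction-sensor level (assumed)
NORMAL_FETAL_BPM = range(110, 161)                  # assumed normal range

def send_alert(contraction_level: float, fetal_bpm: int) -> None:
    """Notify the medical team when wearable readings cross thresholds."""
    if contraction_level < PAIN_THRESHOLD and fetal_bpm in NORMAL_FETAL_BPM:
        return                                      # nothing to report
    payload = json.dumps({
        "event": "labor_alert",
        "contraction_level": contraction_level,
        "fetal_bpm": fetal_bpm,
        "ts": time.time(),
    }).encode("utf-8")
    req = urllib.request.Request(
        ALERT_URL, data=payload,
        headers={"Content-Type": "application/json"}, method="POST",
    )
    urllib.request.urlopen(req, timeout=5)          # raises on network failure

send_alert(0.9, 142)   # simulated reading that should trigger an alert
```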
Deep Learning Algorithm Design for Discovery and Dysfunction of Landmines
S. Leelavathy, S. Balakrishnan, M. Manikandan et al.
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2024, Volume and Issue: 10(4), Published: Dec. 21, 2024
Deep Learning is a cutting-edge technology which has a noteworthy impact on real-world applications. The multi-layer neural nets involved in the blueprint of deep learning enable it to deliver a comprehensive decision-making system with a quality of "thinking alike the human cerebrum". Deep learning assumes an essential part in various fields like horticulture, medication, substantial business, and so forth. It can be well prompted in remote sensing applications, especially in perilous military locations where landmines are detected using an algorithm design technique aided by distinctive machine learning tools and techniques. The intelligent system designed by this process involves a massive dataset including assorted features of landmines such as size, sort, dampness, ground profundity, and so on. Incorporation of a Geographical Information System (GIS) gives a prevalent statistical analysis of the varied landmines. The multiple layers present in the schema may increase feature extraction and knowledge representation through the complexities of the landmines' input sets. The likelihood of landmine dysfunction is increased by utilization of the prediction model, which enormously helps the survival of military personnel, creating a social effect.
Language: English
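The abstract describes a classifier over tabular landmine features (size, sort, dampness, ground profundity) rather than images. The sketch below shows a small neural net on synthetic stand-in data; the feature ranges and labeling rule are invented for illustration, and no real landmine dataset is used.

```python
# Illustrative sketch only: a small neural net over tabular landmine features
# like those the abstract lists. The data are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Features: [diameter_cm, metal_content, soil_moisture, depth_cm]  (assumed)
X = rng.uniform([5, 0, 0, 0], [40, 1, 1, 30], size=(n, 4))
# Synthetic rule: shallow, high-metal objects are labeled "mine" (1).
y = ((X[:, 1] > 0.5) & (X[:, 3] < 15)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```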
The Impact of Clinical Parameters on LSTM-based Blood Glucose Estimate in Type 1 Diabetes
Sunandha Rajagopal, N. Thangarasu
International Journal of Computational and Experimental Science and Engineering, Journal Year: 2024, Volume and Issue: 10(4), Published: Dec. 3, 2024
Accurate forecasting of blood sugar levels is essential for managing diabetes, especially Type 1 diabetes, for reducing incidences and diminishing care costs for patients. In this study, a Long Short-Term Memory Recurrent Neural Network (LSTM) model has been employed to predict blood glucose levels using clinical data. The research focuses on identifying and analyzing several key parameters that play a significant role in determining future glucose levels, ensuring a robust and reliable prediction framework. We have considered the following patient-specific features: Insulin-Sensitivity-Factor (ISF), total daily dose (TDD) of insulin, HbA1C, the height and weight of the patient, and age and gender, while the prediction target is blood glucose. The premise is that training LSTM models on a large dataset and studying the most important predictors along with their predictive power would be beneficial. The results indicate that including these parameters improves prediction accuracy and provides valuable information for individuals to control diabetes. This analysis highlights the efficiency of LSTM networks in making use of patient data to improve prediction models, eventually aiding more effective and individualized treatment strategies for Type 1 diabetic (T1D) patients. The work also examines the extent to which each parameter influences prediction, providing deeper insights into their relative impact and significance in the model.
Language: English
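The abstract combines a glucose history with static clinical features (ISF, TDD, HbA1c, height, weight, age, gender). One plausible way to wire that up is a two-input Keras model, sketched below; the architecture and shapes are assumptions, not the paper's.

```python
# Minimal sketch (assumed architecture, not the paper's): an LSTM over recent
# glucose readings combined with static clinical features via a second branch.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN, N_STATIC = 24, 7      # 24 past glucose readings; 7 clinical features

seq_in = keras.Input(shape=(SEQ_LEN, 1), name="glucose_history")
static_in = keras.Input(shape=(N_STATIC,), name="clinical_features")

h = layers.LSTM(32)(seq_in)                      # temporal glucose dynamics
merged = layers.Concatenate()([h, static_in])    # add patient-specific context
hidden = layers.Dense(16, activation="relu")(merged)
out = layers.Dense(1, name="next_glucose")(hidden)

model = keras.Model([seq_in, static_in], out)
model.compile(optimizer="adam", loss="mse")

# Random stand-in data, just to demonstrate the expected shapes.
x_seq = np.random.rand(64, SEQ_LEN, 1).astype("float32")
x_static = np.random.rand(64, N_STATIC).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit([x_seq, x_static], y, epochs=1, verbose=0)
print(model.predict([x_seq[:1], x_static[:1]]).shape)   # -> (1, 1)
```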