Machine Learning and Deep Learning Paradigms: From Techniques to Practical Applications and Research Frontiers
Computers, Journal Year: 2025, Volume and Issue: 14(3), P. 93 - 93
Published: March 6, 2025
Machine learning (ML) and deep learning (DL), subsets of artificial intelligence (AI), are the core technologies that lead significant transformation and innovation in various industries by integrating AI-driven solutions. Understanding ML and DL is essential to logically analyse their applicability and identify their effectiveness in different areas like healthcare, finance, agriculture, manufacturing, and transportation. ML consists of supervised, unsupervised, semi-supervised, and reinforcement learning techniques. On the other hand, DL, a subfield of ML comprising neural networks (NNs), can deal with complicated datasets in the health, autonomous systems, and finance industries. This study presents a holistic view of these technologies, analysing the algorithms and their applications' capacity to address real-world problems. The study investigates the application areas in which these techniques are implemented. Moreover, it highlights the latest trends and possible future avenues for research and development (R&D), which consist of developing hybrid models, generative AI, and incorporating emerging technologies. The study aims to provide a comprehensive perspective on ML and DL and to serve as a reference guide for researchers, industry professionals, practitioners, and policy makers.
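The listing does not include code; as a rough illustration of two of the paradigms the survey enumerates, here is a minimal scikit-learn sketch contrasting supervised classification with unsupervised clustering. The Iris dataset and the model choices are placeholder assumptions, not taken from the paper.

```python
# Minimal sketch: supervised vs. unsupervised learning (illustrative only).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Supervised: learn a mapping from features to known labels.
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("supervised accuracy:", clf.score(X_te, y_te))

# Unsupervised: discover structure without using labels at all.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", [int((km.labels_ == k).sum()) for k in range(3)])
```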
Language: English
A Bibliometric Review of Trends and Insights of Internet of Things on Cybersecurity Issues
Studies in Computational Intelligence, Journal Year: 2025, Volume and Issue: unknown, P. 127 - 147
Published: Jan. 1, 2025
Language: English
BankNet: Real-Time Big Data Analytics for Secure Internet Banking
Kaushik Sathupadi, Sandesh Achar, Shyam Bhaskaran et al.
Big Data and Cognitive Computing, Journal Year: 2025, Volume and Issue: 9(2), P. 24 - 24
Published: Jan. 26, 2025
The rapid growth of Internet banking has necessitated advanced systems for secure, real-time decision making. This paper introduces BankNet, a predictive analytics framework integrating big data tools and a BiLSTM neural network to deliver high-accuracy transaction analysis. BankNet achieves exceptional performance, with a Root Mean Squared Error of 0.0159 and a fraud detection accuracy of 98.5%, while efficiently handling data rates of up to 1000 Mbps with minimal latency. By addressing critical challenges in operational efficiency, BankNet establishes itself as a robust decision support system for modern banking. Its scalability and precision make it a transformative tool for enhancing security and trust in financial services.
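As a rough sketch of the kind of BiLSTM classifier the abstract describes, the following Keras snippet scores windows of transaction features. The window length, feature count, and layer sizes are illustrative assumptions, not BankNet's published configuration.

```python
# Minimal BiLSTM transaction-scoring sketch (assumed architecture).
import numpy as np
from tensorflow.keras import layers, models

WINDOW, N_FEATURES = 32, 8  # assumed: 32 past transactions, 8 features each

model = models.Sequential([
    layers.Input(shape=(WINDOW, N_FEATURES)),
    layers.Bidirectional(layers.LSTM(64)),   # reads the window both ways
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # fraud probability
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Dummy data stands in for a real transaction stream.
X = np.random.rand(256, WINDOW, N_FEATURES).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
```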
Language: English
Leveraging machine learning for enhanced cybersecurity: an intrusion detection system
Wurood Mahdi Sahib, Zainab Ali Abd Alhuseen, Iman Dakhil Idan Saeedi et al.
et al.
Service Oriented Computing and Applications, Journal Year: 2024, Volume and Issue: unknown
Published: Nov. 11, 2024
Language: English
Cloud-Based Transaction Fraud Detection: An In-depth Analysis of ML Algorithms
Ali Alhchaimi
Wasit Journal of Computer and Mathematics Science, Journal Year: 2024, Volume and Issue: 3(2), P. 19 - 31
Published: June 30, 2024
Context: Cloud-based services are increasingly central in financial technology, enabling scalable and efficient transactions. However, they also heighten vulnerability to fraud, challenging the security of online activities. Traditional fraud detection struggles against sophisticated tactics, highlighting the need for advanced, cloud-compatible solutions.
Objectives: This study assesses machine learning (ML) algorithms' ability to detect fraud in cloud environments, focusing on Logistic Regression (LR), Decision Trees (DT), Random Forest (RF), Support Vector Machines (SVM), and XGBoost (XGB). It uses a comprehensive dataset to determine which ML model best identifies fraudulent transactions, aiming to optimize these models for accuracy, precision, and efficiency in real-time detection.
Results: The ensemble models outperformed the others, with Random Forest and XGBoost showing high effectiveness. These models were particularly good at balancing precision and recall, minimizing false positives, and accurately identifying complex transaction patterns.
Conclusion: ML, especially ensemble and boosting methods like Random Forest, offers a strong approach to detecting fraud in cloud-based systems. Their capacity to handle vast data volumes and adapt to new patterns enhances security.
Implication: The study provides a guide for implementing these models and emphasizes the importance of continual innovation to tackle fraud in digital finance, suggesting that adopting advanced ML can significantly reduce risks, ensuring a secure, efficient, and trustworthy platform for users.
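As a hedged illustration of the comparison the study describes, the following scikit-learn/xgboost sketch trains the five model families on a synthetic, imbalanced dataset and reports precision and recall. The data and hyperparameters are placeholders, not the study's dataset or tuning.

```python
# Compare LR, DT, RF, SVM, and XGBoost on a synthetic fraud-like dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import precision_score, recall_score
from xgboost import XGBClassifier

# Imbalanced classes mimic the rarity of fraudulent transactions.
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.97], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(),
    "XGB": XGBClassifier(eval_metric="logloss"),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(f"{name}: precision={precision_score(y_te, pred):.3f} "
          f"recall={recall_score(y_te, pred):.3f}")
```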
Language: English
Optimized Path Planning and Scheduling in Robotic Mobile Fulfillment Systems Using Ant Colony Optimization and Streamlit Visualization
Isam Sadeq Rasham
Wasit Journal of Computer and Mathematics Science, Journal Year: 2024, Volume and Issue: 3(4), P. 40 - 53
Published: Dec. 30, 2024
Context: In the age of rapid e-commerce growth, Robotic Mobile Fulfillment Systems (RMFS) have become a major trend in warehouse automation. These systems involve the use of self-governed mobile robots to collect shelves and fulfil orders for delivery, with regard to optimizing task allocation and reducing expenses. However, to implement such systems, one needs to find enhanced algorithms for resource mapping and for planning the movement of robots in sensitive environments.
Problem Statement: Despite the benefits of RMFS, certain challenges remain, especially when it comes to the distribution of tasks and the overall distances that employees cover.
Objective: The main goal of this paper is to propose a new compound model based on RL-ACO to optimize RMFS's task assignment and navigation. A further direction of the study is to investigate how these methods can be applied to real-life warehouse automation effectively and at large scale.
Methodology: This research introduces a hybrid selection approach which integrates reinforcement learning with Ant Colony Optimization (ACO). Specifically, a gym environment was created to perform order training and shape robotic movement. Reinforcement Learning (RL) models were trained with Proximal Policy Optimization (PPO) for improving dynamic control, while ACO was used for computing optimal shelf-retrieval trajectories. Performance was measured by policy gradient loss, travelled distance, and the time taken to complete tasks.
Results: The proposed framework showed potential in enhancing efficiency and reducing the required travel involved. In each RL run the shortest paths were identified, and the best route was determined at a total of 102.91 units. Other values, such as the value function loss, showed convergence over iterations. To build a global solution, the integration went a step forward by enabling combinatorial problem solving.
Implications: The study offers a practical, generalizable, and flexible approach to the improvement of warehouse operations.
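As a rough sketch of the ACO component named in the methodology, the following snippet orders a handful of shelf visits by pheromone-guided route construction. The shelf coordinates and all ACO parameters (alpha, beta, evaporation rate, ant count) are illustrative assumptions; the paper's RL-ACO coupling, PPO training, and gym environment are not reproduced here.

```python
# Minimal Ant Colony Optimization for ordering shelf visits (illustrative).
import numpy as np

rng = np.random.default_rng(0)
shelves = rng.uniform(0, 10, size=(5, 2))          # assumed shelf positions
n = len(shelves)
dist = np.linalg.norm(shelves[:, None] - shelves[None, :], axis=-1)
np.fill_diagonal(dist, np.inf)                     # no self-loops

pheromone = np.ones((n, n))
alpha, beta, rho, n_ants, n_iters = 1.0, 2.0, 0.5, 20, 100

best_route, best_len = None, np.inf
for _ in range(n_iters):
    routes = []
    for _ in range(n_ants):
        route, unvisited = [0], set(range(1, n))
        while unvisited:                           # build one route
            i, cand = route[-1], list(unvisited)
            w = (pheromone[i, cand] ** alpha) * ((1.0 / dist[i, cand]) ** beta)
            nxt = cand[rng.choice(len(cand), p=w / w.sum())]
            route.append(nxt)
            unvisited.remove(nxt)
        length = sum(dist[a, b] for a, b in zip(route, route[1:]))
        routes.append((route, length))
        if length < best_len:
            best_route, best_len = route, length
    pheromone *= (1 - rho)                         # evaporation
    for route, length in routes:                   # pheromone deposit
        for a, b in zip(route, route[1:]):
            pheromone[a, b] += 1.0 / length

print("best visiting order:", best_route, "length:", round(best_len, 2))
```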
Language: English
Identification of Diseases caused by non-Synonymous Single Nucleotide Polymorphism using Machine Learning Algorithms
VFAST Transactions on Software Engineering, Journal Year: 2024, Volume and Issue: 12(4), P. 312 - 325
Published: Dec. 31, 2024
The production of vaccines for diseases depends entirely on their analysis. However, testing every disease extensively is costly, as it would involve the investigation of every known gene related to a disease. This issue is further elevated when different gene variations are considered. As such, the use of computational methods is considered to tackle this issue. This research makes use of machine learning algorithms in the identification and prediction of diseases caused by non-synonymous Single Nucleotide Polymorphisms. It shows that the Gradient Boosting algorithm performs better in comparison to other algorithms at genic variation prediction, with an accuracy of 70%.
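As a minimal illustration of the gradient-boosting approach the study reports, the following scikit-learn sketch cross-validates a GradientBoostingClassifier on placeholder features. A real pipeline would encode nsSNP annotations (e.g., conservation scores, amino-acid properties) rather than random values, and the hyperparameters here are assumptions, not the study's settings.

```python
# Gradient boosting on placeholder variant features (illustrative only).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))          # placeholder variant features
y = rng.integers(0, 2, size=500)        # 1 = disease-associated variant

clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05,
                                 max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("mean CV accuracy:", scores.mean().round(3))
```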
Language: English