Overview of the application of intelligent optimization algorithms in multi-attribute group decision making
Applied Intelligence,
Journal year: 2025, Issue: 55(6)
Published: Feb. 6, 2025
Language: English
A comprehensive survey of golden jackal optimization and its applications
Computer Science Review,
Journal year: 2025, Issue: 56, P. 100733 - 100733
Published: Feb. 11, 2025
Language: English
IBBA: an improved binary bat algorithm for solving low and high-dimensional feature selection problems
International Journal of Machine Learning and Cybernetics,
Journal year: 2025, Issue: unknown
Published: March 3, 2025
Language: English
Hybrid strategy collaborative enhancement of white shark optimization algorithm
The Journal of Supercomputing,
Journal year: 2025, Issue: 81(5)
Published: March 26, 2025
Language: English
Newton Downhill Optimizer for Global Optimization
Research Square,
Journal year: 2025, Issue: unknown
Published: April 1, 2025
Abstract
The study presents the Newton's Downhill Optimizer (NDO), a novel metaheuristic algorithm designed to address the challenges of complex, high-dimensional, and nonlinear optimization problems. Mathematical-Based Algorithms (MBAs) are a category of algorithms built on mathematical principles. They are widely applied in numerical computation, symbolic manipulation, geometric processing, and probabilistic statistics, offering efficient and precise solutions to complex problems. Inspired by Newton's method, NDO combines its precision with a downhill strategy and stochastic processes, enhancing its capability of exploring the solution space and escaping local optima in both benchmark and real-world applications. In tests, NDO demonstrated exceptional performance, surpassing the majority of competing algorithms on multiple test suites, including CEC 2017 and CEC 2022. We conducted a comprehensive comparison against 14 well-established algorithms. These include mathematical-based approaches such as AOA, SCHO, SCA, SABO, NRBO, and RUN. We also compared it with classical algorithms like CMA-ES, ABC, DE, and PSO. Additionally, we included advanced recently published algorithms: WSO, EHO, FDB_AGDE, and GQPSO. The results demonstrate that NDO outperforms most of these algorithms, exhibiting superior convergence speed and remarkable stability. In engineering applications, NDO outperformed the other algorithms on the reducer design and step-cone pulley tasks, and delivered outstanding results on the disk clutch brake task. A significant contribution is its application to breast cancer feature selection, tested on two breast cancer datasets, where performance was evaluated in terms of accuracy, sensitivity, specificity, and the Matthews Correlation Coefficient (MCC) across both datasets. This underscores NDO's potential as a viable tool for addressing problems in both engineering and medical fields. The source codes will be shared at https://github.com/oykc1234/NDO.
Language: English
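The abstract describes NDO as pairing Newton's-method-style precision with a downhill (accept-only-if-better) strategy and stochastic processes to explore the solution space and escape local optima. The sketch below is a minimal Python illustration of that general idea; the finite-difference Newton step, the step-halving downhill rule, and the perturbation toward the current best are assumptions made for illustration, not the authors' published update equations (their official code is at https://github.com/oykc1234/NDO).

```python
import numpy as np

def sphere(x):
    """Simple benchmark objective (assumed for illustration)."""
    return np.sum(x ** 2)

def newton_downhill_step(f, x, lr=1.0, eps=1e-4):
    """One Newton-like downhill step using finite-difference derivatives.

    Not the published NDO update; it only illustrates combining a
    Newton-style step with a downhill (accept-only-if-better) rule.
    """
    dim = len(x)
    grad = np.zeros(dim)
    hess_diag = np.zeros(dim)
    fx = f(x)
    for i in range(dim):
        e = np.zeros(dim)
        e[i] = eps
        f_plus, f_minus = f(x + e), f(x - e)
        grad[i] = (f_plus - f_minus) / (2 * eps)              # central difference
        hess_diag[i] = (f_plus - 2 * fx + f_minus) / eps ** 2  # diagonal Hessian
    step = grad / (np.abs(hess_diag) + 1e-12)                  # damped Newton direction
    t = lr
    while t > 1e-8:                     # downhill rule: shrink until improvement
        candidate = x - t * step
        if f(candidate) < fx:
            return candidate
        t *= 0.5
    return x

def ndo_like_search(f, dim=10, pop=20, iters=100, bounds=(-5.0, 5.0), seed=0):
    """Population search: Newton-downhill refinement plus random perturbation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(pop, dim))
    best = min(X, key=f).copy()
    for _ in range(iters):
        for k in range(pop):
            X[k] = newton_downhill_step(f, X[k])
            # Stochastic perturbation relative to the best solution, to keep
            # exploring and escape local optima (assumed scheme).
            X[k] += rng.normal(0, 0.1, dim) * (best - X[k])
            X[k] = np.clip(X[k], lo, hi)
            if f(X[k]) < f(best):
                best = X[k].copy()
    return best, f(best)

if __name__ == "__main__":
    x_best, f_best = ndo_like_search(sphere)
    print("best objective:", round(f_best, 6))
```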
A diversity enhanced tree-seed algorithm based on double search with genetic and automated learning search strategies for image segmentation
Applied Soft Computing,
Journal year: 2025, Issue: unknown, P. 113143 - 113143
Published: April 1, 2025
Language: English
An improved Red-billed blue magpie feature selection algorithm for medical data processing
PLoS ONE,
Journal year: 2025, Issue: 20(5), P. e0324866 - e0324866
Published: May 22, 2025
Feature selection is a crucial preprocessing step in the fields of machine learning, data mining, and pattern recognition. In medical data analysis, the large number and complexity of features are often accompanied by redundant or irrelevant features, which not only increase the computational burden but may also lead to model overfitting, in turn affecting generalization ability. To address this problem, this paper proposes an improved red-billed blue magpie optimization algorithm (IRBMO), specifically optimized for the feature selection task, which significantly improves performance and efficiency by introducing multiple innovative behavioral strategies. The core mechanisms of IRBMO include: elite search behavior, which guides the global optimization to expand toward more promising directions; collaborative hunting behavior, which quickly identifies key features and promotes cooperation among feature subsets; and memory storage behavior, which leverages historically valid information to improve accuracy. To adapt the algorithm to feature selection, we convert its continuous solutions into binary form via a transfer function, which further enhances the applicability of the algorithm. In order to comprehensively verify IRBMO, this paper designs a series of experiments to compare it with nine mainstream algorithms. Based on 12 datasets, the results show that IRBMO achieves the best overall results on metrics such as fitness value, classification accuracy, and specificity. In addition, compared with existing methods, it demonstrates significant advantages in terms of fitness value. To further enhance performance, this paper constructs the V2IRBMO variant by combining S-shaped and V-shaped transfer functions, improving robustness and search ability. Experiments demonstrate that it exhibits high efficiency, generality, and excellent performance on feature selection tasks. When used in conjunction with a KNN classifier, it improves classification accuracy, with an average improvement of 43.89% across the datasets over the original Red-billed Blue Magpie Optimization algorithm. These results highlight its potential for wide application to medical data.
Language: English
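The abstract relies on two mechanisms that are standard in binary wrapper feature selection: converting a continuous optimizer position into a 0/1 feature mask via S-shaped or V-shaped transfer functions, and scoring candidate subsets with a KNN classifier. The Python sketch below illustrates those two building blocks under common assumptions (a sigmoid S-shaped function, a |tanh| V-shaped function, 5-fold KNN accuracy combined with a subset-size penalty, weight alpha = 0.99); the exact transfer functions and fitness definition used by IRBMO/V2IRBMO are not given in the abstract.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Standard transfer functions from the binary metaheuristics literature;
# the specific variants used by IRBMO/V2IRBMO are illustrative assumptions here.
def s_shaped(x):
    return 1.0 / (1.0 + np.exp(-x))   # sigmoid

def v_shaped(x):
    return np.abs(np.tanh(x))          # |tanh|

def binarize(position, rng, kind="s"):
    """Map a continuous position vector to a 0/1 feature mask."""
    tf = s_shaped if kind == "s" else v_shaped
    probs = tf(position)
    return (rng.random(position.shape) < probs).astype(int)

def feature_subset_fitness(mask, X, y, alpha=0.99):
    """Typical wrapper fitness: weighted KNN error plus subset-size penalty.

    The weighting scheme (alpha) is an assumption; the paper's exact fitness
    definition may differ.
    """
    if mask.sum() == 0:
        return 1.0                      # empty subset is the worst case
    knn = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(knn, X[:, mask == 1], y, cv=5).mean()
    size_ratio = mask.sum() / mask.size
    return alpha * (1.0 - acc) + (1.0 - alpha) * size_ratio

if __name__ == "__main__":
    from sklearn.datasets import load_breast_cancer
    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    position = rng.normal(size=X.shape[1])   # stand-in for one magpie's position
    mask = binarize(position, rng, kind="s")
    print("selected features:", int(mask.sum()),
          "fitness:", round(feature_subset_fitness(mask, X, y), 4))
```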