Soft Computing, Journal year: 2024, Issue: unknown
Published: Dec. 3, 2024
Language: English
Scientific Reports, Journal year: 2024, Issue: 14(1)
Published: April 10, 2024
Abstract
To overcome the disadvantages of premature convergence and easy trapping into local optimum solutions, this paper proposes an improved particle swarm optimization algorithm (named the NDWPSO algorithm) based on multiple hybrid strategies. Firstly, the elite opposition-based learning method is utilized to initialize the position matrix. Secondly, dynamic inertial weight parameters are given to improve the global search speed in the early iterative phase. Thirdly, a new optimal jump-out strategy is proposed to overcome the "premature" problem. Finally, the algorithm applies the spiral shrinkage search strategy from the whale optimization algorithm (WOA) and the Differential Evolution (DE) mutation strategy in the later iteration phase to accelerate the convergence speed. The NDWPSO is further compared with 8 other well-known nature-inspired algorithms (3 PSO variants and 5 other intelligent algorithms) on 23 benchmark test functions and three practical engineering problems. Simulation results prove that the NDWPSO obtains better results than the 3 PSO variants for all 49 sets of data. Compared with the 5 other intelligent algorithms, it achieves the best results on 69.2%, 84.6%, and 84.6% of the benchmark functions ( $${f}_{1}-{f}_{13}$$
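The initialization and inertia-weight ideas from this abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's exact method: it shows plain opposition-based initialization (the paper's elite variant mirrors around elite-derived bounds) and a generic linearly decreasing inertia weight.

```python
import numpy as np

def obl_initialize(n_particles, dim, lb, ub, fitness, rng):
    """Opposition-based initialization: build a random swarm and its
    mirrored (opposite) swarm, evaluate both, and keep the fitter half.
    Plain OBL shown here; an elite variant would mirror around
    elite-derived bounds instead of the full search range."""
    X = lb + rng.random((n_particles, dim)) * (ub - lb)
    X_opp = lb + ub - X                       # opposite points
    pool = np.vstack([X, X_opp])
    scores = np.apply_along_axis(fitness, 1, pool)
    return pool[np.argsort(scores)[:n_particles]]  # fittest half

def dynamic_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight: large early (fast global
    search), small late (fine local exploitation)."""
    return w_max - (w_max - w_min) * t / t_max

# usage on the sphere function
rng = np.random.default_rng(0)
swarm = obl_initialize(10, 5, -5.0, 5.0, lambda x: float(x @ x), rng)
print(swarm.shape)  # (10, 5)
```

The opposition step doubles the number of fitness evaluations at startup but tends to place the initial swarm closer to the optimum than pure random sampling.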
Language: English
Cited: 14

The Journal of Supercomputing, Journal year: 2024, Issue: 80(13), pp. 19274-19323
Published: May 26, 2024
Language: English
Cited: 6

International Journal of Data Science and Analytics, Journal year: 2025, Issue: unknown
Published: Jan. 8, 2025
Language: English
Cited: 0

Scientific Reports, Journal year: 2025, Issue: 15(1)
Published: Jan. 15, 2025
Feature selection (FS) is a critical step in hyperspectral image (HSI) classification, essential for reducing data dimensionality while preserving classification accuracy. However, FS for HSIs remains an NP-hard challenge, as existing swarm intelligence and evolutionary algorithms (SIEAs) often suffer from limited exploration capabilities or susceptibility to local optima, particularly in high-dimensional scenarios. To address these challenges, we propose GWOGA, a novel hybrid algorithm that combines the Grey Wolf Optimizer (GWO) and the Genetic Algorithm (GA), aiming to achieve an effective balance between exploration and exploitation. The innovation of GWOGA lies in three core strategies: (1) a chaotic map with Opposition-Based Learning (OBL) for uniformly distributed population initialization, enhancing diversity and mitigating premature convergence; (2) an elite learning strategy to prioritize high-ranking solutions, strengthening the search hierarchy and efficiency; and (3) a hybrid optimization mechanism in which GWO ensures rapid early-stage convergence while GA refines the global search in later stages to escape local optima. Experiments on benchmark datasets (i.e., Indian Pines, KSC, Botswana) demonstrate that GWOGA outperforms state-of-the-art algorithms, achieving higher accuracy with fewer selected bands. The results highlight GWOGA's robustness, generalizability, and potential for real-world applications in HSI FS.
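The GWO-then-GA hand-off described in strategy (3) can be sketched as follows. The function names, the continuous (non-binary) encoding, and the crossover/mutation settings are illustrative assumptions, not the paper's implementation (which operates on band-selection vectors):

```python
import numpy as np

def gwo_step(wolves, alpha, beta, delta, a, rng):
    """One Grey Wolf Optimizer move: every wolf is pulled toward the
    three current leaders; `a` decays from 2 to 0 over the run,
    shifting the pack from exploration to exploitation."""
    new = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        pulls = []
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            A, C = 2 * a * r1 - a, 2 * r2
            pulls.append(leader - A * np.abs(C * leader - x))
        new[i] = np.mean(pulls, axis=0)   # average of the three pulls
    return new

def ga_refine(pop, rng, cx_prob=0.9, mut_sigma=0.1):
    """Late-stage GA pass: uniform crossover between adjacent pairs
    plus small Gaussian mutation, helping the search escape local
    optima once GWO has converged."""
    out = pop.copy()
    for i in range(0, len(pop) - 1, 2):
        if rng.random() < cx_prob:
            mask = rng.random(pop.shape[1]) < 0.5
            out[i, mask], out[i + 1, mask] = pop[i + 1, mask], pop[i, mask]
    return out + rng.normal(0.0, mut_sigma, out.shape)
```

A driver loop would call `gwo_step` for the first part of the budget and switch to `ga_refine` afterwards, re-ranking alpha/beta/delta after each evaluation.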
Language: English
Cited: 0

Knowledge-Based Systems, Journal year: 2025, Issue: unknown, pp. 113156-113156
Published: Feb. 1, 2025
Language: English
Cited: 0

Cluster Computing, Journal year: 2025, Issue: 28(4)
Published: Feb. 25, 2025
Language: English
Cited: 0

Swarm and Evolutionary Computation, Journal year: 2025, Issue: 95, pp. 101927-101927
Published: April 15, 2025
Language: English
Cited: 0

Scientific Reports, Journal year: 2024, Issue: 14(1)
Published: June 14, 2024
This paper proposes a novel multi-hybrid algorithm named DHPN, using the best-known properties of the dwarf mongoose algorithm (DMA), honey badger algorithm (HBA), prairie dog optimizer (PDO), cuckoo search (CS), grey wolf optimizer (GWO), and naked mole rat algorithm (NMRA). It follows an iterative division for extensive exploration and incorporates major parametric enhancements for improved exploitation operation. To counter local optima problems, a stagnation phase based on CS and GWO is added. Six new inertia weight operators have been analyzed to adapt algorithmic parameters, and the best combination of these parameters has been found. An analysis of the suitability of DHPN towards population variations and higher dimensions is performed. For performance evaluation, the CEC 2005 and CEC 2019 benchmark data sets are used. A comparison is performed with differential evolution with an active archive (JADE), self-adaptive DE (SaDE), success-history-based DE (SHADE), LSHADE-SPACMA, extended GWO (GWO-E), jDE100, and others. DHPN is also used to solve the image fusion problem for four quality metrics, namely, edge-based similarity index (
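As a rough illustration of the kind of inertia weight operators an analysis like this compares, here are three common schedule families (linear, exponential, and chaotic). These are generic textbook forms offered as assumptions, not the six operators analyzed in the paper:

```python
import math

def linear_w(t, T, w_max=0.9, w_min=0.4):
    """Classic linearly decreasing inertia weight."""
    return w_max - (w_max - w_min) * t / T

def exponential_w(t, T, w_max=0.9, w_min=0.4):
    """Exponential decay: stays high longer, then drops toward w_min."""
    return w_min + (w_max - w_min) * math.exp(-4.0 * t / T)

def chaotic_w(t, T, w_max=0.9, w_min=0.4, z0=0.7):
    """Logistic-map jitter on the linear schedule: the chaotic factor
    keeps the weight unpredictable, which helps break stagnation."""
    z = z0
    for _ in range(t + 1):
        z = 4.0 * z * (1.0 - z)  # logistic map with r = 4
    return linear_w(t, T, w_max, w_min) * z
```

Each operator trades early exploration against late exploitation differently, which is why such papers tune the combination empirically on benchmark suites.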
Language: English
Cited: 3

Cluster Computing, Journal year: 2024, Issue: 27(10), pp. 14417-14449
Published: July 20, 2024
Language: English
Cited: 3

International Journal of Machine Learning and Cybernetics, Journal year: 2024, Issue: 15(12), pp. 6107-6148
Published: Aug. 10, 2024
Language: English
Cited: 1