Targeting wastewater quality variables prediction: Improving sparrow search algorithm towards optimizing echo state network DOI
Yiqi Liu, Yue Sun, Gang Fang

et al.

Journal of Water Process Engineering, Journal Year: 2024, Volume: 65, pp. 105717 - 105717

Published: July 13, 2024

Language: English

A dual-optimization wind speed forecasting model based on deep learning and improved dung beetle optimization algorithm DOI
Yanhui Li, Kaixuan Sun, Qi Yao

et al.

Energy, Journal Year: 2023, Volume: 286, pp. 129604 - 129604

Published: Nov. 7, 2023

Language: English

Cited by

69

ESO: An enhanced snake optimizer for real-world engineering problems DOI
Liguo Yao, Panliang Yuan, Chieh-Yuan Tsai

et al.

Expert Systems with Applications, Journal Year: 2023, Volume: 230, pp. 120594 - 120594

Published: June 3, 2023

Language: English

Cited by

67

DTCSMO: An efficient hybrid starling murmuration optimizer for engineering applications DOI
Gang Hu, Jingyu Zhong, Guo Wei

et al.

Computer Methods in Applied Mechanics and Engineering, Journal Year: 2023, Volume: 405, pp. 115878 - 115878

Published: Jan. 10, 2023

Language: English

Cited by

40

A local opposition-learning golden-sine grey wolf optimization algorithm for feature selection in data classification DOI
Li Zhang

Applied Soft Computing, Journal Year: 2023, Volume: 142, pp. 110319 - 110319

Published: April 22, 2023

Language: English

Cited by

39

Selection of contributing factors for predicting landslide susceptibility using machine learning and deep learning models DOI
Cheng Chen, Lei Fan

Stochastic Environmental Research and Risk Assessment, Journal Year: 2023, Volume: unknown

Published: Sep. 13, 2023

Language: English

Cited by

23

A feature selection method based on the Golden Jackal-Grey Wolf Hybrid Optimization Algorithm DOI Creative Commons
Guangwei Liu, Zhiqing Guo, Wei Liu

et al.

PLoS ONE, Journal Year: 2024, Volume: 19(1), pp. e0295579 - e0295579

Published: Jan. 2, 2024

This paper proposes a feature selection method based on a hybrid optimization algorithm that combines the Golden Jackal Optimization (GJO) and the Grey Wolf Optimizer (GWO). The primary objective is to create an effective data dimensionality reduction technique for eliminating redundant, irrelevant, and noisy features from high-dimensional datasets. Drawing inspiration from the Chinese idiom "Chai Lang Hu Bao" and the cooperative behaviors observed in natural animal populations, the GWO algorithm, the Lagrange interpolation method, and GJO are amalgamated into a multi-strategy fusion GJO-GWO algorithm. In Case 1, the algorithm addressed eight complex benchmark functions; in Case 2, it was utilized to tackle ten feature selection problems. Experimental results consistently demonstrate that, under identical experimental conditions, whether solving benchmark functions or addressing feature selection problems, GJO-GWO exhibits smaller means, lower standard deviations, higher classification accuracy, and reduced execution times. These findings affirm its superior performance and stability. An illustrative code sketch follows this entry.

Language: English

Cited by

11
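
While the cited paper's exact GJO-GWO update rules are not reproduced here, the wrapper-style fitness function that such hybrid feature selection methods typically optimize can be sketched as follows. This is a hedged illustration in Python: the KNN wrapper, 5-fold cross-validation, and the alpha/beta weighting are common defaults in this literature, not values taken from the paper.

```python
# Sketch of a wrapper-style feature selection fitness, as commonly used with
# hybrid metaheuristics such as GJO-GWO. Assumed (not from the cited paper):
# KNN wrapper, 5-fold CV, alpha = 0.99, beta = 0.01.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def feature_subset_fitness(mask, X, y, alpha=0.99, beta=0.01):
    """Score a binary feature mask: reward accuracy, penalize subset size."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:          # an empty subset is worst-case
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, selected], y, cv=5).mean()
    ratio = selected.size / X.shape[1]
    return alpha * acc + beta * (1.0 - ratio)

# Usage: continuous optimizer positions are binarized (e.g. 0.5 threshold)
# before evaluation; transfer functions are another common choice.
# mask = (position_vector > 0.5).astype(int)
# score = feature_subset_fitness(mask, X_train, y_train)
```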

A Hybrid Golden Jackal Optimization and Golden Sine Algorithm with Dynamic Lens-Imaging Learning for Global Optimization Problems DOI Creative Commons
Panliang Yuan, Taihua Zhang, Liguo Yao

et al.

Applied Sciences, Journal Year: 2022, Volume: 12(19), pp. 9709 - 9709

Published: Sep. 27, 2022

Golden jackal optimization (GJO) is an effective metaheuristic algorithm that imitates the cooperative hunting behavior of golden jackals. However, because the update of the prey's position often depends on the male jackal and population diversity is insufficient in some cases, the algorithm is prone to falling into a local optimum. To address these drawbacks of GJO, this paper proposes an improved algorithm, a hybrid of GJO and the golden sine algorithm (Gold-SA) with dynamic lens-imaging learning, called LSGJO. First, novel dual spiral update rules inspired by Gold-SA make the preying process more intelligent and improve optimization efficiency. Second, a nonlinear decreasing scaling factor is introduced into the lens-imaging learning operator to maintain population diversity. The performance of LSGJO is verified on 23 classical benchmark functions and 3 complex engineering design problems from real scenarios. The experimental results show that LSGJO converges faster and more accurately than 11 state-of-the-art algorithms, its global search ability is significantly improved, and the proposed algorithm is superior in solving constrained problems. An illustrative code sketch follows this entry.

Language: English

Cited by

35
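
Lens-imaging (opposition-based) learning, named in the abstract above, can be sketched in a few lines. This is a minimal Python illustration of the common formulation; the nonlinear schedule for the scaling factor k is an assumption for demonstration, not the exact schedule used in LSGJO.

```python
# Sketch of dynamic lens-imaging opposition-based learning.
# Assumed (not from the cited paper): k grows quadratically from 1 to 10.
import numpy as np

def lens_imaging_opposite(x, lb, ub, k):
    """Map a solution x to its lens-imaging opposite within [lb, ub].

    With k = 1 this reduces to standard opposition-based learning; larger k
    pulls the opposite point toward the midpoint of the search range.
    """
    mid = (lb + ub) / 2.0
    return mid + mid / k - x / k

def dynamic_k(t, t_max, k_min=1.0, k_max=10.0):
    """Nonlinearly increase the scaling factor over iterations (illustrative)."""
    return k_min + (k_max - k_min) * (t / t_max) ** 2

# Usage: keep the opposite candidate only if it improves the objective f.
# x_opp = np.clip(lens_imaging_opposite(x, lb, ub, dynamic_k(t, t_max)), lb, ub)
# if f(x_opp) < f(x): x = x_opp
```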

Crisscross Harris Hawks Optimizer for Global Tasks and Feature Selection DOI Open Access
Xin Wang, Xiaogang Dong, Yanan Zhang

et al.

Journal of Bionic Engineering, Journal Year: 2022, Volume: 20(3), pp. 1153 - 1174

Published: Nov. 30, 2022

Language: English

Cited by

29

Modified artificial rabbits optimization combined with bottlenose dolphin optimizer in feature selection of network intrusion detection DOI Creative Commons

Fukui Li, Hui Xu, Feng Qiu

et al.

Electronic Research Archive, Journal Year: 2024, Volume: 32(3), pp. 1770 - 1800

Published: Jan. 1, 2024

For the feature selection of network intrusion detection, the issue of numerous redundant features arises, posing challenges in enhancing detection accuracy and adversely affecting overall performance to some extent. Artificial rabbits optimization (ARO) is capable of reducing redundant features and can be applied to intrusion detection. However, ARO exhibits a slow iteration speed in the exploration phase, and the population is prone to iterative stagnation in the exploitation phase, which hinders its ability to deliver outstanding performance on the aforementioned problems. First, to further enhance global exploration capabilities, the mud ring feeding strategy from the bottlenose dolphin optimizer (BDO) is incorporated; simultaneously, an adaptive switching mechanism is employed to adjust the exploration and exploitation phases. Second, to keep the original algorithm from getting trapped in a local optimum during exploitation, a levy flight strategy is adopted. Lastly, dynamic lens-imaging learning is introduced to increase population variety and facilitate escape from local optima. This paper thus proposes a modified ARO, namely LBARO, a hybrid algorithm that combines BDO with the ARO model. LBARO is first evaluated empirically on 8 benchmark test functions and 4 UCI datasets to comprehensively demonstrate the superiority of the proposed algorithm. Subsequently, it is integrated into the feature selection process of a network intrusion detection classification model for experimental validation, using the NSL-KDD, UNSW NB-15, and InSDN datasets, respectively. Experimental results indicate that the model based on LBARO successfully reduces feature dimensionality while improving detection performance. An illustrative code sketch follows this entry.

Language: English

Cited by

6
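
The levy flight escape mechanism named in the abstract above is usually implemented with Mantegna's algorithm. The sketch below is a hedged Python illustration; beta = 1.5 and the 0.01 step scale are conventional choices in this literature, not values from the cited paper.

```python
# Sketch of a Levy-flight perturbation via Mantegna's algorithm.
# Assumed (not from the cited paper): beta = 1.5, step scale 0.01.
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5):
    """Draw a Levy-distributed step vector of length `dim`."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

# Usage: perturb the current position around the best-so-far solution.
# x_new = x + 0.01 * levy_step(x.size) * (x - x_best)
```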

Feature selection in high-dimensional data: an enhanced RIME optimization with information entropy pruning and DBSCAN clustering DOI

Huangying Wu, Yi Chen, Wei Zhu

et al.

International Journal of Machine Learning and Cybernetics, Journal Year: 2024, Volume: 15(9), pp. 4211 - 4254

Published: April 24, 2024

Language: English

Cited by

6