Multi-label feature selection with missing features by tolerance implication granularity information and symmetric coupled discriminant weight
Jianhua Dai, Jie Wang

Pattern Recognition, Journal Year: 2025, Volume and Issue: unknown, P. 111365 - 111365

Published: Jan. 1, 2025

Language: English

Cross-to-merge training with class balance strategy for learning with noisy labels
Qian Zhang, Yi Zhu, Ming Yang

et al.

Expert Systems with Applications, Journal Year: 2024, Volume and Issue: 249, P. 123846 - 123846

Published: March 29, 2024

The collection of large-scale datasets inevitably introduces noisy labels, leading to a substantial degradation in the performance of deep neural networks (DNNs). Although sample selection is a mainstream method in the field of learning with noisy labels, which aims to mitigate the impact of noisy labels during model training, the testing performance of these methods exhibits significant fluctuations across different noise rates and types. In this paper, we propose Cross-to-Merge Training (C2MT), a novel framework that is insensitive to the prior information used in the sample selection progress, enhancing model robustness. In practical implementation, using cross-divided training data, two different networks are cross-trained with the co-teaching strategy for several local rounds, and are subsequently merged into a unified model by performing federated averaging on the parameters of the two models periodically. Additionally, we introduce a new class balance strategy, named Median Balance Strategy (MBS), during the cross-dividing process, which evenly divides the training data into a labeled subset and an unlabeled subset based on the estimated loss distribution characteristics. Extensive experimental results on both synthetic and real-world datasets demonstrate the effectiveness of C2MT. Code will be available at: https://github.com/LanXiaoPang613/C2MT.
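To make the mechanics described above concrete, here is a minimal Python/PyTorch sketch (not the authors' released implementation, which is linked above) of the two core ideas: a per-class, median-based split of samples into a clean ("labeled") subset, and a federated-average style merge of two cross-trained networks. The function names, the class-wise median criterion, and the simple 0.5/0.5 parameter average are illustrative assumptions.

    # Minimal sketch of the C2MT ideas described above; all names are illustrative.
    import copy
    import torch
    import torch.nn.functional as F

    def median_balance_split(losses, labels, num_classes):
        """Median-Balance-style split (assumption): within each class, samples whose
        per-sample loss is at or below the class median are treated as clean."""
        clean_mask = torch.zeros_like(losses, dtype=torch.bool)
        for c in range(num_classes):
            idx = (labels == c).nonzero(as_tuple=True)[0]
            if idx.numel() == 0:
                continue
            med = losses[idx].median()
            clean_mask[idx] = losses[idx] <= med
        return clean_mask

    def merge_models(model_a, model_b):
        """Federated-average style merge: element-wise mean of the two parameter sets."""
        merged = copy.deepcopy(model_a)
        with torch.no_grad():
            for p_m, p_a, p_b in zip(merged.parameters(),
                                     model_a.parameters(),
                                     model_b.parameters()):
                p_m.copy_(0.5 * (p_a + p_b))
        return merged

    def co_teaching_step(model, optimizer, x_selected, y_selected):
        """One local update in co-teaching style: a model trains on samples that
        were selected (cross-divided) by its peer network."""
        model.train()
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x_selected), y_selected)
        loss.backward()
        optimizer.step()
        return loss.item()

In a full training loop, the two networks would alternate between several such local rounds on each other's clean splits and a periodic call to merge_models, after which both networks continue from the merged parameters.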

Language: English

Citations: 20

Semi-supervised imbalanced multi-label classification with label propagation
Guodong Du, Jia Zhang, Ning Zhang

et al.

Pattern Recognition, Journal Year: 2024, Volume and Issue: 150, P. 110358 - 110358

Published: Feb. 21, 2024

Language: English

Citations: 19

A survey on multi-label feature selection from perspectives of label fusion
Wenbin Qian, Jintao Huang, Fankang Xu

et al.

Information Fusion, Journal Year: 2023, Volume and Issue: 100, P. 101948 - 101948

Published: Aug. 2, 2023

Language: English

Citations: 38

Multi-label feature selection by strongly relevant label gain and label mutual aid
Jianhua Dai, Weiyi Huang, Chucai Zhang

et al.

Pattern Recognition, Journal Year: 2023, Volume and Issue: 145, P. 109945 - 109945

Published: Sept. 9, 2023

Language: English

Citations: 37

Ensemble of kernel extreme learning machine based elimination optimization for multi-label classification
Qingshuo Zhang, Eric C.C. Tsang, Qiang He

et al.

Knowledge-Based Systems, Journal Year: 2023, Volume and Issue: 278, P. 110817 - 110817

Published: July 24, 2023

Language: English

Citations: 24

Multilabel Feature Selection via Shared Latent Sublabel Structure and Simultaneous Orthogonal Basis Clustering
Ronghua Shang, Jingyu Zhong, Weitong Zhang

et al.

IEEE Transactions on Neural Networks and Learning Systems, Journal Year: 2024, Volume and Issue: 36(3), P. 5288 - 5303

Published: April 24, 2024

Multilabel feature selection solves the dimension distress of high-dimensional multilabel data by selecting an optimal subset of features. Noisy and incomplete labels in raw data hinder the acquisition of label-guided information. In existing approaches, mapping the label space to a low-dimensional latent space by semantic decomposition to mitigate label noise is considered an effective strategy. However, the decomposed latent space contains redundant information, which misleads the capture of potential label relevance. To eliminate the effect of redundant information on the extraction of label correlations, a novel method named SLOFS, multilabel feature selection via shared latent sublabel structure and simultaneous orthogonal basis clustering, is proposed. First, a latent orthogonal base sublabel structure separation (LOBSS) term is engineered to guide the construction of a redundancy-free latent sublabel space with a separated label center structure; the LOBSS term simultaneously retains the latent sublabel information and the label center structure. Moreover, the relevance of the nonredundant latent sublabels is fully explored. The introduction of graph regularization ensures structural consistency between the data and the latent sublabels, thus helping the feature selection process. SLOFS employs dynamic orthogonal basis clustering to obtain a high-quality latent structure and uses it to constrain the correlations of the feature projections. Finally, a convergence-provable optimization scheme is proposed to solve the method. Experimental studies on 18 datasets demonstrate that the presented method performs consistently better than previous methods.
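The pipeline sketched in the abstract (a latent sublabel space, an orthogonal basis relating it to the observed labels, and a feature projection whose rows are used for ranking) can be illustrated with a small NumPy example. This is a hedged, generic sketch of that family of methods, not the SLOFS algorithm itself: the ridge-style update of the projection W, the averaged update of the latent matrix V, the orthogonal Procrustes update of the basis B, and all hyperparameters are simplified assumptions.

    # Generic latent-sublabel feature-ranking sketch (illustrative, not SLOFS itself).
    import numpy as np

    def latent_sublabel_feature_ranking(X, Y, k=10, alpha=1.0, n_iter=100, seed=0):
        """X: (n, d) features, Y: (n, q) label matrix, k: latent dimension (k <= q)."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        V = rng.random((n, k))                                # latent sublabel representation
        B = np.linalg.qr(rng.random((k, Y.shape[1])).T)[0].T  # row-orthonormal basis, B B^T = I
        W = rng.random((d, k))                                # feature projection used for ranking

        for _ in range(n_iter):
            # Ridge update of W:  min_W ||X W - V||_F^2 + alpha ||W||_F^2
            W = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ V)
            # Heuristic update of V: average the label-side and feature-side estimates
            V = 0.5 * (Y @ B.T + X @ W)
            # Orthogonal Procrustes update of B:  min_{B B^T = I} ||Y - V B||_F^2
            U, _, Vt = np.linalg.svd(V.T @ Y, full_matrices=False)
            B = U @ Vt

        # Features whose projection rows have larger norms are ranked as more discriminative
        scores = np.linalg.norm(W, axis=1)
        return np.argsort(-scores)

Ranking features by the row norms of a learned projection is the standard final step in this line of work; the graph-regularization and label-center-separation terms described above would enter as additional penalties in the W and V updates.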

Language: English

Citations: 15

Multi-label feature selection via similarity constraints with non-negative matrix factorization
Zhuoxin He, Yaojin Lin, Zilong Lin

et al.

Knowledge-Based Systems, Journal Year: 2024, Volume and Issue: 297, P. 111948 - 111948

Published: May 18, 2024

Language: English

Citations: 12

Multi-label Feature selection with adaptive graph learning and label information enhancement
Zhi Qin, Hongmei Chen, Yong Mi

et al.

Knowledge-Based Systems, Journal Year: 2024, Volume and Issue: 285, P. 111363 - 111363

Published: Jan. 3, 2024

Language: English

Citations: 10

Correlation concept-cognitive learning model for multi-label classification
Jiaming Wu, Eric C.C. Tsang, Weihua Xu

et al.

Knowledge-Based Systems, Journal Year: 2024, Volume and Issue: 290, P. 111566 - 111566

Published: Feb. 24, 2024

Language: English

Citations: 10

LSFSR: Local label correlation-based sparse multilabel feature selection with feature redundancy
Lin Sun, Yuxuan Ma, Weiping Ding

et al.

Information Sciences, Journal Year: 2024, Volume and Issue: 667, P. 120501 - 120501

Published: March 20, 2024

Language: English

Citations: 9