Partial multi-label feature selection based on label matrix decomposition DOI
Guanghui Liu, Qiaoyan Li, Xiaofei Yang

et al.

Neural Computing and Applications, Journal Year: 2024, Volume and Issue: unknown

Published: Dec. 19, 2024

Language: English

Citations

0

Three-way multi-label classification: A review, a framework, and new challenges DOI

Yuanjian Zhang, Tianna Zhao, Duoqian Miao

et al.

Applied Soft Computing, Journal Year: 2025, Volume and Issue: unknown, P. 112757 - 112757

Published: Jan. 1, 2025

Language: English

Citations

2

Partial multi-label learning with label and classifier correlations DOI
Ke Wang, Y. Guan, Yang Xie

et al.

Information Sciences, Journal Year: 2025, Volume and Issue: unknown, P. 122101 - 122101

Published: March 1, 2025

Language: English

Citations

0

Multi-label feature selection with feature reconstruction and label correlations DOI

Ning Zhang, A.H.-J. Wang, P. K. Lu

et al.

Expert Systems with Applications, Journal Year: 2025, Volume and Issue: unknown, P. 127993 - 127993

Published: May 1, 2025

Language: English

Citations

0

Distributed Semi-Supervised Partial Multi-Label Learning over Networks DOI Open Access
Zhen Xu, Weibin Chen

Electronics, Journal Year: 2024, Volume and Issue: 13(23), P. 4754 - 4754

Published: Dec. 1, 2024

In this paper, a distributed semi-supervised partial multi-label learning (dS2PML) algorithm is proposed, which can be used to address the problem of classifying partially multi-labeled data together with unlabeled data. In this algorithm, we utilize a multi-kernel function together with a label correlation term to construct the discriminant function. In addition, to obtain a decentralized implementation, we design a reconstruction error on the labeling confidences based on globally common basis functions that are selected by a sparsity strategy. By exploiting the similarity structure among the feature spaces under a sparsity constraint, the labeling confidences are estimated in a distributed manner. Meanwhile, by using a sparse random feature map to approximate the kernel map, the classifier is trained under the supervision of the estimated labeling confidences. Experiments on multiple real datasets are conducted to evaluate the performance of the proposed approach. From the experimental results, the average ranks of all comparison algorithms under five evaluation metrics are computed. The ranking results show that our algorithm attains average ranks of 3.16, 2.27, 2.15, 2.38, and 2.18 on the five metrics, which include Hamming loss, one-error, precision, and coverage; dS2PML is second only to its corresponding centralized counterpart S2PML (cS2PML) and ranks higher than the other existing algorithms on these metrics. The rank differences between dS2PML and the closest competing algorithm on these metrics are 0.28, 1.67, 1.80, 1.15, and 1.62, respectively. Additionally, owing to the distributed storage and processing of the training data, dS2PML reduces CPU time by more than 65% and memory consumption by more than 6% compared with the centralized algorithms. These results indicate that dS2PML outperforms state-of-the-art algorithms in classification accuracy while reducing CPU time and memory consumption.
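The sparse random feature map mentioned in the abstract can be sketched as follows. This is a minimal illustration only, assuming an RBF kernel approximated by random Fourier features and a ridge-regression classifier trained against the estimated labeling confidences; the feature dimension, regularization weight, and the synthetic confidences are all assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features=64, gamma=1.0, rng=rng):
    """Approximate an RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)
    with an explicit finite-dimensional feature map, so kernel training
    reduces to linear training on the mapped features."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy data: n samples, d features, q labels. The uniform "confidence"
# matrix stands in for the estimated labeling confidences of the
# partially labeled examples (hypothetical values).
n, d, q = 100, 5, 3
X = rng.normal(size=(n, d))
confidence = rng.uniform(size=(n, q))

Z = random_fourier_features(X)   # explicit map: kernel model -> linear model
lam = 0.1                        # ridge regularization (assumed value)
# One linear classifier per label, supervised by the confidences.
W_clf = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ confidence)
scores = Z @ W_clf               # predicted label scores, shape (n, q)
```

Because the kernel map is replaced by an explicit low-dimensional feature map, each node in a distributed setting only needs to share the fixed random projection (W, b) rather than the training data, which is consistent with the decentralized design described above.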

Language: English

Citations

1

Hierarchical classification with exponential weighting of multi-granularity paths DOI
Yibin Wang, Qing Zhu, Yusheng Cheng

et al.

Information Sciences, Journal Year: 2024, Volume and Issue: 675, P. 120715 - 120715

Published: May 14, 2024

Language: English

Citations

0
