Neural Computing and Applications, Journal Year: 2024, Volume and Issue: unknown
Published: Dec. 19, 2024
Language: English
Applied Soft Computing, Journal Year: 2025, Volume and Issue: unknown, P. 112757 - 112757
Published: Jan. 1, 2025
Language: English
Citations: 2
Information Sciences, Journal Year: 2025, Volume and Issue: unknown, P. 122101 - 122101
Published: March 1, 2025
Language: English
Citations: 0
Expert Systems with Applications, Journal Year: 2025, Volume and Issue: unknown, P. 127993 - 127993
Published: May 1, 2025
Language: English
Citations: 0
Electronics, Journal Year: 2024, Volume and Issue: 13(23), P. 4754 - 4754
Published: Dec. 1, 2024
In this paper, a distributed semi-supervised partial multi-label learning (dS2PML) algorithm is proposed, which can be used to address the problem of classifying partially multi-labeled data together with unlabeled data. In this algorithm, we utilize a multi-kernel function together with a label correlation term to construct the discriminant function. In addition, to obtain a decentralized implementation, we design a reconstruction error on the labeling confidence based on globally common basic data that are selected by a shared strategy. By exploiting the similarity structure among feature spaces under a sparsity constraint, the labeling confidences are estimated in a distributed manner. Meanwhile, by using a sparse random map to approximate the kernel map, the classifier is trained under the supervision of the estimated labeling confidences. Experiments on multiple real datasets are conducted to evaluate the performance of the proposed approach. According to the experimental results, the average ranks of all comparison algorithms under five evaluation metrics are computed. The ranking results show that the average ranks of our algorithm across the five metrics, including Hamming loss, one-error, precision, and coverage, are 3.16, 2.27, 2.15, 2.38, and 2.18, respectively. dS2PML is second only to the corresponding centralized S2PML (cS2PML) algorithm and ranks higher than the other existing algorithms on these metrics. The rank differences between our algorithm and the closest competing algorithm, in terms of Hamming loss and the remaining metrics, are 0.28, 1.67, 1.80, 1.15, and 1.62, respectively. Additionally, owing to the distributed storage and processing of training data, our algorithm reduces CPU time by more than 65% and memory consumption by more than 6% compared with existing algorithms. These results indicate that the proposed algorithm outperforms state-of-the-art algorithms in classification accuracy, CPU time, and memory consumption.
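As a rough illustration of one ingredient the abstract describes, training a classifier on a random-feature approximation of a kernel map under the supervision of labeling confidences, a minimal sketch follows. This is not the authors' dS2PML implementation: the RBF kernel, the dense random Fourier features (the paper describes a sparse random map), the ridge-style least-squares solver, and all names are assumptions made for illustration.

```python
import numpy as np

def random_fourier_features(X, n_features=256, gamma=1.0, seed=0):
    # Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2) with
    # random Fourier features; a dense stand-in for the paper's sparse random map.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def fit_confidence_weighted(Z, C, reg=1e-2):
    # Ridge-style least squares: fit linear weights so that Z @ W tracks the
    # labeling confidences C (n_samples x n_labels, entries in [0, 1]).
    A = Z.T @ Z + reg * np.eye(Z.shape[1])
    return np.linalg.solve(A, Z.T @ C)

# Toy usage: 100 samples, 5 features, 3 labels with soft label confidences.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
C = rng.uniform(size=(100, 3))   # stand-in for estimated labeling confidences
Z = random_fourier_features(X)   # approximate kernel feature map
W = fit_confidence_weighted(Z, C)
scores = Z @ W                   # discriminant values, one column per label
print(scores.shape)              # (100, 3)
```

In the algorithm the abstract describes, the confidences C would themselves be estimated across distributed nodes under a sparsity constraint rather than supplied directly, and the random map would be shared so that all nodes train against a common feature space.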
Language: English
Citations: 1
Information Sciences, Journal Year: 2024, Volume and Issue: 675, P. 120715 - 120715
Published: May 14, 2024
Language: English
Citations
0Neural Computing and Applications, Journal Year: 2024, Volume and Issue: unknown
Published: Dec. 19, 2024
Language: English
Citations: 0