An Improved Sample Selection Framework for Learning with Noisy Labels
Qian Zhang, Yi Zhu, Ming Yang et al.

Published: Jan. 1, 2023

Owing to the powerful memorization capabilities of deep neural networks, they tend to overfit noisy labels, resulting in degraded discrimination performance. Sample selection methods that filter out possibly clean labels have become the mainstream approach to learning with noisy labels. However, there is a large gap between the sizes of the filtered, possibly clean subset and the unlabeled subset, which is particularly obvious under high noise rates, so label-free samples cannot be fully used, leaving room for performance improvement. This paper proposes an improved Sample selection framework with an OverSampling strategy, SOS, to overcome this deficiency. It mines the useful information carried by label-free instances to boost models' performance by combining the oversampling strategy with existing state-of-the-art (SOTA) methods. We demonstrate the effectiveness of SOS through extensive experimental results on both synthetic and real-world datasets. The code will be available at https://github.com/LanXiaoPang613/SOS.
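
The abstract's key quantitative point is that, after selection, the possibly clean subset is far smaller than the unlabeled subset, and SOS closes that gap by oversampling. Below is a minimal sketch of such a balancing step, assuming plain sampling with replacement; the function name oversample_balance and all details are hypothetical illustrations, not the authors' code.

```python
# Hypothetical sketch of subset balancing via oversampling (not the SOS code).
import random

def oversample_balance(clean_indices, unlabeled_indices, seed=0):
    """Repeat clean-subset indices (with replacement) until the clean
    subset matches the unlabeled subset in size."""
    rng = random.Random(seed)
    gap = len(unlabeled_indices) - len(clean_indices)
    if gap <= 0:                       # already balanced, or clean set larger
        return list(clean_indices)
    extra = rng.choices(list(clean_indices), k=gap)  # sample with replacement
    return list(clean_indices) + extra

# Example: at a high noise rate a selector may keep only ~5k of 50k labels.
clean = list(range(5_000))
unlabeled = list(range(5_000, 50_000))
balanced = oversample_balance(clean, unlabeled)
assert len(balanced) == len(unlabeled)
```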

Language: English

Hierarchical symmetric cross entropy for distant supervised relation extraction
Yun Liu, Xiaoheng Jiang, Pengshuai Lv et al.

Applied Intelligence, Journal Year: 2024, Volume and Issue: 54(21), P. 11020 - 11033

Published: Sept. 3, 2024

Language: English

Citations: 1

DMA-Net: Decoupled Multi-Scale Attention for Few-Shot Object Detection
Xijun Xie, Feifei Lee, Qiu Chen et al.

Applied Sciences, Journal Year: 2023, Volume and Issue: 13(12), P. 6933 - 6933

Published: June 8, 2023

As one of the most important fields in computer vision, object detection has undergone marked development in recent years. Generally, object detection requires many labeled samples for training, but it is not easy to collect and label samples in specialized fields. In the case of few samples, general detectors typically exhibit overfitting and poor generalizability when recognizing unknown objects, and few-shot object detection (FSOD) methods also cannot make good use of support information or manage the potential problems in the relationship between the support branch and the query branch. To address this issue, we propose in this paper a novel framework called Decoupled Multi-scale Attention (DMA-Net), the core of which is the Decoupled Multi-scale Attention Module (DMAM), consisting of three primary parts: a multi-scale feature extractor, a multi-scale attention module, and a decoupled gradient module (DGM). DMAM performs multi-scale feature extraction and layer-to-layer fusion, which can use support information more efficiently, and DGM can reduce the impact of potential optimization problems during information exchange between the two branches. DMA-Net can also implement incremental FSOD, making it suitable for practical applications. Extensive experimental results demonstrate that DMA-Net achieves comparable performance on generic FSOD benchmarks, particularly in the incremental FSOD setting, where it achieves state-of-the-art performance.
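
The abstract describes the decoupled gradient module (DGM) only at a high level. A common way to realize gradient decoupling between two branches is an identity mapping that blocks or attenuates gradients crossing the branch boundary; the PyTorch sketch below shows that generic mechanism under the assumption that DGM behaves this way. Names and the attention computation are hypothetical, not the paper's implementation.

```python
# Generic gradient-decoupling sketch (an assumption about DGM, not its code).
import torch

class DecoupledGradient(torch.autograd.Function):
    """Identity in the forward pass; scales the gradient in the backward
    pass, so cross-branch gradients can be attenuated or fully blocked."""
    @staticmethod
    def forward(ctx, x, scale):
        ctx.scale = scale
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output * ctx.scale, None  # no gradient for `scale`

def decouple(x, scale=0.0):
    # scale=0.0 blocks cross-branch gradients entirely (like x.detach());
    # 0 < scale < 1 merely attenuates them.
    return DecoupledGradient.apply(x, scale)

# Usage: query features attend over gradient-decoupled support features.
query_feat = torch.randn(2, 256, 32, 32, requires_grad=True)
support_feat = torch.randn(2, 256, 32, 32, requires_grad=True)
attn = torch.sigmoid((query_feat * decouple(support_feat, 0.1)).mean(1, keepdim=True))
(query_feat * attn).sum().backward()  # support_feat gets a 0.1-scaled gradient
```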

Language: English

Citations: 2

Gradient optimization for object detection in learning with noisy labels
Qiangqiang Xia, Chunyan Hu, Feifei Lee et al.

Applied Intelligence, Journal Year: 2024, Volume and Issue: 54(5), P. 4248 - 4259

Published: March 1, 2024

Language: English

Citations: 0

Profiling effects of filtering noise labels on learning performance
Chien‐Hsing Wu, Shu‐Chen Kao, Rui-Qian Hong et al.

Knowledge-Based Systems, Journal Year: 2024, Volume and Issue: 294, P. 111667 - 111667

Published: April 5, 2024

Language: English

Citations: 0

An improved sample selection framework for learning with noisy labels
Qian Zhang, Yi Zhu, Ming Yang et al.

PLoS ONE, Journal Year: 2024, Volume and Issue: 19(12), P. e0309841 - e0309841

Published: Dec. 5, 2024

Deep neural networks have powerful memory capabilities, yet they frequently suffer from overfitting to noisy labels, leading to a decline in classification and generalization performance. To address this issue, sample selection methods that filter out potentially clean labels have been proposed. However, there is a significant gap in size between the filtered, possibly clean subset and the unlabeled subset, which becomes particularly pronounced at high noise rates. Consequently, this results in underutilizing the label-free samples in sample selection methods, leaving room for performance improvement. This study introduces an enhanced sample selection framework with an oversampling strategy (SOS) to overcome this limitation. SOS leverages the valuable information contained in label-free instances to enhance model performance by combining the oversampling strategy with state-of-the-art methods. We validate the effectiveness of SOS through extensive experiments conducted on both synthetic datasets and real-world datasets such as CIFAR, WebVision, and Clothing1M. The source code will be made available at https://github.com/LanXiaoPang613/SOS.
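
For context, the "filtered, possibly clean subset" in this line of work is typically produced by a small-loss criterion: the state-of-the-art methods SOS combines with (e.g., DivideMix) fit a two-component Gaussian mixture to per-sample losses and keep the low-loss component. The following is a hedged illustration of that standard selection step, not code from the SOS repository.

```python
# Standard small-loss selection via a 2-component GMM (illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

def select_clean(losses, threshold=0.5):
    losses = np.asarray(losses, dtype=np.float64).reshape(-1, 1)
    # Normalize to [0, 1] so the threshold is comparable across epochs.
    losses = (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(losses)
    clean_comp = gmm.means_.argmin()              # low-loss component = clean
    prob_clean = gmm.predict_proba(losses)[:, clean_comp]
    return prob_clean > threshold

# Example: mislabeled samples tend to incur larger loss and are filtered out.
rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(0.2, 0.05, 900),   # mostly clean
                         rng.normal(1.5, 0.30, 100)])  # mislabeled
mask = select_clean(losses)
print(mask[:900].mean(), mask[900:].mean())  # high vs. near-zero keep rate
```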

Language: English

Citations: 0

Learning with noisy labels via Mamba and entropy KNN framework
Ningwei Wang, Weiqiang Jin, Shirou Jing et al.

Applied Soft Computing, Journal Year: 2024, Volume and Issue: 169, P. 112596 - 112596

Published: Dec. 14, 2024

Language: English

Citations: 0
