Learning Accurate Pseudo-Labels via Feature Similarity in the Presence of Label Noise

Bibliographic details
Published in: Applied Sciences, 2024-04, Vol. 14 (7), p. 2759
Authors: Wang, Peng; Wang, Xiaoxiao; Wang, Zhen; Dong, Yongfeng
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Owing to their exceptional learning capacity, deep neural networks (DNNs) readily memorize noisy labels and therefore struggle to handle label noise. To address this challenge, the pseudo-label approach has emerged as a preferred solution. Recent works have achieved significant improvements by exploiting the information contained in DNN predictions and incorporating those predictions into training through a straightforward scheme: a convex combination of the original labels and the model predictions serves as the training target. However, these methods overlook the feature-level information carried by each sample, which significantly affects the accuracy of the pseudo-labels. This study introduces a simple yet effective technique named FPL (feature pseudo-label) that leverages both model predictions and feature similarity. In addition, an exponential moving average scheme is applied to stabilize the corrected labels and keep the pseudo-labels consistent across training iterations. Extensive experiments were carried out on synthetic and real datasets with different noise types: FPL reaches a top-1 accuracy of 94.13% on CIFAR-10 and 73.54% on Clothing1M. These results demonstrate the efficacy and robustness of the method when learning with label noise.
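
The abstract names three ingredients: a training target built as a convex combination of the given (possibly noisy) labels and model-derived targets, a feature-similarity term that complements the softmax predictions, and an exponential moving average (EMA) that keeps the pseudo-labels stable across iterations. The PyTorch sketch below is only a minimal illustration of how such a target could be assembled under those assumptions; the function name, the equal fusion weight between the prediction and similarity terms, the 0.1 similarity temperature, and the alpha/beta values are hypothetical and not taken from the paper.

import torch
import torch.nn.functional as F

def update_pseudo_labels(pseudo_labels, logits, features, prototypes,
                         given_labels, alpha=0.9, beta=0.5):
    """Illustrative pseudo-label update (not the authors' exact FPL code).

    pseudo_labels: (N, C) current soft pseudo-labels kept as an EMA
    logits:        (N, C) model outputs for the batch
    features:      (N, D) penultimate-layer features
    prototypes:    (C, D) per-class feature prototypes (assumed available)
    given_labels:  (N, C) one-hot, possibly noisy labels
    alpha:         EMA momentum for the pseudo-labels (assumed value)
    beta:          weight of the given labels in the convex combination (assumed value)
    """
    # Prediction-level information from the classifier head.
    probs = F.softmax(logits, dim=1)

    # Feature-level information: cosine similarity to class prototypes,
    # turned into a distribution with a temperature of 0.1 (assumed).
    sims = F.normalize(features, dim=1) @ F.normalize(prototypes, dim=1).T
    sim_probs = F.softmax(sims / 0.1, dim=1)

    # Fuse prediction and feature cues, then take a convex combination
    # with the given labels as the corrected training target.
    model_target = 0.5 * probs + 0.5 * sim_probs
    corrected = beta * given_labels + (1.0 - beta) * model_target

    # EMA keeps the corrected labels from fluctuating between iterations.
    return alpha * pseudo_labels + (1.0 - alpha) * corrected

In such a sketch the prototypes would have to be maintained separately, for instance as running means of the features of samples whose labels are currently trusted.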
ISSN: 2076-3417
DOI: 10.3390/app14072759