Semi-supervised Learning using Robust Loss
Main authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | The amount of manually labeled data is limited in medical applications, so semi-supervised learning and automatic labeling strategies can be an asset for training deep neural networks. However, the quality of automatically generated labels can be uneven and inferior to manual labels. In this paper, we propose a semi-supervised training strategy that leverages both manually labeled data and extra unlabeled data. In contrast to existing approaches, we apply a robust loss to the automatically labeled data to compensate for its uneven quality within a teacher-student framework. First, we generate pseudo-labels for the unlabeled data using a teacher model pre-trained on the labeled data. These pseudo-labels are noisy, and using them together with the labeled data to train a deep neural network can severely degrade the learned feature representations and the generalization of the network. We mitigate the effect of these noisy pseudo-labels with robust loss functions, namely beta cross-entropy, symmetric cross-entropy, and generalized cross-entropy. We show that the proposed strategy improves model performance by compensating for the uneven label quality in both image classification and segmentation applications. |
DOI: | 10.48550/arxiv.2203.01524 |
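
The abstract's method is concrete enough to sketch. Below is a minimal, hypothetical PyTorch illustration of the three robust losses it names and of the teacher-student pseudo-labeling step; it is written from the losses' original formulations (GCE from Zhang and Sabuncu, 2018; SCE from Wang et al., 2019; a standard beta-divergence form assumed for beta cross-entropy), not from the authors' code, and all function names and hyperparameter values (q, alpha, beta) are illustrative assumptions.

```python
# Hypothetical sketch, not the authors' implementation: the three robust
# losses named in the abstract, plus the teacher-student step it describes.
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, targets, q=0.7):
    # GCE: (1 - p_y^q) / q. Interpolates between cross-entropy (q -> 0)
    # and MAE (q = 1); larger q down-weights low-confidence (noisy) labels.
    p_y = F.softmax(logits, dim=1).gather(1, targets[:, None]).squeeze(1)
    return ((1.0 - p_y.clamp(min=1e-7).pow(q)) / q).mean()

def symmetric_cross_entropy(logits, targets, alpha=0.1, beta=1.0):
    # SCE: alpha * CE + beta * reverse CE, where the one-hot target is
    # clamped so that log(0) becomes a finite constant.
    ce = F.cross_entropy(logits, targets)
    p = F.softmax(logits, dim=1)
    one_hot = F.one_hot(targets, logits.size(1)).float().clamp(min=1e-4)
    rce = -(p * one_hot.log()).sum(dim=1).mean()
    return alpha * ce + beta * rce

def beta_cross_entropy(logits, targets, beta=0.1):
    # Beta-divergence-based cross-entropy (form assumed here): recovers
    # standard CE as beta -> 0 and becomes less sensitive to mislabeled
    # samples for beta > 0.
    p = F.softmax(logits, dim=1).clamp(min=1e-7)
    p_y = p.gather(1, targets[:, None]).squeeze(1)
    return (-((beta + 1.0) / beta) * (p_y.pow(beta) - 1.0)
            + p.pow(beta + 1.0).sum(dim=1)).mean()

def training_step(student, teacher, x_lab, y_lab, x_unlab, robust_loss):
    # Teacher-student step as described in the abstract: a frozen teacher,
    # pre-trained on labeled data, produces pseudo-labels for unlabeled
    # data; the student pays standard CE on manual labels and a robust
    # loss on the pseudo-labels to absorb their uneven quality.
    with torch.no_grad():
        pseudo = teacher(x_unlab).argmax(dim=1)
    loss_lab = F.cross_entropy(student(x_lab), y_lab)
    loss_unlab = robust_loss(student(x_unlab), pseudo)
    return loss_lab + loss_unlab
```

In this sketch the student applies standard cross-entropy to the manual labels and the chosen robust loss to the pseudo-labels, which is the division of labor the abstract describes; the 1:1 weighting of the two terms is a further assumption, not a value taken from the paper.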