Imbalanced image classification with complement cross entropy



Bibliographic Details
Published in: Pattern Recognition Letters, 2021-11, Vol. 151, pp. 33-40
Main authors: Kim, Yechan; Lee, Younkwan; Jeon, Moongu
Format: Article
Language: English
Online access: Full text
Description
Abstract:
• This work proposes a simple loss function for imbalanced image classification.
• This work studies the effect of suppressing output scores on incorrect classes for imbalanced image classification.
• This work demonstrates the effectiveness of the proposed method through experiments on imbalanced datasets.

Deep learning models have recently achieved great success in computer vision applications, relying on large-scale, class-balanced datasets. However, imbalanced class distributions still limit the wide applicability of these models due to degraded performance. To address this problem, this paper focuses on cross entropy, which mostly ignores output scores on incorrect classes. The work finds that neutralizing the predicted probabilities on incorrect classes improves prediction accuracy for imbalanced image classification. Based on this finding, the paper proposes a simple but effective loss named complement cross entropy. The proposed loss makes the ground-truth class overwhelm the other classes in terms of softmax probability by neutralizing the probabilities of incorrect classes, without additional training procedures. This loss also helps models learn key information, especially from samples in minority classes, yielding more accurate and robust classification on imbalanced distributions. Extensive experiments on imbalanced datasets demonstrate the effectiveness of the proposed method compared to other state-of-the-art methods.
ISSN: 0167-8655
eISSN: 1872-7344
DOI: 10.1016/j.patrec.2021.07.017