Three Heads Are Better Than One: Complementary Experts for Long-Tailed Semi-supervised Learning
Format: Article
Language: English
Abstract: We address the challenging problem of Long-Tailed Semi-Supervised Learning (LTSSL), where labeled data exhibit an imbalanced class distribution and unlabeled data follow an unknown distribution. Unlike in balanced SSL, the generated pseudo-labels are skewed towards head classes, intensifying the training bias. This phenomenon is amplified further when the class distributions of the labeled and unlabeled datasets are mismatched, since even more unlabeled data are then mislabeled as head classes. To solve this problem, we propose a novel method named ComPlementary Experts (CPE). Specifically, we train multiple experts to model various class distributions, each yielding high-quality pseudo-labels within one form of class distribution. In addition, we introduce Classwise Batch Normalization for CPE to avoid the performance degradation caused by the feature distribution mismatch between head and non-head classes. CPE achieves state-of-the-art performance on the CIFAR-10-LT, CIFAR-100-LT, and STL-10-LT benchmarks. For instance, on CIFAR-10-LT, CPE improves test accuracy by over 2.22% compared to baselines. Code is available at https://github.com/machengcheng2016/CPE-LTSSL.
DOI: 10.48550/arxiv.2312.15702
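
The abstract states that CPE trains multiple experts, each modeling one form of class distribution. A common way to realize such complementary experts is to give each classifier head a logit-adjustment prior of a different strength, so one head fits the long-tailed training distribution, another a balanced one, and a third an inverse one. The sketch below illustrates that idea under this assumption; the class `ComplementaryExperts`, the helper `expert_losses`, the three `tau` values, and the toy class prior are illustrative names and numbers, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ComplementaryExperts(nn.Module):
    """K parallel classifier heads on a shared feature extractor; each head
    is biased toward a different class distribution via a logit-adjustment
    prior of strength tau (the tau values here are illustrative)."""
    def __init__(self, feat_dim: int, num_classes: int, taus=(0.0, 1.0, 2.0)):
        super().__init__()
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, num_classes) for _ in taus
        )
        self.taus = taus

    def forward(self, feats: torch.Tensor):
        # One set of logits per expert, all computed from shared features.
        return [head(feats) for head in self.heads]

def expert_losses(logits_list, targets, class_prior, taus):
    """Cross-entropy on prior-shifted logits: tau = 0 keeps the original
    long-tailed distribution, larger tau pushes an expert toward a balanced
    (or inverse) distribution, so the experts specialize differently."""
    log_prior = torch.log(class_prior.clamp_min(1e-12))
    return [F.cross_entropy(logits + tau * log_prior, targets)
            for logits, tau in zip(logits_list, taus)]

# Illustrative usage with random features and a long-tailed label prior.
feats = torch.randn(8, 128)
targets = torch.randint(0, 10, (8,))
counts = torch.tensor(
    [5000, 2500, 1200, 600, 300, 150, 80, 40, 20, 10], dtype=torch.float
)
prior = counts / counts.sum()
model = ComplementaryExperts(feat_dim=128, num_classes=10)
losses = expert_losses(model(feats), targets, prior, model.taus)
loss = sum(losses)  # experts are trained jointly on the shared backbone
```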
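The abstract also introduces Classwise Batch Normalization to counter the feature distribution mismatch between head and non-head classes. Below is a minimal sketch of that idea, assuming two BatchNorm branches routed by each sample's label (or pseudo-label, for unlabeled data); the two-group head/non-head split, the module name, and its signature are assumptions for illustration, not the paper's exact design.

```python
import torch
import torch.nn as nn

class ClasswiseBatchNorm2d(nn.Module):
    """Separate BatchNorm statistics for head and non-head samples, so the
    two groups do not contaminate each other's running estimates. The
    two-group routing by (pseudo-)label is an illustrative assumption."""
    def __init__(self, num_features: int, num_classes: int, head_classes):
        super().__init__()
        is_head = torch.zeros(num_classes, dtype=torch.bool)
        is_head[list(head_classes)] = True
        self.register_buffer("is_head", is_head)
        self.bn_head = nn.BatchNorm2d(num_features)
        self.bn_tail = nn.BatchNorm2d(num_features)

    def forward(self, x: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        mask = self.is_head[labels]  # per-sample head/non-head routing
        out = torch.zeros_like(x)
        if mask.any():
            out[mask] = self.bn_head(x[mask])
        if (~mask).any():
            out[~mask] = self.bn_tail(x[~mask])
        return out

# Illustrative usage: classes 0-2 are treated as head classes.
cbn = ClasswiseBatchNorm2d(num_features=16, num_classes=10,
                           head_classes=[0, 1, 2])
x = torch.randn(8, 16, 4, 4)
labels = torch.randint(0, 10, (8,))
y = cbn(x, labels)  # same shape as x, group-specific normalization
```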