Joint Feature Disentanglement and Hallucination for Few-Shot Image Classification
Published in: | IEEE Transactions on Image Processing, 2021, Vol. 30, pp. 9245-9258 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Few-shot learning (FSL) refers to the learning task that generalizes from base to novel concepts with only a few examples observed during training. One intuitive FSL approach is to hallucinate additional training samples for novel categories. While this is typically done by learning from a disjoint set of base categories with a sufficient amount of training data, most existing works do not fully exploit the intra-class information from base categories, so there is no guarantee that the hallucinated data represent the class of interest. In this paper, we propose the Feature Disentanglement and Hallucination Network (FDH-Net), which jointly performs feature disentanglement and hallucination for FSL. More specifically, FDH-Net disentangles input visual data into class-specific and appearance-specific features. With both data-recovery and classification constraints, image features for novel categories can be hallucinated using appearance information extracted from base categories. We perform extensive experiments on two fine-grained datasets (CUB and FLO) and two coarse-grained ones (mini-ImageNet and CIFAR-100). The results confirm that our framework performs favorably against state-of-the-art metric-learning and hallucination-based FSL models. |
---|---|
ISSN: | 1057-7149, 1941-0042 |
DOI: | 10.1109/TIP.2021.3124322 |
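
The abstract above only sketches the idea at a high level: features are split into a class-specific factor and an appearance-specific factor, trained with a reconstruction (data-recovery) constraint plus a classification constraint, and novel-class features are then hallucinated by pairing a novel class factor with appearance factors taken from base-class samples. The snippet below is a minimal illustrative sketch of that idea, not the authors' FDH-Net implementation; all module names, dimensions, loss weighting, and the use of PyTorch are assumptions made for illustration.

```python
# Illustrative sketch only (assumed PyTorch design, not the paper's code):
# disentangle backbone features into class-specific and appearance-specific
# parts, train with reconstruction + classification losses on base classes,
# then hallucinate novel-class features from base-class appearance codes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureDisentangler(nn.Module):
    """Splits a backbone feature vector into class-specific and appearance-specific factors."""

    def __init__(self, feat_dim=512, class_dim=128, app_dim=128, num_base_classes=64):
        super().__init__()
        self.class_enc = nn.Sequential(nn.Linear(feat_dim, class_dim), nn.ReLU())
        self.app_enc = nn.Sequential(nn.Linear(feat_dim, app_dim), nn.ReLU())
        # Decoder enforces the data-recovery (reconstruction) constraint.
        self.decoder = nn.Linear(class_dim + app_dim, feat_dim)
        # Classifier keeps the class branch discriminative for base categories.
        self.classifier = nn.Linear(class_dim, num_base_classes)

    def forward(self, feat):
        z_cls = self.class_enc(feat)      # class-specific factor
        z_app = self.app_enc(feat)        # appearance-specific factor
        recon = self.decoder(torch.cat([z_cls, z_app], dim=-1))
        logits = self.classifier(z_cls)
        return z_cls, z_app, recon, logits


def training_loss(model, base_feats, base_labels):
    """Reconstruction + classification objectives on base-class features (equal weights assumed)."""
    _, _, recon, logits = model(base_feats)
    loss_rec = F.mse_loss(recon, base_feats)
    loss_cls = F.cross_entropy(logits, base_labels)
    return loss_rec + loss_cls


def hallucinate(model, novel_feat, base_feats):
    """Hallucinate extra novel-class features by pairing the novel class factor
    with appearance factors extracted from base-class samples."""
    with torch.no_grad():
        z_cls_novel, _, _, _ = model(novel_feat)               # (1, class_dim)
        _, z_app_base, _, _ = model(base_feats)                # (B, app_dim)
        z_cls_rep = z_cls_novel.expand(z_app_base.size(0), -1) # reuse the novel class code
        return model.decoder(torch.cat([z_cls_rep, z_app_base], dim=-1))


if __name__ == "__main__":
    model = FeatureDisentangler()
    base_feats = torch.randn(32, 512)          # stand-in for backbone features
    base_labels = torch.randint(0, 64, (32,))
    loss = training_loss(model, base_feats, base_labels)
    loss.backward()
    novel_feat = torch.randn(1, 512)           # a single one-shot novel example
    fake = hallucinate(model, novel_feat, base_feats)
    print(loss.item(), fake.shape)             # hallucinated features: (32, 512)
```

The hallucinated features would then augment the few labeled novel-class examples when fitting a few-shot classifier; how FDH-Net combines the factors and weights its constraints in detail is specified in the paper itself, not here.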