ICCL: Independent and Correlative Correspondence Learning for few-shot image classification


Bibliographic Details
Published in: Knowledge-Based Systems, 2023-04, Vol. 266, Article 110412
Authors: Zheng, Zijun; Wu, Heng; Lv, Laishui; Ye, Hailiang; Zhang, Changchun; Yu, Gaohang
Format: Article
Language: English
Online access: Full text
Description
Abstract: Few-shot learning, which aims to transfer knowledge from past experience to recognize novel categories from limited samples, is a challenging task in computer vision. However, existing few-shot methods tend to train the baseline model on instances independently, ignoring correlation learning among them. In light of this, we propose a novel approach, termed Independent and Correlative Correspondence Learning (ICCL), for the few-shot image classification problem. To build independent learning, we supervise both the geometric direction of each instance under data augmentation and the category of the instance in the source domain space, which enhances the discriminability of features. To construct correlation learning, we use a transformer unit to model the long-range relationship between global and local representations, and an N-pair metric to learn intra-class and inter-class variations, which yields credible decision boundaries. By combining independent learning and correlation learning, the generalization ability of the model to unseen categories is substantially enhanced. Comprehensive experiments on public few-shot image classification benchmarks demonstrate that the proposed ICCL achieves state-of-the-art results, and that each ingredient of ICCL contributes to its effectiveness.
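The N-pair metric mentioned in the abstract contrasts each anchor embedding against one positive from its own class and one negative per other class (Sohn's multi-class N-pair loss). A minimal plain-Python sketch of that loss follows; the function name and the toy vectors are illustrative, not taken from the paper:

```python
import math

def n_pair_loss(anchor, positive, negatives):
    """Multi-class N-pair loss: log(1 + sum_j exp(a.n_j - a.p)).

    The loss is small when the anchor is more similar (by dot product)
    to its positive than to every negative, so minimizing it pulls
    same-class embeddings together and pushes other classes apart.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    pos_sim = dot(anchor, positive)
    # Each negative contributes exp(similarity gap); gaps < 0 (positive
    # wins) shrink the loss toward log(1) = 0.
    return math.log(1.0 + sum(math.exp(dot(anchor, n) - pos_sim)
                              for n in negatives))

# An anchor aligned with its positive incurs a lower loss than one
# aligned with a negative.
easy = n_pair_loss([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]])
hard = n_pair_loss([1.0, 0.0], [0.0, 1.0], [[1.0, 0.0]])
```

In ICCL-style training, `anchor` and `positive` would be embeddings of two instances of the same class within an episode, and `negatives` would hold one embedding per remaining class.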
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2023.110412