Overall positive prototype for few-shot open-set recognition
Published in: Pattern Recognition, 2024-07, Vol. 151, p. 110400, Article 110400
Main authors: ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: Few-shot open-set recognition (FSOR) is the task of recognizing samples from known classes with a limited number of annotated instances while also detecting samples that do not belong to any known class. This is a challenging problem because models must learn to generalize from a small number of labeled samples and distinguish them from an unlimited number of potential negative examples. In this paper, we propose a novel approach called the overall positive prototype to effectively improve performance. Conceptually, negative samples may be distributed throughout the feature space and are hard to describe. Taking the opposite viewpoint, we propose to construct an overall positive prototype that acts as a cohesive representation for the positive samples, which are distributed in a relatively small neighborhood. By measuring the distance between a query sample and the overall positive prototype, we can effectively classify the query as either positive or negative. We show that this simple yet innovative approach achieves state-of-the-art FSOR performance in terms of accuracy and AUROC.
Highlights:
• We propose the concept of the overall positive prototype (OPP) to summarize positive prototypes.
• The OPP is employed to solve the few-shot open-set image recognition problem.
• Comprehensive evaluation and ablation studies show that OPP achieves state-of-the-art performance.
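The abstract only outlines the idea of classifying a query by its distance to an overall positive prototype. As a rough illustration only (not the authors' implementation), the NumPy sketch below shows how such a distance-to-OPP rejection rule could sit on top of standard prototypical-network classification; the mean aggregation used for the OPP, the fixed rejection threshold, and all function names are assumptions introduced here for illustration.

```python
import numpy as np

def class_prototypes(support_feats, support_labels, n_way):
    """Mean support feature per known class (standard prototypical-network prototypes)."""
    return np.stack([support_feats[support_labels == c].mean(axis=0)
                     for c in range(n_way)])

def overall_positive_prototype(prototypes):
    # Hypothetical aggregation: a plain mean of the class prototypes.
    # The paper's actual construction of the OPP may differ.
    return prototypes.mean(axis=0)

def fsor_predict(query_feat, prototypes, opp, reject_threshold):
    """Closed-set label via nearest prototype; open-set rejection via distance to the OPP."""
    if np.linalg.norm(query_feat - opp) > reject_threshold:
        return -1  # far from the positive neighborhood -> treated as an unknown (negative) sample
    dists = np.linalg.norm(prototypes - query_feat, axis=1)
    return int(np.argmin(dists))  # otherwise assign the nearest known-class prototype

# Toy usage with random features (5-way, 1-shot, 64-d embeddings).
rng = np.random.default_rng(0)
feats, labels = rng.normal(size=(5, 64)), np.arange(5)
protos = class_prototypes(feats, labels, n_way=5)
opp = overall_positive_prototype(protos)
print(fsor_predict(rng.normal(size=64), protos, opp, reject_threshold=10.0))
```

In this sketch the rejection threshold is a fixed scalar chosen by hand; in practice it would have to be calibrated (e.g., on validation episodes), and the embedding network that produces the features is assumed to exist upstream.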
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2024.110400