Reconstruction guided Meta-learning for Few Shot Open Set Recognition
Format: Article
Language: English
Online access: Order full text
Abstract: In many applications, we are constrained to learn classifiers from very limited data (few-shot classification). The task becomes even more challenging if it is also required to identify samples from unknown categories (open-set classification). Learning a good abstraction for a class from very few samples is extremely difficult, especially under open-set settings. As a result, open-set recognition has received minimal attention in the few-shot setting. However, it is a critical task in many applications, such as environmental monitoring, where the number of labeled examples for each class is limited. Existing few-shot open-set recognition (FSOSR) methods rely on thresholding schemes, with some assuming a uniform probability for open-class samples. This approach is often inaccurate, especially for fine-grained categorization, and makes such methods highly sensitive to the choice of threshold. To address these concerns, we propose the Reconstructing Exemplar-based Few-shot Open-set ClaSsifier (ReFOCS). Using a novel exemplar reconstruction-based meta-learning strategy, ReFOCS streamlines FSOSR and eliminates the need for a carefully tuned threshold by learning to be self-aware of the openness of a sample. The exemplars act as class representatives and can be either provided in the training dataset or estimated in the feature domain. Testing on a wide variety of datasets, we show that ReFOCS outperforms multiple state-of-the-art methods.
DOI: 10.48550/arxiv.2108.00340