GenDet: Meta Learning to Generate Detectors From Few Shots
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2022-08, Vol. 33 (8), p. 3448-3460
Main authors: , , , , , , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: Object detection has made enormous progress and has been widely used in many applications. However, it performs poorly when only limited training data is available for novel classes that the model has never seen before. Most existing approaches solve few-shot detection tasks implicitly without directly modeling the detectors for novel classes. In this article, we propose GenDet, a new meta-learning-based framework that can effectively generate object detectors for novel classes from few shots and, thus, conducts few-shot detection tasks explicitly. The detector generator is trained by numerous few-shot detection tasks sampled from base classes, each with sufficient samples, and thus, it is expected to generalize well on novel classes. An adaptive pooling module is further introduced to suppress distracting samples and aggregate the detectors generated from multiple shots. Moreover, we propose to train a reference detector for each base class in the conventional way, with which to guide the training of the detector generator. The reference detectors and the detector generator can be trained simultaneously. Finally, the generated detectors of different classes are encouraged to be orthogonal to each other for better generalization. The proposed approach is extensively evaluated on the ImageNet, VOC, and COCO data sets under various few-shot detection settings, and it achieves new state-of-the-art results.
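The record contains only the abstract, not the paper's equations, so the following is a minimal NumPy sketch of two mechanisms the abstract names: adaptive pooling that down-weights distracting shots when aggregating per-shot generated detectors, and an orthogonality penalty that discourages overlap between detectors of different classes. The function names, the cosine-similarity scoring, and the softmax weighting are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def aggregate_detectors(per_shot_weights):
    """Adaptive pooling over K per-shot detector weight vectors, shape (K, D).

    Hypothetical reading of the abstract's adaptive pooling module: each
    shot's generated detector is scored by cosine similarity to the mean
    detector, so outlier (distracting) shots receive lower softmax weight.
    """
    w = np.asarray(per_shot_weights, dtype=float)
    mean_det = w.mean(axis=0, keepdims=True)                      # (1, D)
    num = (w * mean_det).sum(axis=1)                              # (K,)
    den = np.linalg.norm(w, axis=1) * np.linalg.norm(mean_det) + 1e-12
    scores = num / den                                            # cosine sims
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                                          # softmax weights
    return (alpha[:, None] * w).sum(axis=0)                       # (D,)

def orthogonality_loss(class_detectors):
    """Penalize off-diagonal Gram entries of row-normalized detectors (C, D),
    encouraging the generated detectors of different classes to be orthogonal."""
    w = np.asarray(class_detectors, dtype=float)
    w = w / (np.linalg.norm(w, axis=1, keepdims=True) + 1e-12)
    gram = w @ w.T                                                # (C, C)
    off_diag = gram - np.diag(np.diag(gram))
    c = w.shape[0]
    return (off_diag ** 2).sum() / (c * (c - 1))
```

Identical per-shot detectors aggregate to themselves, and a set of mutually orthogonal detectors (e.g. the identity matrix's rows) incurs zero orthogonality loss.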
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2021.3053005