Few-Shot Object Detection Based on Self-Knowledge Distillation
Published in: | IEEE Intelligent Systems 2024, p.1-8 |
---|---|
Main authors: | , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | In many fields, the lack of large-scale training data prevents traditional object detection methods from performing well in practice, mainly because of overfitting and poor generalization. In this work, we propose a general method to alleviate the overfitting problem in few-shot object detection. Our work extends Faster R-CNN with a self-knowledge distillation algorithm and designs a loss function with an attention mechanism, which improves true detections in the foreground. In this way, the object detector can learn an approximate mapping from few samples, giving the network stronger generalization when handling few images. Through extensive comparative experiments, we demonstrate that our method is general and effective on the VOC and COCO benchmark datasets under different settings. We provide a new approach to few-shot object detection and achieve excellent recall in this setting. |
---|---|
ISSN: | 1541-1672, 1941-1294 |
DOI: | 10.1109/MIS.2022.3205686 |
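The abstract describes a self-knowledge distillation objective combined with an attention-weighted loss that emphasizes foreground regions. The record does not reproduce the paper's exact formulation, but the general idea can be sketched as a temperature-scaled KL divergence between a teacher's and a student's class distributions, weighted per region by a foreground attention score. The function names, the numpy implementation, and the choice of KL(teacher || student) below are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled, numerically stable softmax over the last axis.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distillation_loss(student_logits, teacher_logits, attention,
                           temperature=2.0):
    """Illustrative self-distillation loss (an assumption, not the paper's).

    student_logits, teacher_logits: (num_regions, num_classes) arrays.
    attention: (num_regions,) foreground attention scores; regions likely
    to contain objects get larger weight in the distillation term.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # Per-region KL(teacher || student), the usual distillation divergence.
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # Normalize attention so the weights form a distribution over regions.
    w = attention / (attention.sum() + 1e-12)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return float(np.sum(w * kl) * temperature ** 2)
```

When the student already matches the teacher the loss is zero, and it grows as the two distributions diverge on the regions the attention map marks as foreground, which matches the abstract's stated goal of improving true detections in the foreground.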