A Novel One-Shot Object Detection via Multifeature Auxiliary Information



Bibliographic Details
Published in: Security and Communication Networks, 2022-02, Vol. 2022, pp. 1-9
Main authors: Song, Yu; Li, Min; Du, Weidong; Gou, Yao; Wu, Zhaoqing; He, Yujie
Format: Article
Language: English
Online access: Full text
Abstract: Because it requires only a limited number of samples, few-shot learning has developed rapidly in recent years. It is mostly applied to object classification or detection with a small number of samples, typically fewer than ten. However, there is little research on few-shot detection, and especially on one-shot detection. In this paper, a multifeature information-assisted one-shot detection method is proposed to improve the accuracy of one-shot object detection. Specifically, two auxiliary modules are added to the detection algorithm: the Semantic Feature Module (SFM) and the Detail Feature Module (DFM), which extract semantic feature information and detailed feature information, respectively, from samples in the support set. These two kinds of information are then combined with the feature map extracted from the query image to obtain the corresponding auxiliary information used to complete one-shot detection. Because the two auxiliary modules retain more semantic and detailed information from the support-set samples, the proposed method makes better use of sample feature information and improves object detection accuracy by 2.97% over the benchmark method.
ISSN: 1939-0114, 1939-0122
DOI: 10.1155/2022/6805526
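
The abstract describes the architecture only at a high level: an SFM that condenses support-set samples into semantic information, a DFM that preserves their detail, and a fusion step with the query feature map. The following is a minimal PyTorch sketch of such a pipeline under stated assumptions; the layer choices (global pooling for the SFM, a 3x3 convolution for the DFM) and the fusion operators (channel reweighting plus depthwise cross-correlation) are illustrative guesses, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SemanticFeatureModule(nn.Module):
    """Hypothetical SFM: condenses a support feature map into a global
    semantic descriptor (assumed design, not taken from the paper)."""
    def __init__(self, channels: int):
        super().__init__()
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, support_feat: torch.Tensor) -> torch.Tensor:
        pooled = F.adaptive_avg_pool2d(support_feat, 1)  # (N, C, 1, 1)
        return self.proj(pooled)


class DetailFeatureModule(nn.Module):
    """Hypothetical DFM: preserves local detail of the support sample
    with a lightweight 3x3 convolution (again, an assumed layer choice)."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, support_feat: torch.Tensor) -> torch.Tensor:
        return self.conv(support_feat)  # (N, C, H, W)


def fuse_auxiliary_info(query_feat, support_feat, sfm, dfm):
    """Combine both kinds of support information with the query feature map.
    Semantic branch: channel-wise reweighting of the query features.
    Detail branch: depthwise cross-correlation against the support detail map.
    Both fusion operators are assumptions for illustration."""
    semantic = sfm(support_feat)                       # (1, C, 1, 1)
    weighted = query_feat * torch.sigmoid(semantic)    # channel attention
    detail = dfm(support_feat)                         # (1, C, h, w)
    kernel = detail.permute(1, 0, 2, 3)                # (C, 1, h, w)
    # Depthwise correlation: each channel of the query map is correlated
    # with the matching channel of the support detail map.
    return F.conv2d(weighted, kernel,
                    groups=query_feat.shape[1], padding="same")


if __name__ == "__main__":
    C = 256
    sfm, dfm = SemanticFeatureModule(C), DetailFeatureModule(C)
    query = torch.randn(1, C, 32, 32)   # feature map of the query image
    support = torch.randn(1, C, 7, 7)   # feature map of the one support sample
    out = fuse_auxiliary_info(query, support, sfm, dfm)
    print(out.shape)  # torch.Size([1, 256, 32, 32])
```

The depthwise correlation step mirrors how Siamese-style one-shot detectors and trackers commonly compare a support template against a query feature map; whether this paper uses correlation or a different fusion operator is not stated in the record above.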