Multi-model ensemble with rich spatial information for object detection

Bibliographic Details
Published in: Pattern Recognition 2020-03, Vol. 99, p. 107098, Article 107098
Main Authors: Xu, Jie; Wang, Wei; Wang, Hanyuan; Guo, Jinhong
Format: Article
Language: English
Online Access: Full text
Description
Abstract:
• Ensemble learning improves the performance of object detection and matches the mAP of state-of-the-art detectors.
• The combination of context modeling and dilated convolution preserves real-time detection speed.
• The proposed multi-scale feature fusion module confers a clear improvement on the detector.
• The proposed ensemble modes demonstrate the effectiveness of ensemble learning in the field of object detection.

Due to the development of deep learning networks and the growing dimensionality of big data, research on ensemble deep learning is receiving increasing attention. This paper takes the object detection task as its research domain and proposes an object detection framework based on ensemble deep learning. To guarantee accuracy as well as real-time detection, the detector uses a Single Shot MultiBox Detector (SSD) as the backbone and combines ensemble learning with context modeling and multi-scale feature representation. Two ensemble modes were designed: NMS Ensembling and Feature Ensembling. In addition, to obtain contextual information, we used dilated convolution to expand the receptive field of the network. Compared with state-of-the-art detectors, our detector achieves superior performance on the PASCAL VOC and MS COCO datasets.
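The record does not include the paper's implementation, but a rough sketch can illustrate the "NMS Ensembling" mode named in the abstract: detections from several independently trained detectors are pooled into one list and deduplicated with a single non-maximum suppression pass. All function names, the IoU threshold, and the NumPy-based implementation below are illustrative assumptions, not code from the paper.

```python
# Minimal sketch of NMS Ensembling: pool detections from several models,
# then run one standard greedy NMS pass over the combined set.
# Boxes are [x1, y1, x2, y2]; names and threshold are assumptions.
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + area_b - inter + 1e-9)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop overlapping ones."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        order = rest[iou(boxes[i], boxes[rest]) < iou_thresh]
    return keep

def nms_ensemble(per_model_boxes, per_model_scores, iou_thresh=0.5):
    """Pool detections from all models, then apply a single NMS pass."""
    boxes = np.concatenate(per_model_boxes)
    scores = np.concatenate(per_model_scores)
    keep = nms(boxes, scores, iou_thresh)
    return boxes[keep], scores[keep]
```

Similarly, the dilated convolution used for context modeling can be sketched in a few lines, here assuming PyTorch (the paper's exact layer configuration is not given in this record): a 3x3 kernel with dilation 2 covers a 5x5 neighborhood with the same nine weights, enlarging the receptive field without downsampling.

```python
# Illustrative dilated convolution for context modeling; the channel
# count and feature-map size below are assumptions, loosely modeled on
# an SSD conv4_3-sized feature map.
import torch
import torch.nn as nn

# padding=dilation keeps the spatial size of the feature map unchanged.
context_conv = nn.Conv2d(512, 512, kernel_size=3, padding=2, dilation=2)

x = torch.randn(1, 512, 38, 38)
print(context_conv(x).shape)  # torch.Size([1, 512, 38, 38])
```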
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2019.107098