Robust object detection under harsh autonomous‐driving environments



Bibliographic Details
Published in: IET Image Processing, 2022-03, Vol. 16 (4), p. 958-971
Main authors: Kim, Youngjun; Hwang, Hyekyoung; Shin, Jitae
Format: Article
Language: English
Online access: Full text
Description
Abstract: In the autonomous driving environment, object instances in an image can be affected by various factors such as the camera, driving state, weather, and system components. However, deep learning‐based vision systems are vulnerable to such noise‐containing perturbations, so robust object detection under harsh autonomous‐driving environments is more difficult than in the generic situation. In this paper, it is found that not only the accuracy but also the speed of a non‐maximum suppression‐based detector can be degraded under harsh environments. Therefore, object detection under harsh conditions is handled with adversarial mechanisms, namely adversarial training and adversarial defence. Adversarial defence modules are designed to improve robustness at the feature‐extraction level, and perturbations under a harsh environment are defined for training object detectors to improve the robustness of the model's decision boundary. The proposed adversarial defence and training mechanisms improve the object detector in both accuracy and speed. The proposed method achieves 43.7% mean average precision on the COCO2015 dataset for generic object detection and 39.0% mean average precision on the BDD100K dataset in a driving environment. Furthermore, it achieves real‐time capability at 23 frames per second.
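The abstract names two adversarial mechanisms, adversarial training and an adversarial defence module, without detailing either. As a rough illustration of the general adversarial‐training idea only (not the paper's specific method), the sketch below perturbs detector inputs with an FGSM-style step and then trains on the perturbed batch; `detector`, `detection_loss`, and the epsilon budget are hypothetical stand-ins.

```python
# Minimal FGSM-style adversarial-training sketch for an object detector
# (PyTorch). `detector` and `detection_loss` are hypothetical stand-ins;
# the paper's actual defence modules and harsh-environment perturbation
# model are not reproduced here.
import torch

def adversarial_training_step(detector, detection_loss, optimizer,
                              images, targets, epsilon=2.0 / 255):
    """One training step on adversarially perturbed inputs."""
    images = images.clone().detach().requires_grad_(True)

    # Forward/backward on clean images to obtain the input gradient.
    loss_clean = detection_loss(detector(images), targets)
    grad = torch.autograd.grad(loss_clean, images)[0]

    # FGSM step: move each pixel in the direction that increases the
    # loss, bounded by an L-infinity budget of `epsilon`.
    adv_images = (images + epsilon * grad.sign()).clamp(0.0, 1.0).detach()

    # Train on the perturbed batch so the model's decision boundary
    # becomes robust to small input perturbations.
    optimizer.zero_grad()
    loss_adv = detection_loss(detector(adv_images), targets)
    loss_adv.backward()
    optimizer.step()
    return loss_adv.item()
```

One plausible reading of the speed observation in the abstract: noisy inputs tend to produce more spurious candidate boxes, which increases the work done by greedy non‐maximum suppression, so a detector that is robust to such perturbations can be faster as well as more accurate.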
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12159