Deep Learning-Based Lane Marking Detection using A²-LMDet
| Published in: | Transportation Research Record, 2020-11, Vol. 2674 (11), pp. 625-635 |
|---|---|
| Main authors: | , , , , , , |
| Format: | Article |
| Language: | English |
| Online access: | Full text |
Abstract: Automated lane marking detection is essential for advanced driver assistance systems (ADAS) and pavement management work. However, prior research has mostly detected lane marking segments in front-view images, which are prone to occlusion and noise disturbance. In this paper, we aim at accurate and robust lane marking detection from a top-view perspective and propose a deep learning-based detector with an adaptive anchor scheme, referred to as A²-LMDet. On the one hand, it is an end-to-end framework that fuses feature extraction and object detection into a single deep convolutional neural network. On the other hand, the adaptive anchor scheme is designed by formulating a bilinear interpolation algorithm and is used to guide specific anchor box generation and informative feature extraction. To validate the proposed method, a newly built lane marking dataset containing 24,000 high-resolution laser images is developed for a case study. Quantitative and qualitative results demonstrate that A²-LMDet achieves highly accurate performance with 0.9927 precision, 0.9612 recall, and a 0.9767 F1 score, outperforming other advanced methods by a considerable margin. Moreover, ablation analysis illustrates the effectiveness of the adaptive anchor scheme in enhancing feature representation and improving performance. We expect our work will help the development of related research.
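As a sanity check on the reported metrics, and to illustrate the kind of bilinear interpolation the adaptive anchor scheme is built on, the sketch below is offered under stated assumptions: the abstract does not give the paper's exact formulation, so `f1_score` and `bilinear_sample` are hypothetical helper names, and the sampling shown is the textbook bilinear rule rather than the authors' implementation.

```python
# Minimal sketch (assumptions noted above): verify that the reported
# precision/recall pair yields the reported F1 score, and show the
# standard bilinear interpolation rule that adaptive anchor schemes
# typically use to sample features at fractional anchor coordinates.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def bilinear_sample(feature_map, x: float, y: float) -> float:
    """Sample a 2-D grid at fractional (x, y) by weighting the
    four surrounding grid values by their area overlap."""
    x0, y0 = int(x), int(y)      # top-left integer corner
    x1, y1 = x0 + 1, y0 + 1      # bottom-right integer corner
    dx, dy = x - x0, y - y0      # fractional offsets in [0, 1)
    return ((1 - dx) * (1 - dy) * feature_map[y0][x0]
            + dx * (1 - dy) * feature_map[y0][x1]
            + (1 - dx) * dy * feature_map[y1][x0]
            + dx * dy * feature_map[y1][x1])

if __name__ == "__main__":
    # Reported: precision 0.9927, recall 0.9612 -> F1 = 0.9767 (matches).
    print(round(f1_score(0.9927, 0.9612), 4))   # 0.9767

    # Toy 2x2 feature map; sampling at the exact center averages all four.
    fmap = [[0.0, 1.0],
            [2.0, 3.0]]
    print(bilinear_sample(fmap, 0.5, 0.5))      # 1.5
```

The F1 check confirms the abstract's numbers are internally consistent (2 x 0.9927 x 0.9612 / (0.9927 + 0.9612) ≈ 0.9767); everything beyond that check is an illustrative approximation of the described scheme, not the published code.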
ISSN: 0361-1981, 2169-4052
DOI: 10.1177/0361198120948508