Recognition of asphalt road hazards based on high-density gray point clouds

Bibliographic Details
Published in: Engineering Research Express 2022-09, Vol. 4 (3), p. 35048
Authors: Tang, Chao; Xia, Mengxuan; Fan, Tingli; Wang, Li; Yu, Haibin; Xu, Yiqun; Hou, Haiqian; Wang, Xiaojing
Format: Article
Language: English
Online access: Full text
Description
Abstract: Road hazards can lead to dangerous accidents and endanger the safety of pedestrians, so frequent and thorough road inspection is required to maintain road safety. This paper proposes an improved U-Net model that combines gray-scale images and depth images, and uses a data statistics method based on the road depth map to eliminate hazard-free data automatically and reduce the computational complexity of hazard detection. Experiments showed that the improved U-Net-based pavement hazard recognition and extraction algorithm could smoothly and efficiently extract pavement cracks and deformation hazards in complex scenes with noise interference, with strongly robust results. Comprehensive indicators, namely global recognition accuracy A (Accuracy), precision P (Precision), recall R (Recall), F1 (F-Measure), and Mean Intersection over Union (MIoU), were used to assess the proposed model against existing hazard detection models; the proposed model substantially outperformed them on all indicators. The proposed model can provide a significant reference for subsequent pavement repair work and can be used to improve road safety.
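
The abstract only states that hazard-free data are eliminated by statistics computed on the road depth map; the exact statistic is not given there. As a rough illustration of the idea, the following Python/NumPy sketch shows one plausible form: fit a plane to the local road surface and skip patches whose depth residuals stay small. The patch-wise framing, the plane fit, and the threshold are assumptions made here for illustration, not the paper's method.

import numpy as np

def is_hazard_free(depth_patch, deviation_threshold=2.0):
    # Hypothetical pre-filter, not the paper's exact statistic.
    # Fit a plane z = a*x + b*y + c to the patch's depth values,
    # approximating the intact road surface.
    h, w = depth_patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, depth_patch.ravel(), rcond=None)
    # If no pixel deviates noticeably from the fitted surface, the patch
    # is unlikely to contain a crack or deformation and can be skipped.
    residuals = depth_patch.ravel() - A @ coeffs
    return np.abs(residuals).max() < deviation_threshold  # threshold in depth-map units

Patches flagged as hazard-free would then be excluded before segmentation, which is how such a pre-filter reduces the computational load of hazard detection.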
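
The evaluation indicators named in the abstract (Accuracy, Precision, Recall, F1, MIoU) follow their standard pixel-wise definitions. A minimal sketch of how they can be computed for binary (hazard vs. background) segmentation masks is given below; the function and variable names are chosen here for illustration and are not taken from the paper.

import numpy as np

def segmentation_metrics(pred, target):
    # pred, target: boolean arrays of identical shape, True = hazard pixel.
    tp = np.logical_and(pred, target).sum()
    tn = np.logical_and(~pred, ~target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    # Mean Intersection over Union, averaged over the hazard and background classes.
    miou = (tp / (tp + fp + fn) + tn / (tn + fp + fn)) / 2
    return accuracy, precision, recall, f1, miou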
ISSN: 2631-8695
DOI: 10.1088/2631-8695/ac8cce