A novel real-time pixel-level road crack segmentation network
Published in: Journal of real-time image processing 2024-05, Vol. 21 (3), p. 76, Article 76
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Road crack detection plays a vital role in preserving the life of roads and ensuring driver safety. Traditional methods relying on manual observation are subjective and inefficient at quantifying damage. In recent years, advances in deep learning have held promise for automated crack detection, but challenges such as low contrast, small datasets, and inaccurate localization remain. In this paper, we propose a deep learning-based pixel-level road crack segmentation network that achieves excellent performance on multiple datasets. To enrich the receptive fields of conventional convolutional modules, we design a residual asymmetric convolutional module for feature extraction. In addition, we propose a multiple receptive field cascade module and a feature fusion module with non-local attention. Our network demonstrates superior accuracy and inference speed, achieving 55.60%, 59.01%, 75.65%, and 57.95% IoU on the CrackForest, CrackTree, CDD, and Crack500 datasets, respectively, while processing 143 images per second. Experimental results and analysis validate the effectiveness of our approach. This work contributes to the advancement of road crack detection, providing a valuable tool for road maintenance and safety improvement.
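The abstract names a residual asymmetric convolutional module but does not detail its design. As a rough, hypothetical illustration of the general asymmetric-convolution idea the name alludes to (not the paper's actual module), the sketch below shows in NumPy how a 3×1 and a 1×3 kernel applied in sequence cover the same 3×3 receptive field as one square kernel, using 6 weights instead of 9:

```python
import numpy as np

def conv2d(x, k):
    """Naive 'valid' 2-D cross-correlation (no padding, stride 1)."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))       # toy single-channel feature map

k_vert = rng.standard_normal((3, 1))  # 3x1 vertical kernel
k_horz = rng.standard_normal((1, 3))  # 1x3 horizontal kernel

# Applying 3x1 then 1x3 sweeps the same 3x3 neighborhood...
seq = conv2d(conv2d(x, k_vert), k_horz)
# ...as a single rank-1 3x3 kernel formed by their outer product.
full = conv2d(x, k_vert @ k_horz)
print(np.allclose(seq, full))  # True: same output, 6 weights vs 9
```

This factorization trades a small loss of expressiveness (the equivalent 3×3 kernel is rank-1) for fewer parameters; real asymmetric-convolution designs typically run the 3×1 and 1×3 branches in parallel alongside a square branch rather than purely in sequence.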
ISSN: 1861-8200, 1861-8219
DOI: 10.1007/s11554-024-01458-0