An Automatic Image Processing Algorithm Based on Crack Pixel Density for Pavement Crack Detection and Classification
Published in: International Journal of Pavement Research & Technology, 2022, Vol. 15(1), pp. 159–172
Main authors: , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: There is a pressing need for fully automated, efficient distress assessment systems that evaluate pavement conditions at minimal cost. Because of their complex training processes, most current supervised learning-based approaches in this area are unsuitable for smaller, local-level projects with limited resources. This paper develops an automatic crack assessment method to detect and classify cracks in 2-D and 3-D pavement images. A tile-based image processing method is proposed that applies a localized thresholding technique to each tile and detects cracked tiles (tiles containing cracks) based on the spatial distribution of crack pixels. For longitudinal and transverse cracking, a curve is then fitted to the cracked tiles to connect them. Cracks are then classified, and their lengths measured, from the orientation axes and lengths of the crack curves. The method is not limited to a particular pavement texture type, and it is cost-efficient, taking less than 20 s per image on a commodity computer. Tested on 130 images of Portland Cement Concrete (PCC) and Asphalt Concrete (AC) surfaces, the method produced promising results (Precision = 0.89, Recall = 0.83, F1 score = 0.86, crack length measurement accuracy = 80%).
ISSN: 1996-6814; 1997-1400
DOI: 10.1007/s42947-021-00006-4
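
The abstract outlines three stages: per-tile localized thresholding, cracked-tile detection from the density and distribution of crack pixels, and curve-based classification by orientation. The Python sketch below illustrates the general idea only; it is not the authors' implementation. The tile size, density cutoff, 45° orientation split, Otsu as the stand-in localized threshold, and the input path `pavement.jpg` are all illustrative assumptions.

```python
import cv2
import numpy as np

def detect_cracked_tiles(gray, tile=64, density_cutoff=0.02):
    """Slide a non-overlapping tile grid over the image, threshold each
    tile locally, and keep tiles whose crack-pixel density exceeds the
    cutoff. Otsu stands in for the paper's unspecified localized
    thresholding; tile size and cutoff are illustrative guesses."""
    h, w = gray.shape
    centers = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            patch = gray[y:y + tile, x:x + tile]
            # Per-tile Otsu threshold; cracks are darker than pavement,
            # so invert the binarization to mark crack pixels as 255.
            _, binary = cv2.threshold(patch, 0, 255,
                                      cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            if binary.mean() / 255.0 > density_cutoff:
                centers.append((x + tile // 2, y + tile // 2))
    return np.asarray(centers, dtype=float)

def classify_crack(centers):
    """Fit a principal axis through the cracked-tile centers and classify
    by its orientation, assuming the direction of travel runs along the
    image x-axis. The 45-degree split is an assumption."""
    if len(centers) < 2:
        return "no crack", 0.0
    pts = centers - centers.mean(axis=0)
    _, _, vt = np.linalg.svd(pts, full_matrices=False)  # principal direction
    dx, dy = vt[0]
    angle = abs(np.degrees(np.arctan2(dy, dx)))
    angle = min(angle, 180.0 - angle)  # fold into [0, 90] degrees
    # Rough length estimate: extent of the centers along the principal axis.
    proj = pts @ vt[0]
    label = "longitudinal" if angle < 45.0 else "transverse"
    return label, float(proj.max() - proj.min())

if __name__ == "__main__":
    # "pavement.jpg" is a hypothetical input path.
    gray = cv2.imread("pavement.jpg", cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise SystemExit("could not read input image")
    label, length_px = classify_crack(detect_cracked_tiles(gray))
    print(f"{label}, approx. length {length_px:.0f} px")
```

Note that the paper connects cracked tiles with a fitted curve and measures length along that curve; the straight principal axis used here is a simplification of that step.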