A Real-Time Detection Method for Concrete Surface Cracks Based on Improved YOLOv4
Published in: Symmetry (Basel), 2021-09, Vol. 13 (9), p. 1716
Main authors: , , ,
Format: Article
Language: English
Subject terms:
Online access: Full text
Abstract: Many structures in civil engineering are symmetrical. Crack detection is a critical task in the monitoring and inspection of civil engineering structures. This study implements a lightweight neural network based on the YOLOv4 algorithm to detect concrete surface cracks. The symmetry concept is adopted in the backbone feature extraction and in the design of the neck and head. The model modules are improved to reduce the depth and complexity of the overall network structure. Separable convolution is used to realize spatial convolution, and the SPP and PANet modules are improved to reduce the number of model parameters. The convolutional and batch-normalization layers are merged to improve inference speed. In addition, drawing on the focal loss function, the loss function of the object-detection network is improved to balance the proportion of crack and background samples. To comprehensively evaluate the performance of the improved method, 10,000 images (256 × 256 pixels) of cracks on concrete surfaces were collected to build a database. The improved YOLOv4 model achieves an mAP of 94.09% with 8.04 M parameters and 0.64 GMacs. The results show that the improved model achieves a satisfactory mAP while greatly reducing model size and computational cost, and therefore performs better for real-time detection of concrete surface cracks.
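The parameter savings that the abstract attributes to separable convolution can be illustrated with a simple count: a depthwise-separable layer replaces one dense k × k convolution with a per-channel spatial convolution plus a 1 × 1 pointwise convolution. This is a generic sketch; the channel widths below (128 in, 256 out) are illustrative and not taken from the paper.

```python
def conv_params(k, c_in, c_out):
    # Standard k x k convolution: one k x k x c_in filter per output channel.
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    # Depthwise k x k convolution (one spatial filter per input channel)
    # followed by a 1 x 1 pointwise convolution that mixes channels.
    return k * k * c_in + c_in * c_out

# Illustrative 3 x 3 layer with 128 input and 256 output channels:
standard = conv_params(3, 128, 256)             # 294,912 parameters
separable = separable_conv_params(3, 128, 256)  # 33,920 parameters (~8.7x fewer)
```

The reduction grows with kernel size and channel count, which is why this substitution is a common lightweighting step.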
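Merging a convolutional layer with its following batch-normalization layer is a standard inference-time optimization; the abstract does not show the authors' exact implementation, so the following is a minimal sketch of the usual per-channel folding, checked on a 1 × 1 convolution so the convolution reduces to a matrix-vector product.

```python
import numpy as np

def fuse_conv_bn(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold batch normalization into the preceding convolution.
    W: (out_ch, in_ch, k, k) conv weights; b, gamma, beta, mean, var: (out_ch,)."""
    scale = gamma / np.sqrt(var + eps)          # per-channel BN scale
    W_fused = W * scale[:, None, None, None]    # scale folded into the weights
    b_fused = scale * (b - mean) + beta         # shift folded into the bias
    return W_fused, b_fused

# Check equivalence on a 1x1 convolution applied to a single pixel,
# where the convolution is just a matrix-vector product.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3, 1, 1))
b = rng.standard_normal(4)
gamma = rng.standard_normal(4)
beta = rng.standard_normal(4)
mean = rng.standard_normal(4)
var = rng.random(4) + 0.5                       # keep variances positive
x = rng.standard_normal(3)                      # one pixel, 3 input channels

conv_out = W[:, :, 0, 0] @ x + b
bn_out = gamma * (conv_out - mean) / np.sqrt(var + 1e-5) + beta

W_f, b_f = fuse_conv_bn(W, b, gamma, beta, mean, var)
fused_out = W_f[:, :, 0, 0] @ x + b_f           # one fused layer, same result
```

Because the fused layer does one pass instead of two, this transformation improves inference speed without changing the network's outputs.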
ISSN: 2073-8994
DOI: 10.3390/sym13091716
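The abstract says the detection loss is improved "drawing on the focal loss function" to balance crack and background samples, but does not give the exact formulation. A minimal sketch of the standard binary focal loss follows; the alpha and gamma values are the commonly used defaults, not values reported by the authors.

```python
import math

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Binary focal loss for a single prediction.
    p: predicted probability of the crack (positive) class; y: label in {0, 1}.
    alpha and gamma here are illustrative defaults, not taken from the paper."""
    p_t = p if y == 1 else 1.0 - p              # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha  # class-balancing weight
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

# Easy, confidently classified background pixels are down-weighted by the
# (1 - p_t)^gamma factor, while hard examples keep a substantial loss.
easy_background = focal_loss(0.01, 0)   # near zero
hard_crack = focal_loss(0.10, 1)        # much larger
```

This down-weighting of easy negatives is what counteracts the heavy background/crack imbalance typical of surface-crack images.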