Real-time high-resolution neural network with semantic guidance for crack segmentation
Published in: Automation in Construction 2023-12, Vol. 156, p. 105112, Article 105112
Main authors: , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: Deep learning plays an important role in crack segmentation, but most studies use off-the-shelf or improved models that were not specifically developed for this task. High-resolution convolutional neural networks, which are sensitive to object location and detail, help improve crack-segmentation performance, yet they conflict with real-time detection. This paper describes HrSegNet, a high-resolution network with semantic guidance designed specifically for crack segmentation, which guarantees real-time inference speed while preserving crack details. Evaluated on the composite dataset CrackSeg9k and the scenario-specific datasets Asphalt3k and Concrete3k, HrSegNet obtains state-of-the-art segmentation performance with efficiency that far exceeds that of the compared models. The approach demonstrates that the trade-off between high-resolution modeling and real-time detection can be balanced, which fosters the use of edge devices to analyze cracks in real-world applications.
Highlights:
• A real-time high-resolution neural network designed specifically for crack segmentation.
• A flexible high-resolution CNN design that remains efficient.
• A methodology for fusing semantic and detailed features within a CNN architecture (see the sketch below).
• A detailed comparative analysis against the SOTA on three datasets.
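The record itself does not include architecture details, so here is a minimal, hedged sketch of the semantic-guidance idea the highlights describe: a high-resolution detail branch fused with an upsampled low-resolution semantic branch. The class names, channel widths, and two-branch layout below are illustrative assumptions, not the paper's actual HrSegNet implementation.

```python
# Hedged sketch of semantic guidance for segmentation: a high-resolution
# detail branch is modulated by features from a downsampled semantic branch.
# All module names and channel sizes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticGuidanceBlock(nn.Module):
    """Fuse low-resolution semantic features into a high-resolution branch."""
    def __init__(self, high_ch: int, low_ch: int):
        super().__init__()
        # 1x1 conv aligns semantic channels with the detail branch
        self.align = nn.Conv2d(low_ch, high_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(high_ch)

    def forward(self, high: torch.Tensor, low: torch.Tensor) -> torch.Tensor:
        # Upsample semantic features to the high-resolution spatial size,
        # then add them as guidance onto the detail features.
        guide = F.interpolate(self.align(low), size=high.shape[2:],
                              mode="bilinear", align_corners=False)
        return F.relu(high + self.bn(guide))

class TinyHrSegSketch(nn.Module):
    """Two-branch sketch: one branch keeps full resolution (detail), the
    other works at 1/8 resolution (semantics); a 1x1 head yields a
    per-pixel crack / no-crack logit map."""
    def __init__(self, in_ch: int = 3, width: int = 32, n_classes: int = 2):
        super().__init__()
        self.detail = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1, bias=False),
            nn.BatchNorm2d(width), nn.ReLU(inplace=True))
        self.semantic = nn.Sequential(
            nn.Conv2d(in_ch, width * 2, 3, stride=8, padding=1, bias=False),
            nn.BatchNorm2d(width * 2), nn.ReLU(inplace=True))
        self.fuse = SemanticGuidanceBlock(width, width * 2)
        self.head = nn.Conv2d(width, n_classes, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.fuse(self.detail(x), self.semantic(x)))

# Quick shape check on a dummy 512x512 image.
if __name__ == "__main__":
    logits = TinyHrSegSketch()(torch.randn(1, 3, 512, 512))
    print(logits.shape)  # torch.Size([1, 2, 512, 512])
```

Keeping one branch at full resolution is what preserves thin crack boundaries, while the heavily strided semantic branch stays cheap enough for real-time inference; the 1x1 alignment plus bilinear upsampling is one common way to fuse the two.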
ISSN: 0926-5805, 1872-7891
DOI: 10.1016/j.autcon.2023.105112