Robust Sewer Defect Detection With Text Analysis Based on Deep Learning

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, pp. 46224-46237
Main Authors: Oh, Chanmi; Dang, L. Minh; Han, Dongil; Moon, Hyeonjoon
Format: Article
Language: English
Online Access: Full text
Description
Summary: Sewerage systems play a vital role in building modern cities, providing appropriate ways to release liquid wastes. Due to the rapid expansion of cities, the deterioration of sewage pipes is increasing. Hence, systematic maintenance methods are required to overcome this problem. In most cases, sewer inspection is done by human inspectors, which is error-prone, time-consuming, costly, and lacks appropriate survey evaluations. In this paper, we introduce a new automated framework for detecting sewage pipe defects based on the attention mechanism, an improved YOLOv5 architecture, and location-information recognition from CCTV videos. The main contributions include (1) the addition of a micro-scale detection feature in the layers to improve the defect detection mechanism; (2) the application of a convolutional block attention module for better channel/spatial features; (3) the construction of a larger defect-detection dataset covering the 12 most common defect types; and (4) the implementation of the TPS-ResNet-BiLSTM-Attn (TRBA) model for text-information recognition from CCTV videos. The experimental results show that the proposed real-time sewer defect detection model achieved a mean average precision (mAP) of 75.9% on the proposed dataset, outperforming other standard models such as YOLO and SSD.
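The convolutional block attention module (CBAM) named in contribution (2) refines a backbone feature map by applying channel attention followed by spatial attention. The sketch below is a minimal, generic CBAM implementation in PyTorch for illustration only; the reduction ratio, kernel size, and the exact insertion points inside the improved YOLOv5 are assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    # Weights each channel using global average- and max-pooled descriptors
    # passed through a shared two-layer MLP (reduction ratio of 16 is assumed).
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        scale = torch.sigmoid(self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3))))
        return x * scale.view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    # Weights each spatial location using channel-wise average and max maps
    # fused by a single convolution (the 7x7 kernel is assumed).
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    # Channel attention followed by spatial attention (Woo et al., 2018).
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.spatial(self.channel(x))

# Example: refine a YOLO-style feature map of shape (batch, channels, H, W).
feats = torch.randn(1, 256, 40, 40)
print(CBAM(256)(feats).shape)  # torch.Size([1, 256, 40, 40])

The reported metric, mean average precision, averages the per-class average precision over the N = 12 defect classes, mAP = (1/N) * sum_i AP_i; the 75.9% figure is this average on the authors' dataset.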
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3168660