ROV-based binocular vision system for underwater structure crack detection and width measurement

Bibliographic details
Published in: Multimedia Tools and Applications, 2023-06, Vol. 82 (14), p. 20899-20923
Authors: Ma, Yunpeng; Wu, Yi; Li, Qingwu; Zhou, Yaqin; Yu, Dabing
Format: Article
Language: English
Abstract: Replacing human divers with underwater vehicles equipped with visual sensors is an efficient way to carry out underwater inspections. However, the inability of monocular vision to provide accurate depth information highlights the importance of binocular vision in underwater target detection and measurement. In this paper, an ROV (Remotely Operated Vehicle) based binocular vision system incorporating a specially designed underwater robot is developed to carry out underwater structure detection in real time. The system is designed for long-distance, long-duration missions in a variety of underwater environments. Taking cracks as the inspection target, a crack detection and measurement approach is proposed, applied after the robot's surface-cleaning function. First, an affine transformation model is used to enhance the color-distorted underwater images. Then, multi-directional gray-level fluctuation analysis is applied to obtain an accurate crack segmentation result. Finally, the computed disparity map is combined with the segmentation map to determine the crack width quickly. A group of experiments demonstrates the validity and effectiveness of the system and the crack measurement algorithm.
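(Illustrative code sketches of the enhancement, segmentation, and measurement steps follow the record below.)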
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-14168-1
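
Illustrative code sketches

The abstract's first step enhances color-distorted underwater images with an affine transformation model, but gives no parameter-fitting details. The sketch below uses one common generic choice: a per-channel linear map y = a*x + b that stretches a percentile range of each channel to the full intensity range. The function name, percentile defaults, and fitting rule are illustrative assumptions, not the authors' implementation.

import numpy as np

def affine_color_correct(img, low=1.0, high=99.0):
    # Per-channel affine map y = a*x + b stretching the [low, high]
    # percentile range of each color channel to [0, 255].
    # Generic stand-in for the paper's affine enhancement model.
    out = np.empty(img.shape, dtype=np.float32)
    for c in range(img.shape[2]):
        lo, hi = np.percentile(img[..., c], (low, high))
        a = 255.0 / max(hi - lo, 1e-6)  # gain
        b = -a * lo                     # offset
        out[..., c] = a * img[..., c].astype(np.float32) + b
    return np.clip(out, 0, 255).astype(np.uint8)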
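The second step, multi-directional gray-level fluctuation analysis, is sketched below under one plausible reading of the abstract: a crack pixel sits in a gray-level valley, so comparing each pixel with neighbors along several scan directions and keeping the strongest valley response highlights crack-like structures. The four-direction scheme, the `reach` parameter, and all names are assumptions; the paper's exact formulation may differ.

import numpy as np

def directional_valley_response(gray, reach=5):
    # For each of four scan directions (horizontal, vertical, two
    # diagonals), compare each pixel with the mean of its two
    # neighbors `reach` pixels away along that direction.  A crack
    # pixel is darker than its surroundings in at least one crossing
    # direction, so the maximum valley depth over all directions
    # highlights crack-like pixels.
    g = gray.astype(np.float32)
    response = np.zeros_like(g)
    for dy, dx in ((0, 1), (1, 0), (1, 1), (1, -1)):
        fwd = np.roll(g, (-reach * dy, -reach * dx), axis=(0, 1))
        bwd = np.roll(g, (reach * dy, reach * dx), axis=(0, 1))
        response = np.maximum(response, (fwd + bwd) / 2.0 - g)
    return response  # threshold this map to obtain a crack mask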
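Finally, the abstract combines the disparity map with the segmentation map to obtain crack width. Under standard rectified stereo geometry, depth is Z = f*B/d (f: focal length in pixels, B: baseline, d: disparity), and a span of w_px pixels at depth Z covers w_px*Z/f metres, so the focal length cancels and width = w_px*B/d. The row-wise measurement below is a minimal sketch assuming a rectified pair and a dense disparity map; it is not the authors' exact procedure.

import numpy as np

def crack_widths_metric(mask, disparity, baseline_m):
    # Row-wise metric crack width from a binary crack mask and a dense
    # disparity map, using width = w_px * B / d (focal length cancels).
    widths = []
    for row in range(mask.shape[0]):
        cols = np.flatnonzero(mask[row])
        if cols.size < 2:
            continue
        w_px = cols.max() - cols.min() + 1          # crack span in pixels
        d = float(np.median(disparity[row, cols]))  # robust disparity
        if d > 0:
            widths.append(w_px * baseline_m / d)    # width in metres
    return widths

Taking the median disparity over the crack pixels in each row keeps a single bad stereo match from corrupting that row's width estimate, which matters in turbid water where disparity maps are noisy.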