Research on Obstacle Detection and Avoidance of Autonomous Underwater Vehicle Based on Forward-Looking Sonar

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2023-11, Vol. 34 (11), pp. 9198-9208
Authors: Cao, Xiang; Ren, Lu; Sun, Changyin
Format: Article
Language: English
Description
Abstract: Due to the complexity of the ocean environment, an autonomous underwater vehicle (AUV) is disturbed by obstacles when performing tasks. Research on underwater obstacle detection and avoidance is therefore particularly important. Based on the images collected by a forward-looking sonar on an AUV, this article proposes an obstacle detection and avoidance algorithm. First, a deep learning-based obstacle candidate area detection algorithm is developed. This algorithm uses the You Only Look Once (YOLO) v3 network to determine obstacle candidate areas in a sonar image. Then, within the determined candidate areas, an obstacle detection algorithm based on an improved threshold segmentation algorithm is used to detect obstacles accurately. Finally, using the obstacle detection results obtained from the sonar images, an obstacle avoidance algorithm based on deep reinforcement learning (DRL) is developed to plan a reasonable obstacle avoidance path for the AUV. Experimental results show that the proposed algorithms improve obstacle detection accuracy and the processing speed of sonar images, while ensuring AUV navigation safety in a complex obstacle environment.
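The two-stage detection pipeline the abstract describes (a deep network proposes candidate boxes; a threshold segmentation step then localizes obstacles inside each box) can be sketched roughly as below. This is an illustrative reconstruction, not the authors' implementation: the function names are hypothetical, and plain Otsu thresholding stands in for their improved segmentation algorithm.

```python
import numpy as np

def otsu_threshold(region):
    """Pick the threshold maximizing between-class variance (Otsu's method)
    for an 8-bit grayscale region."""
    hist, _ = np.histogram(region, bins=256, range=(0, 256))
    total = region.size
    sum_all = float(np.dot(np.arange(256), hist))
    sum_b, weight_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        weight_b += hist[t]              # background pixel count
        if weight_b == 0:
            continue
        weight_f = total - weight_b      # foreground pixel count
        if weight_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / weight_b
        mean_f = (sum_all - sum_b) / weight_f
        var_between = weight_b * weight_f * (mean_b - mean_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def segment_candidates(sonar_image, candidate_boxes):
    """For each (x, y, w, h) box proposed by the detector, threshold the
    region of interest and return a binary obstacle mask per box."""
    masks = []
    for (x, y, w, h) in candidate_boxes:
        roi = sonar_image[y:y + h, x:x + w]
        t = otsu_threshold(roi)
        masks.append((roi > t).astype(np.uint8))
    return masks
```

In practice the candidate boxes would come from the YOLOv3 detector run on the sonar frame; restricting segmentation to those boxes is what lets the method avoid thresholding the full, noisy sonar image.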
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3156907