An Improved Underwater Visual SLAM through Image Enhancement and Sonar Fusion

Detailed Description

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), 2024-07, Vol. 16 (14), p. 2512
Authors: Qiu, Haiyang; Tang, Yijie; Wang, Hui; Wang, Lei; Xiang, Dan; Xiao, Mingming
Format: Article
Language: English
Description
Abstract: To improve the performance of visual SLAM in underwater environments, this paper presents an enhanced front-end method based on visual feature enhancement. The method comprises three modules that improve the matching capability of visual features from different perspectives. First, to address insufficient underwater illumination and the uneven distribution of artificial light sources, a brightness-consistency recovery method is proposed that employs an adaptive histogram equalization algorithm to balance image brightness. Second, a denoising method for underwater suspended particulates is introduced to filter noise from the images. After this image-level processing, a combined underwater acousto–optic feature-association method associates acoustic features from sonar with visual features, thereby providing distance information for the visual features. Finally, the improved system incorporating the proposed enhancement methods is evaluated on the AFRL dataset against the OKVIS framework; it achieves better trajectory-estimation accuracy than OKVIS and demonstrates robustness in underwater environments.
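The abstract names adaptive histogram equalization as the brightness-balancing step but gives no further detail. As a rough illustration of the idea, the following is a minimal tile-wise equalization sketch in NumPy; it omits the clip limit and bilinear tile interpolation that full CLAHE-style methods (and possibly the paper's method) use, and all function names here are illustrative, not taken from the paper.

```python
import numpy as np

def equalize_tile(tile, n_bins=256):
    """Classic histogram equalization of one uint8 tile."""
    hist, _ = np.histogram(tile, bins=n_bins, range=(0, n_bins))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Remap intensities so the tile's CDF becomes approximately uniform.
    scale = max(int(cdf[-1]) - int(cdf_min), 1)
    lut = np.clip(np.round((cdf - cdf_min) / scale * (n_bins - 1)), 0, 255)
    return lut.astype(np.uint8)[tile]

def adaptive_equalize(img, tiles=(4, 4)):
    """Tile-wise adaptive equalization of a grayscale uint8 image
    (simplified: no clip limit, no interpolation across tile borders)."""
    out = np.empty_like(img)
    h, w = img.shape
    th, tw = h // tiles[0], w // tiles[1]
    for i in range(tiles[0]):
        for j in range(tiles[1]):
            # Last row/column of tiles absorbs any remainder pixels.
            r = slice(i * th, (i + 1) * th if i < tiles[0] - 1 else h)
            c = slice(j * tw, (j + 1) * tw if j < tiles[1] - 1 else w)
            out[r, c] = equalize_tile(img[r, c])
    return out
```

Equalizing per tile rather than globally lets dark regions (e.g. outside an artificial light cone) be stretched independently of brightly lit regions, which is the motivation for using an adaptive variant underwater.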
ISSN: 2072-4292
DOI: 10.3390/rs16142512