Navigating the Depths: A Comprehensive Survey of Deep Learning for Passive Underwater Acoustic Target Recognition

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 154092-154118
Authors: Müller, Nils; Reermann, Jens; Meisen, Tobias
Format: Article
Language: English
Online access: Full text
Abstract: The field of deep learning is a rapidly developing research area with applications across many domains. Sonar (SOund Navigation And Ranging) processing has traditionally relied on statistical analysis; over the past ten to fifteen years, however, the rapid growth of deep learning has challenged classical approaches with modern deep learning-based methods. This survey provides a systematic overview of deep learning within the Underwater Acoustic Target Recognition (UATR) domain. The objective is to highlight popular design choices and to evaluate the commonalities and differences of the investigated techniques with respect to the selected architectures and pre-processing methods. Furthermore, the survey examines the state of the UATR literature by identifying prominent conferences and journals, pointing new researchers to the venues where UATR-related work is published. Additionally, popular datasets and available benchmarks are identified and analysed for the range of problem complexity they cover. This work targets researchers new to the field as well as experienced researchers who want a broader overview; experienced sonar engineers with a strong background in classical analysis will also benefit from this survey.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3480788