InversionNet3D: Efficient and Scalable Learning for 3-D Full-Waveform Inversion

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-16
Main Authors: Zeng, Qili; Feng, Shihang; Wohlberg, Brendt; Lin, Youzuo
Format: Article
Language: English
Subjects:
Description
Abstract: Seismic full-waveform inversion (FWI) techniques aim to find a high-resolution subsurface geophysical model from waveform data. Recent data-driven FWI efforts have shown encouraging results in obtaining 2-D velocity maps. However, due to high computational complexity and large memory consumption, reconstructing 3-D high-resolution velocity maps with deep networks remains a great challenge. In this article, we present InversionNet3D (InvNet3D), an efficient and scalable encoder-decoder network for 3-D FWI. The proposed method employs group convolution in the encoder to establish an effective hierarchy for learning information from multiple sources while cutting down unnecessary parameters and operations. The introduction of invertible layers further reduces the memory consumption of intermediate features during training and thus enables the development of deeper networks with more layers and higher capacity as required by different application scenarios. Experiments on the 3-D Kimberlina dataset demonstrate that InvNet3D achieves state-of-the-art reconstruction performance with lower computational cost and lower memory footprint than the baseline.
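
The two architectural ingredients named in the abstract, group convolution over multiple sources and invertible (memory-saving) layers, can be illustrated with a short, self-contained PyTorch sketch. The module names, channel counts, and layer choices below are assumptions made for illustration; this is not the authors' released implementation.

# Illustrative sketch only: a grouped 3-D convolution block that keeps channels
# from different seismic sources separate, paired with an additive invertible
# block whose inputs can be recomputed from its outputs, so intermediate
# activations need not be cached during training.
import torch
import torch.nn as nn


class GroupedEncoderBlock(nn.Module):
    """3-D conv block; groups=n_sources processes each source's channels separately."""

    def __init__(self, in_ch, out_ch, n_sources):
        super().__init__()
        self.conv = nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1,
                              groups=n_sources, bias=False)
        self.norm = nn.BatchNorm3d(out_ch)
        self.act = nn.LeakyReLU(0.1, inplace=True)

    def forward(self, x):
        return self.act(self.norm(self.conv(x)))


class InvertibleBlock(nn.Module):
    """Additive coupling: y1 = x1 + F(x2), y2 = x2 + G(y1).

    Because the inverse exists in closed form, x1 and x2 can be recomputed
    during the backward pass instead of being stored, which is what lowers
    the training-time memory footprint of invertible layers.
    """

    def __init__(self, ch, n_sources):
        super().__init__()
        self.F = GroupedEncoderBlock(ch, ch, n_sources)
        self.G = GroupedEncoderBlock(ch, ch, n_sources)

    def forward(self, x1, x2):
        y1 = x1 + self.F(x2)
        y2 = x2 + self.G(y1)
        return y1, y2

    def inverse(self, y1, y2):
        x2 = y2 - self.G(y1)
        x1 = y1 - self.F(x2)
        return x1, x2


if __name__ == "__main__":
    n_sources, ch = 4, 16                          # illustrative sizes
    block = InvertibleBlock(ch, n_sources).eval()  # eval() keeps BatchNorm deterministic
    x1 = torch.randn(1, ch, 8, 16, 16)             # (batch, channels, depth, height, width)
    x2 = torch.randn(1, ch, 8, 16, 16)
    with torch.no_grad():
        y1, y2 = block(x1, x2)
        r1, r2 = block.inverse(y1, y2)
    # Both checks should print True: the block's inputs are recovered from its outputs.
    print(torch.allclose(r1, x1, atol=1e-5), torch.allclose(r2, x2, atol=1e-5))
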
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2021.3135354