EMRNet: End-to-End Electrical Model Restoration Network
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-12
Main authors: , , , ,
Format: Article
Language: English
Keywords:
Abstract: The traditional way to improve resolution in electromagnetic inversion is to increase the number of iterations, an approach that suffers from poor nonlinear mapping and strong nonuniqueness. To meet this challenge, a new strategy is proposed that reconstructs the geoelectric model from traditional inversion results through a deep neural network (DNN). A DNN has the advantage of being able to establish an uncertain mapping between low-resolution images and high-resolution target images. To recover a high-precision geoelectric model, we propose an end-to-end electrical model restoration network (EMRNet) with novel components that make adequate use of the geoelectric model data from traditional inversion. Specifically, EMRNet adopts the encoder-decoder structure of U-Net, and a cross-scale feature attention module (CSFA Block) is incorporated into the decoding process to make full use of feature information at different scales. The superiority of EMRNet is validated on both synthetic and measured data. The geoelectric models predicted by EMRNet are more consistent with the target in terms of resistivity values, overall structure, and resolution. In addition, the geoelectric model predicted by EMRNet agrees well with the real geological background data, and the corresponding response data are closer to the measured data.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3193297
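
The abstract only outlines the architecture, so the following is a minimal, hypothetical PyTorch sketch of the design it describes: a U-Net-style encoder-decoder whose decoder fuses encoder features from several scales through an attention block before each decoding step. The class names (EMRNetSketch, CSFABlock), channel widths, and the internal design of the cross-scale attention are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a U-Net-style restoration network with a
# cross-scale feature attention (CSFA-like) block in the decoder.
# All layer choices below are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with BatchNorm and ReLU (standard U-Net block)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class CSFABlock(nn.Module):
    """Assumed cross-scale feature attention: resample encoder features from
    all scales to the current decoder resolution, reweight them with channel
    attention, and fuse them into one feature map."""

    def __init__(self, skip_channels, out_ch):
        super().__init__()
        total = sum(skip_channels)
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(total, total // 4, 1), nn.ReLU(inplace=True),
            nn.Conv2d(total // 4, total, 1), nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(total, out_ch, 1)

    def forward(self, skips, size):
        # Bring every encoder feature map to the target spatial size.
        feats = [F.interpolate(s, size=size, mode="bilinear", align_corners=False) for s in skips]
        x = torch.cat(feats, dim=1)
        x = x * self.attn(x)  # channel-wise attention weights
        return self.fuse(x)


class EMRNetSketch(nn.Module):
    """U-Net-like network mapping a low-resolution inversion result to a
    refined geoelectric (resistivity) model of the same grid size."""

    def __init__(self, in_ch=1, out_ch=1, widths=(32, 64, 128, 256)):
        super().__init__()
        self.encoders = nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.encoders.append(conv_block(prev, w))
            prev = w
        self.decoders = nn.ModuleList()
        self.csfa = nn.ModuleList()
        for w in reversed(widths[:-1]):
            self.csfa.append(CSFABlock(list(widths), w))
            self.decoders.append(conv_block(prev + w, w))
            prev = w
        self.head = nn.Conv2d(prev, out_ch, 1)

    def forward(self, x):
        skips = []
        for i, enc in enumerate(self.encoders):
            x = enc(x)
            skips.append(x)
            if i < len(self.encoders) - 1:
                x = F.max_pool2d(x, 2)
        for csfa, dec in zip(self.csfa, self.decoders):
            x = F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)
            fused = csfa(skips, x.shape[-2:])
            x = dec(torch.cat([x, fused], dim=1))
        return self.head(x)


if __name__ == "__main__":
    model = EMRNetSketch()
    lowres_inversion = torch.randn(1, 1, 64, 64)  # placeholder geoelectric model
    refined = model(lowres_inversion)
    print(refined.shape)  # torch.Size([1, 1, 64, 64])
```

Running the script prints torch.Size([1, 1, 64, 64]), i.e., the refined model keeps the grid size of the input inversion result; the actual input/output dimensions and the CSFA internals in the paper may differ.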