Semi‐supervised deep autoencoder for seismic facies classification

Bibliographic details
Published in: Geophysical Prospecting, 2021-07, Vol. 69 (6), pp. 1295-1315
Main authors: Liu, Xingye, Li, Bin, Li, Jingye, Chen, Xiaohong, Li, Qingchun, Chen, Yangkang
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Facies boundaries are critical for flow performance in a reservoir and are significant for lithofacies identification in well interpretation and reservoir prediction. Facies identification based on supervised machine-learning methods usually requires a large amount of labelled data, which are sometimes difficult to obtain. Here, we introduce the deep autoencoder to learn hidden features and conduct facies classification from elastic attributes. Both labelled and unlabelled data are involved in the training process. We then develop a semi-supervised deep autoencoder by taking the intra-class means and the mean of the whole facies population into account to construct a classification regularization term, thereby improving the classification accuracy and reducing the uncertainty. The new method inherits the merits of the deep autoencoder and absorbs the information provided by the labelled data. The proposed method performs well and produces promising results when used to address problems of reservoir prediction and facies identification. It is evaluated on both well and seismic data and compared with the conventional deep autoencoder method, demonstrating its feasibility and superiority with respect to classification accuracy.
ISSN: 0016-8025; 1365-2478 (online)
DOI: 10.1111/1365-2478.13106
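
As a rough illustration of the approach the abstract describes, the sketch below combines an autoencoder reconstruction loss over labelled and unlabelled samples with a regularization term built from the intra-class means and the mean of the whole labelled population. This is a minimal PyTorch reading of the abstract, not the authors' implementation: the network sizes, the scatter-ratio form of the regularizer, and all names (Autoencoder, class_mean_regularizer, semi_supervised_loss, lam) are assumptions made for illustration only.

import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Hypothetical autoencoder over elastic attributes (e.g. Vp, Vs, density)."""
    def __init__(self, n_attr=3, n_hidden=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_attr, 16), nn.ReLU(),
                                     nn.Linear(16, n_hidden))
        self.decoder = nn.Sequential(nn.Linear(n_hidden, 16), nn.ReLU(),
                                     nn.Linear(16, n_attr))

    def forward(self, x):
        z = self.encoder(x)          # hidden features
        return z, self.decoder(z)    # features and reconstruction

def class_mean_regularizer(z_lab, labels):
    """One plausible class-mean term: pull hidden features of each facies
    towards that facies' mean, relative to how far the facies means sit
    from the mean of the whole labelled population."""
    pop_mean = z_lab.mean(dim=0)
    intra, inter = 0.0, 0.0
    for c in labels.unique():
        zc = z_lab[labels == c]
        mu_c = zc.mean(dim=0)
        intra = intra + ((zc - mu_c) ** 2).sum()                   # within-class scatter
        inter = inter + len(zc) * ((mu_c - pop_mean) ** 2).sum()   # between-class scatter
    return intra / (inter + 1e-8)

def semi_supervised_loss(model, x_unlab, x_lab, labels, lam=0.1):
    """Reconstruction on labelled + unlabelled data; the classification
    regularizer uses the labelled subset only (lam weights the two terms)."""
    mse = nn.MSELoss()
    x_all = torch.cat([x_unlab, x_lab], dim=0)
    _, x_rec = model(x_all)
    z_lab, _ = model(x_lab)
    return mse(x_rec, x_all) + lam * class_mean_regularizer(z_lab, labels)

In this reading, the unlabelled samples only shape the reconstruction term, while the labelled samples additionally tighten each facies cluster in the hidden space, which is one way a class-mean regularizer could reduce classification uncertainty as the abstract claims.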