Deep learning assisted well log inversion for fracture identification


Bibliographic details
Published in: Geophysical Prospecting 2021-02, Vol. 69 (2), p. 419-433
Authors: Tian, Miao; Li, Bingtao; Xu, Huaimin; Yan, Dezhi; Gao, Yining; Lang, Xiaozheng
Format: Article
Language: English
Description
Abstract: Manual fracture identification methods based on cores and image-logging pseudo-pictures are limited by cost and the amount of available data. In this paper, we propose an integrated workflow, which treats fracture identification as an end-to-end task, combining boundary detection and deep-learning classification to recognize fractured zones with accurate locations and reasonable thickness. We first apply the discrete wavelet transform and a boundary detection method named changing point detection to enhance the fracture sensitivity of acoustic logs and to segment the whole logging interval into non-overlapping subsections by estimating boundaries. Deep-neural-network-based auto-encoders and a convolutional neural network classifier are then implemented to extract the hidden information from the logs and to categorize the subsections as fractured or non-fractured zones. To validate the feasibility of this workflow, we apply it to the logging data from a real well. Compared with the benchmarks provided by the support vector machine, random forest and AdaBoost models, the one-dimensional well profile predicted by the proposed changing point detection-deep learning classifier is more consistent with the manual identification result.
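The segmentation stage of the workflow can be illustrated with a minimal sketch. This is not the authors' implementation: the abstract does not specify the wavelet family or the changepoint algorithm used, so a one-level Haar transform and a simple threshold-on-first-difference detector stand in here purely to show the idea of splitting a log into non-overlapping subsections at estimated boundaries. The synthetic log and the threshold value are assumptions for illustration.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficients; the detail band
    tends to emphasize abrupt changes in a log response."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                       # pad to even length
        s = np.append(s, s[-1])
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

def change_points(signal, threshold):
    """Naive changing-point detector: flag indices where the absolute
    first difference exceeds `threshold`.  The flagged indices split
    the log into non-overlapping subsections."""
    d = np.abs(np.diff(np.asarray(signal, dtype=float)))
    return np.where(d > threshold)[0] + 1

# Synthetic "acoustic log": two constant zones with one step boundary.
log = np.concatenate([np.full(8, 100.0), np.full(8, 140.0)])
approx, detail = haar_dwt(log)
cps = change_points(log, threshold=10.0)
segments = np.split(log, cps)            # non-overlapping subsections
print(cps)                               # → [8]
print(len(segments))                     # → 2
```

On this toy log the single boundary at sample 8 is recovered, yielding two subsections that a downstream classifier (in the paper, an auto-encoder plus CNN) would label as fractured or non-fractured. A production version would use an established changepoint method rather than a fixed threshold.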
ISSN:0016-8025
1365-2478
DOI:10.1111/1365-2478.13054