Intelligent Identification of Coal Crack in CT Images Based on Deep Learning

Bibliographic Details
Published in: Computational Intelligence and Neuroscience 2022-09, Vol. 2022, p. 1-10
Authors: Yu, Jinxia; Wu, Chengyi; Li, Yingying; Zhang, Yimin
Format: Article
Language: English
Online Access: Full text
Abstract: Automatic segmentation of coal cracks in CT images is of great significance for the establishment of digital cores. However, segmentation in this field remains challenging due to several properties of coal crack CT images: high noise, small targets, unbalanced positive and negative samples, and complex, diverse backgrounds. In this paper, a segmentation method for coal crack CT images is proposed and a dataset of coal crack CT images is established. Based on the deep-learning semantic segmentation model DeepLabV3+, the output stride (OS) of the backbone is modified to 8, and the dilation rates of the ASPP module are also adjusted. A new loss function is defined by combining CE loss and Dice loss. This deep-learning method avoids the problem of manually setting thresholds in traditional threshold segmentation and can extract cracks automatically and intelligently. Moreover, on the dataset of coal CT images, the proposed model achieves increases of 0.1%, 1.2%, 2.9%, and 0.5% in Acc, mAcc, MIoU, and FWIoU over other techniques, and increases of 0.1%, 0.8%, 2%, and 0.4% over the original DeepLabV3+. The obtained results indicate that the proposed segmentation method outperforms existing crack detection techniques and has practical application value in safety engineering.
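The abstract describes combining cross-entropy (CE) loss with Dice loss for the unbalanced crack/background classes. Below is a minimal NumPy sketch of such a combined loss for a binary segmentation mask; the equal 0.5/0.5 weighting, the smoothing term, and the function name are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def combined_ce_dice_loss(probs, target, ce_w=0.5, dice_w=0.5,
                          smooth=1.0, eps=1e-7):
    """Weighted sum of binary cross-entropy and Dice loss.

    probs:  (H, W) predicted foreground (crack) probabilities in [0, 1]
    target: (H, W) ground-truth binary mask (1 = crack, 0 = background)
    """
    p = np.clip(probs, eps, 1 - eps)  # avoid log(0)
    # Pixel-wise binary cross-entropy, averaged over the image
    ce = -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))
    # Soft Dice coefficient on the foreground class; the smoothing term
    # keeps the ratio well-defined when the crack class is tiny or absent
    inter = np.sum(p * target)
    dice = (2.0 * inter + smooth) / (np.sum(p) + np.sum(target) + smooth)
    return ce_w * ce + dice_w * (1.0 - dice)
```

The Dice term directly rewards overlap with the sparse crack pixels, which compensates for the class imbalance that plain CE loss handles poorly.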
ISSN: 1687-5265, 1687-5273
DOI: 10.1155/2022/7092436