ResD-Unet Research and Application for Pulmonary Artery Segmentation


Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 67504-67511
Main authors: Yuan, Hongfang; Liu, Zhenhong; Shao, Yajun; Liu, Min
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: In the three-dimensional reconstruction of the pulmonary artery and the identification of pulmonary embolism, experts find it difficult to accurately estimate the severity of the embolism in the pulmonary artery because of its irregular shape and complex adjacent tissues. Accurate segmentation of the pulmonary artery is therefore the basis for assessing the severity of pulmonary embolism, and it is also a challenging task. To solve this problem, this study proposes a ResD-Unet architecture for pulmonary artery segmentation. First, the U-Net network is used as the basic structure, which allows efficient information flow and good performance in the absence of a sufficiently large dataset. Next, novel Residual-Dense blocks are introduced into the ResD-Unet architecture to refine image segmentation and build a deeper network while improving the gradient flow of the network. Finally, a novel hybrid loss function is utilized to make full use of the advantages of the binary cross-entropy loss, Dice loss, and SSIM loss. Equipped with the hybrid loss, the proposed architecture is able to effectively segment the object areas and accurately predict structures with clear boundaries. The experimental results show that the proposed framework achieves high segmentation accuracy and efficiency, and the segmentation results are comparable to those of manual segmentation.
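The summary names a hybrid of binary cross-entropy, Dice, and SSIM losses but does not reproduce the paper's weighting or SSIM settings. The sketch below is a minimal PyTorch illustration of such a combination, assuming equal weights, a uniform 11x11 SSIM window computed with average pooling, and binary masks of shape (N, 1, H, W); it is not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridLoss(nn.Module):
    """Sketch of a BCE + Dice + SSIM hybrid loss for binary segmentation.

    Weights, epsilon, and the uniform-window SSIM below are assumptions for
    illustration, not the settings published in the ResD-Unet paper.
    """

    def __init__(self, w_bce=1.0, w_dice=1.0, w_ssim=1.0, eps=1e-6):
        super().__init__()
        self.w_bce, self.w_dice, self.w_ssim, self.eps = w_bce, w_dice, w_ssim, eps

    def dice_loss(self, probs, target):
        # Soft Dice over the batch: 1 - 2|P∩G| / (|P| + |G|)
        inter = (probs * target).sum()
        return 1.0 - (2.0 * inter + self.eps) / (probs.sum() + target.sum() + self.eps)

    def ssim_loss(self, probs, target, window=11):
        # Local-window SSIM computed with average pooling (uniform window),
        # a common simplification of the Gaussian-window definition.
        c1, c2 = 0.01 ** 2, 0.03 ** 2
        pad = window // 2
        mu_p = F.avg_pool2d(probs, window, 1, pad)
        mu_t = F.avg_pool2d(target, window, 1, pad)
        var_p = F.avg_pool2d(probs * probs, window, 1, pad) - mu_p ** 2
        var_t = F.avg_pool2d(target * target, window, 1, pad) - mu_t ** 2
        cov = F.avg_pool2d(probs * target, window, 1, pad) - mu_p * mu_t
        ssim = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / (
            (mu_p ** 2 + mu_t ** 2 + c1) * (var_p + var_t + c2)
        )
        return 1.0 - ssim.mean()

    def forward(self, logits, target):
        # logits: raw network outputs; target: binary ground-truth mask (float)
        probs = torch.sigmoid(logits)
        bce = F.binary_cross_entropy_with_logits(logits, target)
        return (self.w_bce * bce
                + self.w_dice * self.dice_loss(probs, target)
                + self.w_ssim * self.ssim_loss(probs, target))


# Example usage with hypothetical shapes (N, 1, H, W):
loss_fn = HybridLoss()
loss = loss_fn(torch.randn(2, 1, 128, 128),
               torch.randint(0, 2, (2, 1, 128, 128)).float())
```

The intuition behind the combination is that BCE supervises per-pixel probabilities, Dice counteracts the class imbalance between the thin artery and the background, and the SSIM term rewards locally consistent structure, which favors sharper boundaries.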
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3073051