Convolutional-Neural-Network-Based Approach for Segmentation of Apical Four-Chamber View from Fetal Echocardiography
Published in: IEEE Access, 2020, Vol. 8, pp. 80437-80446
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: An apical four-chamber (A4C) view from early fetal echocardiography is an extremely significant step in the early diagnosis and timely treatment of congenital heart diseases. The objective is to perform automated segmentation of cardiac structures, namely the epicardium, left ventricle, left atrium, descending aorta, right atrium, right ventricle, and thorax, in ultrasound A4C views in one shot, in order to assist clinicians in prenatal examination. However, such a segmentation task faces the following challenges: 1) low imaging resolution; 2) incomplete tissue boundaries; 3) low overall image contrast. To address these issues, this study proposes a cascaded U-net, named CU-net, trained with a structural similarity index measure (SSIM) loss. First, the CU-net's two branch supervisions help obtain clear tissue boundaries and alleviate the vanishing-gradient problem caused by increasing network depth. Second, between-net connections in the CU-net transmit prior information from shallow layers to deeper layers, yielding more refined segmentation results. Third, the method leverages the SSIM loss to preserve fine-grained structural information and obtain clear boundaries. Extensive experiments on a dataset of 1712 A4C views show that the proposed method achieves a Dice coefficient of 0.856, a Hausdorff distance of 3.33, and a pixel accuracy of 0.929, demonstrating its effectiveness and potential as a clinical tool.
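The SSIM loss and the Dice coefficient mentioned in the abstract can be illustrated with a short sketch. The snippet below is a minimal PyTorch sketch under stated assumptions, not the paper's implementation: it uses a uniform averaging window instead of the Gaussian window of the original SSIM formulation, and the window size of 11 and the constants c1, c2 are illustrative defaults.

```python
import torch
import torch.nn.functional as F


def ssim_loss(pred, target, window_size=11, c1=0.01 ** 2, c2=0.03 ** 2):
    """1 - mean SSIM between a predicted probability map and its target.

    pred, target: tensors of shape (N, C, H, W) with values in [0, 1].
    A uniform averaging window is used here for brevity; the original
    SSIM formulation uses a Gaussian window.
    """
    pad = window_size // 2
    # Local means over the sliding window.
    mu_p = F.avg_pool2d(pred, window_size, stride=1, padding=pad)
    mu_t = F.avg_pool2d(target, window_size, stride=1, padding=pad)
    # Local variances and covariance via E[x^2] - E[x]^2.
    var_p = F.avg_pool2d(pred * pred, window_size, stride=1, padding=pad) - mu_p ** 2
    var_t = F.avg_pool2d(target * target, window_size, stride=1, padding=pad) - mu_t ** 2
    cov = F.avg_pool2d(pred * target, window_size, stride=1, padding=pad) - mu_p * mu_t
    ssim_map = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / (
        (mu_p ** 2 + mu_t ** 2 + c1) * (var_p + var_t + c2)
    )
    return 1.0 - ssim_map.mean()


def dice_coefficient(pred, target, eps=1e-6):
    """Soft Dice coefficient, the overlap metric reported in the abstract."""
    intersection = (pred * target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```

In a training loop, an SSIM term like this would typically be added to a region-based loss (e.g. cross-entropy or Dice loss) so that boundary structure and region overlap are optimized jointly; how the paper combines and weights its loss terms is not stated in this record.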
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2984630