Automatic Pancreas Segmentation via Coarse Location and Ensemble Learning
Published in: IEEE Access, 2020, Vol. 8, pp. 2906-2914
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Automatic and reliable segmentation of the pancreas is an important but difficult task for various clinical applications, such as pancreatic cancer radiotherapy and computer-aided diagnosis (CAD). The main challenges for accurate CT pancreas segmentation lie in two aspects: (1) large shape variation across different patients, and (2) low contrast and blurring around the pancreas boundary. In this paper, we propose a two-stage, ensemble-based fully convolutional neural network (FCN) to solve the challenging pancreas segmentation problem in CT images. First, candidate region generation is performed by classifying patches generated by superpixels. Second, five FCNs based on the U-Net architecture are trained with different objective functions. For each network, 2.5D slices are used as the input to provide 3D image information complementarily without the need for computationally expensive 3D convolutions. Then, an ensemble model is utilized to combine the five output segmentation maps and achieve the final segmentation. The proposed method is extensively evaluated on a publicly available dataset of 82 manually segmented CT volumes via 4-fold cross-validation. Experimental results show its superior performance compared with several state-of-the-art methods with a Dice coefficient of 84.10±4.91% and Jaccard coefficient of 72.86±6.89%.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2961125
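
The abstract above names an ensemble step that merges the five FCN output maps and reports results with the Dice and Jaccard coefficients. The following is a minimal illustrative sketch of those two metrics and a simple probability-averaging ensemble in Python with NumPy; it is not the authors' code, and the averaging rule is an assumption, since the paper is only summarized here as combining the five segmentation maps.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def jaccard_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Jaccard = |A ∩ B| / |A ∪ B| for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

def ensemble_segmentation(prob_maps: list, threshold: float = 0.5) -> np.ndarray:
    """Combine per-model probability maps by averaging and thresholding.

    Assumption: the averaging-plus-threshold rule is one common choice;
    the paper's actual ensemble model may differ.
    """
    mean_prob = np.mean(np.stack(prob_maps, axis=0), axis=0)
    return (mean_prob >= threshold).astype(np.uint8)

if __name__ == "__main__":
    # Stand-in data: five models, one 512x512 slice each (illustrative only).
    rng = np.random.default_rng(0)
    prob_maps = [rng.random((512, 512)) for _ in range(5)]
    ground_truth = rng.random((512, 512)) > 0.5
    prediction = ensemble_segmentation(prob_maps)
    print("Dice:", dice_coefficient(prediction, ground_truth))
    print("Jaccard:", jaccard_coefficient(prediction, ground_truth))
```

In practice the thresholded ensemble mask would be compared per CT volume against the manual segmentation, and the reported 84.10±4.91% Dice corresponds to the mean and standard deviation over the 82 cases in the 4-fold cross-validation.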