Deep Convolutional Neural Network-Assisted Feature Extraction for Diagnostic Discrimination and Feature Visualization in Pancreatic Ductal Adenocarcinoma (PDAC) versus Autoimmune Pancreatitis (AIP)

Bibliographic details
Published in: Journal of Clinical Medicine, 2020-12, Vol. 9 (12), p. 4013, Article 4013
Authors: Ziegelmayer, Sebastian; Kaissis, Georgios; Harder, Felix; Jungmann, Friederike; Mueller, Tamara; Makowski, Marcus; Braren, Rickmer
Format: Article
Language: English
Online access: Full text
Description
Abstract: The differentiation of autoimmune pancreatitis (AIP) and pancreatic ductal adenocarcinoma (PDAC) poses a relevant diagnostic challenge and can lead to misdiagnosis and consequently poor patient outcomes. Recent studies have shown that radiomics-based models can achieve high sensitivity and specificity in predicting both entities. However, radiomic features can only capture low-level representations of the input image. In contrast, convolutional neural networks (CNNs) can learn and extract more complex representations, which have been used with great success for image classification. In our retrospective observational study, we performed deep learning-based feature extraction using CT scans of both entities and compared the predictive value against traditional radiomic features. In total, 86 patients, 44 with AIP and 42 with PDAC, were analyzed. Whole-pancreas segmentation was performed automatically on portal venous phase CT scans. The segmentation masks were manually checked and corrected if necessary. In total, 1411 radiomic features were extracted using PyRadiomics, and 256 features (deep features) were extracted from an intermediate layer of a convolutional neural network (CNN). After feature selection and normalization, an extremely randomized trees algorithm was trained and tested using a two-fold shuffle-split cross-validation with a test sample of 20% (n = 18) to discriminate between AIP and PDAC. Feature maps were plotted and visual differences were noted. The machine learning (ML) model achieved a sensitivity, specificity, and ROC-AUC of 0.89 ± 0.11, 0.83 ± 0.06, and 0.90 ± 0.02 for the deep features and 0.72 ± 0.11, 0.78 ± 0.06, and 0.80 ± 0.01 for the radiomic features. Visualization of the feature maps indicated different activation patterns for AIP and PDAC. We successfully trained a machine learning model using deep feature extraction from CT images to differentiate between AIP and PDAC. Compared with traditional radiomic features, deep features achieved higher sensitivity, specificity, and ROC-AUC. Visualization of deep features could further improve the diagnostic accuracy of the non-invasive differentiation of AIP and PDAC.
ISSN: 2077-0383
DOI: 10.3390/jcm9124013
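
The two pipeline stages summarized in the abstract, extracting deep features from an intermediate CNN layer and then training an extremely randomized trees classifier with a two-fold shuffle-split (20% test sample), can be sketched as below. This is an illustrative sketch only, not the authors' code: the ImageNet-pretrained VGG16 backbone, its block3_pool layer (chosen here because global average pooling over it yields a 256-dimensional vector, matching the 256 deep features mentioned in the abstract), and the feature-selection, normalization, and classifier settings are all assumptions, since the record does not specify the study's actual architecture or hyperparameters.

```python
# Illustrative sketch only -- not the authors' pipeline. VGG16/block3_pool,
# VarianceThreshold, StandardScaler, and the ExtraTrees settings are assumed.
import numpy as np
import tensorflow as tf
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import VarianceThreshold
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.preprocessing import StandardScaler

# --- Deep feature extraction from an intermediate CNN layer ----------------
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
feature_extractor = tf.keras.Model(
    inputs=base.input,
    outputs=tf.keras.layers.GlobalAveragePooling2D()(
        base.get_layer("block3_pool").output),  # 256-dimensional output
)

def extract_deep_features(images: np.ndarray) -> np.ndarray:
    """images: (n, 224, 224, 3) preprocessed CT slices -> (n, 256) deep features."""
    x = tf.keras.applications.vgg16.preprocess_input(images.astype("float32"))
    return feature_extractor.predict(x, verbose=0)

# --- Extremely randomized trees with two-fold shuffle-split (20% test) -----
def evaluate(features: np.ndarray, labels: np.ndarray, seed: int = 0):
    """Feature selection, normalization, ExtraTrees; returns mean and std ROC-AUC."""
    splitter = StratifiedShuffleSplit(n_splits=2, test_size=0.2,
                                      random_state=seed)
    aucs = []
    for train_idx, test_idx in splitter.split(features, labels):
        selector = VarianceThreshold()   # drop constant (uninformative) features
        scaler = StandardScaler()        # z-score normalization
        clf = ExtraTreesClassifier(n_estimators=500, random_state=seed)

        X_train = scaler.fit_transform(selector.fit_transform(features[train_idx]))
        X_test = scaler.transform(selector.transform(features[test_idx]))
        clf.fit(X_train, labels[train_idx])
        aucs.append(roc_auc_score(labels[test_idx],
                                  clf.predict_proba(X_test)[:, 1]))
    return float(np.mean(aucs)), float(np.std(aucs))
```

Sensitivity and specificity at a chosen operating point could be computed from the same per-split predictions; the feature-map visualization step described in the abstract is not reproduced in this sketch.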