Deep Learning for Real-time, Automatic, and Scanner-adapted Prostate (Zone) Segmentation of Transrectal Ultrasound, for Example, Magnetic Resonance Imaging–transrectal Ultrasound Fusion Prostate Biopsy

Bibliographic Details
Published in: European Urology Focus 2021-01, Vol. 7 (1), p. 78-85
Authors: van Sloun, Ruud J.G., Wildeboer, Rogier R., Mannaerts, Christophe K., Postema, Arnoud W., Gayet, Maudy, Beerlage, Harrie P., Salomon, Georg, Wijkstra, Hessel, Mischi, Massimo
Format: Article
Language: English
Online access: Full text
Description
Abstract: Although recent advances in multiparametric magnetic resonance imaging (MRI) have led to an increase in MRI-transrectal ultrasound (TRUS) fusion prostate biopsies, these are time consuming, laborious, and costly. Introduction of a deep-learning approach could improve prostate segmentation. The objective was to exploit deep learning to perform automatic, real-time prostate (zone) segmentation on TRUS images from different scanners. Three datasets with TRUS images were collected at different institutions, using an iU22 (Philips Healthcare, Bothell, WA, USA), a Pro Focus 2202a (BK Medical), and an Aixplorer (SuperSonic Imagine, Aix-en-Provence, France) ultrasound scanner. The datasets contained 436 images from 181 men. Manual delineations from an expert panel were used as ground truth. The (zonal) segmentation performance was evaluated in terms of pixel-wise accuracy, Jaccard index, and Hausdorff distance. The developed deep-learning approach significantly improved prostate segmentation compared with a conventional automated technique, reaching a median accuracy of 98% (95% confidence interval 95–99%), a Jaccard index of 0.93 (0.80–0.96), and a Hausdorff distance of 3.0 (1.3–8.7) mm. Zonal segmentation yielded a pixel-wise accuracy of 97% (95–99%) and 98% (96–99%) for the peripheral and transition zones, respectively. Supervised domain adaptation resulted in retention of high performance when applied to images from different ultrasound scanners (p > 0.05). Moreover, the algorithm's assessment of its own segmentation performance showed a strong correlation with the actual segmentation performance (Pearson's correlation 0.72, p
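The three evaluation metrics named in the abstract (pixel-wise accuracy, Jaccard index, and Hausdorff distance) can be sketched for binary segmentation masks as below. This is a generic NumPy/SciPy illustration, not the authors' evaluation code: it computes the Hausdorff distance over the mask point sets in pixel units, whereas the paper reports it in millimetres, which would additionally require the scanner's pixel spacing.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def pixel_accuracy(pred, truth):
    # Fraction of pixels where the predicted and ground-truth masks agree.
    return np.mean(pred == truth)

def jaccard_index(pred, truth):
    # Intersection over union of the two binary masks.
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

def hausdorff_distance(pred, truth):
    # Symmetric Hausdorff distance between the mask point sets, in pixels.
    # (Converting to mm would require the image's pixel spacing.)
    a = np.argwhere(pred)
    b = np.argwhere(truth)
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])
```

For example, two 4×4 squares offset diagonally by one pixel on a 10×10 grid overlap in 9 of 23 union pixels, giving a Jaccard index of about 0.39 and a Hausdorff distance of √2 pixels.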
ISSN:2405-4569
2405-4569
DOI:10.1016/j.euf.2019.04.009