PCa-RadHop: A Transparent and Lightweight Feed-forward Method for Clinically Significant Prostate Cancer Segmentation
Main authors:
Format: Article
Language: English
Online access: Order full text
Summary: Prostate cancer is one of the most frequently occurring cancers in men,
with a low survival rate if not diagnosed early. PI-RADS reading has a high false
positive rate, which increases diagnostic costs and patient discomfort. Deep
learning (DL) models achieve high segmentation performance, but they require a
large model size and high complexity. DL models also lack feature interpretability
and are perceived as "black boxes" in the medical field. The PCa-RadHop pipeline is
proposed in this work, aiming to provide a more transparent feature extraction
process using a linear model. It adopts the recently introduced Green Learning (GL)
paradigm, which offers a small model size and low complexity. PCa-RadHop consists
of two stages: Stage-1 extracts data-driven radiomics features from the
bi-parametric Magnetic Resonance Imaging (bp-MRI) input and predicts an initial
heatmap. To reduce the false positive rate, a subsequent Stage-2 refines the
predictions by including more contextual information and radiomics features from
each already detected Region of Interest (ROI). Experiments on the largest publicly
available dataset, PI-CAI, show that the proposed method performs competitively
against other DL models, achieving an area under the curve (AUC) of 0.807 on a
cohort of 1,000 patients. Moreover, PCa-RadHop maintains a model size and
complexity that are orders of magnitude smaller.
DOI: 10.48550/arxiv.2403.15969
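
The abstract describes a two-stage, feed-forward design: Stage-1 produces a candidate heatmap from bp-MRI-derived radiomics features, and Stage-2 re-scores the detected ROIs with added context to reduce false positives. Below is a minimal, hypothetical Python sketch of that flow; the function names, the patch-flattening feature stand-in, and the use of scikit-learn's LogisticRegression are illustrative assumptions, not the authors' RadHop / Green Learning implementation.

```python
# Hypothetical two-stage sketch of the pipeline described in the abstract.
# The feature extractor and classifiers here are stand-ins, not the paper's method.
import numpy as np
from sklearn.linear_model import LogisticRegression


def extract_features(patches: np.ndarray) -> np.ndarray:
    """Stand-in for the data-driven radiomics (RadHop) feature extractor:
    simply flattens each bp-MRI patch into a feature vector."""
    return patches.reshape(patches.shape[0], -1)


def stage1_heatmap(patches: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Stage-1: train a linear classifier on per-patch features and return
    a per-patch probability 'heatmap' for clinically significant PCa."""
    feats = extract_features(patches)
    clf = LogisticRegression(max_iter=1000).fit(feats, labels)
    return clf.predict_proba(feats)[:, 1]


def stage2_refine(patches: np.ndarray, labels: np.ndarray,
                  heatmap: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Stage-2: re-score only the candidate ROIs found by Stage-1 with a
    second linear classifier, to suppress false positives."""
    roi = heatmap > threshold                       # candidate ROIs from Stage-1
    refined = heatmap.copy()
    if roi.any() and len(np.unique(labels[roi])) > 1:
        feats = extract_features(patches)[roi]
        clf = LogisticRegression(max_iter=1000).fit(feats, labels[roi])
        refined[roi] = clf.predict_proba(feats)[:, 1]
    return refined


if __name__ == "__main__":
    # Toy data: 200 random 8x8x3 "patches" with random binary labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8, 8, 3))
    y = rng.integers(0, 2, size=200)
    h1 = stage1_heatmap(X, y)
    h2 = stage2_refine(X, y, h1)
    print(h1[:5], h2[:5])
```

The sketch only mirrors the overall structure (linear classifiers applied stage-wise); per the abstract, the actual Stage-2 additionally draws contextual information around each detected ROI, and the features come from the GL feed-forward transform rather than raw patch flattening.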