Uncertainty aware and explainable diagnosis of retinal disease
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Deep learning methods for ophthalmic diagnosis have shown considerable success in tasks like segmentation and classification. However, their widespread application is limited because the models are opaque and vulnerable to making a wrong decision in complicated cases. Explainability methods show the features that a system used to make a prediction, while uncertainty awareness is the ability of a system to highlight when it is not sure about its decision. This is one of the first studies using uncertainty and explanations for informed clinical decision making. We perform uncertainty analysis of a deep learning model for the diagnosis of four retinal diseases - age-related macular degeneration (AMD), central serous retinopathy (CSR), diabetic retinopathy (DR), and macular hole (MH) - using images from the publicly available OCTID dataset. Monte Carlo (MC) dropout is used at test time to generate a distribution of parameters, and the predictions approximate the predictive posterior of a Bayesian model. A threshold is computed using the distribution, and uncertain cases can be referred to an ophthalmologist, thus avoiding an erroneous diagnosis. The features learned by the model are visualized using a proven attribution method from a previous study. The effects of uncertainty on model performance and the relationship between uncertainty and explainability are discussed in terms of clinical significance. The uncertainty information, along with the heatmaps, makes the system more trustworthy for use in clinical settings.
DOI: 10.48550/arxiv.2101.12041
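
The abstract describes test-time Monte Carlo dropout and a referral threshold computed from the resulting predictive distribution. Below is a minimal sketch of that idea, assuming a generic PyTorch classifier with dropout layers; the names `model`, `n_samples`, and `refer_threshold`, and the threshold value itself, are illustrative assumptions rather than details taken from the paper.

```python
# Hypothetical sketch of MC-dropout uncertainty estimation with threshold-based
# referral. Assumes a PyTorch classifier containing nn.Dropout/nn.Dropout2d layers.
import torch
import torch.nn as nn
import torch.nn.functional as F


def enable_mc_dropout(model: nn.Module) -> None:
    """Keep dropout active at test time while the rest of the model stays in eval mode."""
    model.eval()
    for module in model.modules():
        if isinstance(module, (nn.Dropout, nn.Dropout2d)):
            module.train()


@torch.no_grad()
def mc_dropout_predict(model: nn.Module, image: torch.Tensor, n_samples: int = 50):
    """Run several stochastic forward passes and summarize the predictive distribution."""
    enable_mc_dropout(model)
    probs = torch.stack(
        [F.softmax(model(image), dim=-1) for _ in range(n_samples)]
    )                                   # (n_samples, 1, n_classes)
    mean_probs = probs.mean(dim=0)      # approximate predictive posterior mean
    std_probs = probs.std(dim=0)        # spread across MC samples per class
    return mean_probs, std_probs


def diagnose_or_refer(mean_probs, std_probs, refer_threshold: float = 0.15):
    """Refer the case when the uncertainty of the predicted class exceeds the threshold."""
    pred_class = mean_probs.argmax(dim=-1)
    uncertainty = std_probs.gather(-1, pred_class.unsqueeze(-1)).item()
    if uncertainty > refer_threshold:
        return "refer to ophthalmologist", uncertainty
    return int(pred_class.item()), uncertainty
```

In practice the referral threshold would be chosen from the distribution of uncertainties on a validation set, trading off how many cases are deferred against how many erroneous automated diagnoses are avoided.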
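
The paper pairs the uncertainty estimate with attribution heatmaps from a previously published method, which is not named in this record. As a generic stand-in, the sketch below computes a simple gradient-based saliency map; `model` and `image` follow the sketch above, and this is not the attribution method used in the paper.

```python
# Generic vanilla-gradient saliency sketch, used here only to illustrate how a
# per-pixel heatmap can accompany the uncertainty estimate for clinical review.
import torch
import torch.nn.functional as F


def saliency_heatmap(model, image: torch.Tensor, target_class: int) -> torch.Tensor:
    """Return |d p(target) / d pixel| as a coarse explanation heatmap."""
    model.eval()
    image = image.clone().requires_grad_(True)
    probs = F.softmax(model(image), dim=-1)
    probs[0, target_class].backward()
    # Collapse the channel dimension so the map can overlay a grayscale OCT slice.
    return image.grad.abs().max(dim=1).values.squeeze(0)
```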