Uncertainty aware and explainable diagnosis of retinal disease

Deep learning methods for ophthalmic diagnosis have shown considerable success in tasks like segmentation and classification. However, their widespread application is limited because the models are opaque and vulnerable to making wrong decisions in complicated cases. Explainability methods reveal the features a system used to make a prediction, while uncertainty awareness is the ability of a system to flag when it is not sure about its decision. This is one of the first studies to combine uncertainty and explanations for informed clinical decision making. We perform an uncertainty analysis of a deep learning model for the diagnosis of four retinal diseases: age-related macular degeneration (AMD), central serous retinopathy (CSR), diabetic retinopathy (DR), and macular hole (MH), using images from the publicly available OCTID dataset. Monte Carlo (MC) dropout is applied at test time to generate a distribution of parameters, and the resulting predictions approximate the predictive posterior of a Bayesian model. A threshold is computed from this distribution so that uncertain cases can be referred to an ophthalmologist, avoiding an erroneous diagnosis. The features learned by the model are visualized using a proven attribution method from a previous study. The effects of uncertainty on model performance and the relationship between uncertainty and explainability are discussed in terms of clinical significance. The uncertainty information, together with the heatmaps, makes the system more trustworthy for use in clinical settings.

Detailed description

Saved in:
Bibliographic details
Main authors: Singh, Amitojdeep, Sengupta, Sourya, Rasheed, Mohammed Abdul, Jayakumar, Varadharajan, Lakshminarayanan, Vasudevan
Format: Artikel
Language: eng
Subjects:
Online access: Order full text
creator Singh, Amitojdeep; Sengupta, Sourya; Rasheed, Mohammed Abdul; Jayakumar, Varadharajan; Lakshminarayanan, Vasudevan
description Deep learning methods for ophthalmic diagnosis have shown considerable success in tasks like segmentation and classification. However, their widespread application is limited because the models are opaque and vulnerable to making wrong decisions in complicated cases. Explainability methods reveal the features a system used to make a prediction, while uncertainty awareness is the ability of a system to flag when it is not sure about its decision. This is one of the first studies to combine uncertainty and explanations for informed clinical decision making. We perform an uncertainty analysis of a deep learning model for the diagnosis of four retinal diseases: age-related macular degeneration (AMD), central serous retinopathy (CSR), diabetic retinopathy (DR), and macular hole (MH), using images from the publicly available OCTID dataset. Monte Carlo (MC) dropout is applied at test time to generate a distribution of parameters, and the resulting predictions approximate the predictive posterior of a Bayesian model. A threshold is computed from this distribution so that uncertain cases can be referred to an ophthalmologist, avoiding an erroneous diagnosis. The features learned by the model are visualized using a proven attribution method from a previous study. The effects of uncertainty on model performance and the relationship between uncertainty and explainability are discussed in terms of clinical significance. The uncertainty information, together with the heatmaps, makes the system more trustworthy for use in clinical settings.
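The MC-dropout referral scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `forward_pass` stands in for the trained network with dropout left active at test time, the four-class output is assumed to match the AMD/CSR/DR/MH setting, and the entropy threshold is a fixed placeholder rather than the data-derived threshold the paper computes.

```python
import math

def mc_dropout_predict(forward_pass, x, num_samples=50):
    """Run a stochastic forward pass (dropout kept ON at test time)
    `num_samples` times and average the class probabilities."""
    samples = [forward_pass(x) for _ in range(num_samples)]
    num_classes = len(samples[0])
    mean_probs = [sum(s[c] for s in samples) / num_samples
                  for c in range(num_classes)]
    # Predictive entropy of the averaged distribution: high when the
    # sampled predictions disagree, i.e. the model is uncertain.
    entropy = -sum(p * math.log(p) for p in mean_probs if p > 0)
    return mean_probs, entropy

def diagnose_or_refer(forward_pass, x, threshold=0.5):
    """Return the predicted class index, or None to refer the case
    to an ophthalmologist when the prediction is too uncertain."""
    probs, entropy = mc_dropout_predict(forward_pass, x)
    if entropy > threshold:
        return None  # uncertain case: defer to the clinician
    return max(range(len(probs)), key=probs.__getitem__)
```

A confident model (sampled predictions that agree on one class) yields low entropy and an automatic diagnosis; a model whose dropout samples spread probability across classes exceeds the threshold and the case is referred.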
doi_str_mv 10.48550/arxiv.2101.12041
format Article
identifier DOI: 10.48550/arxiv.2101.12041
language eng
recordid cdi_arxiv_primary_2101_12041
source arXiv.org
subjects Computer Science - Computer Vision and Pattern Recognition
Computer Science - Learning
title Uncertainty aware and explainable diagnosis of retinal disease