Evaluating Explainable Artificial Intelligence (XAI) techniques in chest radiology imaging through a human-centered Lens

The field of radiology imaging has experienced a remarkable increase in the use of deep learning (DL) algorithms to support diagnostic and treatment decisions. This rise has led to the development of Explainable AI (XAI) systems to improve the transparency and trust of complex DL methods. However, XAI...

Bibliographic Details
Published in: PloS one 2024-10, Vol.19 (10), p.e0308758
Main authors: E Ihongbe, Izegbua; Fouad, Shereen; F Mahmoud, Taha; Rajasekaran, Arvind; Bhatia, Bahadar
Format: Article
Language: eng
Subjects:
Online access: Full text
description The field of radiology imaging has experienced a remarkable increase in the use of deep learning (DL) algorithms to support diagnostic and treatment decisions. This rise has led to the development of Explainable AI (XAI) systems to improve the transparency and trust of complex DL methods. However, XAI systems face challenges in gaining acceptance within the healthcare sector, mainly due to technical hurdles in utilizing these systems in practice and the lack of human-centered evaluation and validation. In this study, we focus on visual XAI systems applied to DL-enabled diagnostic systems in chest radiography. In particular, we conduct a user study to evaluate two prominent visual XAI techniques from the human perspective. To this end, we created two clinical scenarios for diagnosing pneumonia and COVID-19 using DL techniques applied to chest X-rays and CT scans. The achieved accuracy rates were 90% for pneumonia and 98% for COVID-19. Subsequently, we employed two well-known XAI methods, Grad-CAM (Gradient-weighted Class Activation Mapping) and LIME (Local Interpretable Model-agnostic Explanations), to generate visual explanations elucidating the AI decision-making process. The visual explanations were shared through a user study and evaluated by medical professionals in terms of clinical relevance, coherency, and user trust. In general, participants expressed a positive perception of the use of XAI systems in chest radiography, but there was a noticeable lack of awareness regarding their value and practical aspects. Regarding preferences, Grad-CAM outperformed LIME in terms of coherency and trust, although concerns were raised about its clinical usability. Our findings highlight key user-driven explainability requirements, emphasizing the importance of multi-modal explainability and the need to raise awareness of XAI systems among medical practitioners. Inclusive design was also identified as a crucial need to ensure better alignment of these systems with user needs.
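For readers unfamiliar with the Grad-CAM method evaluated in the study, its core computation is small: the feature maps of a CNN's last convolutional layer are weighted by the spatially averaged gradients of the target class score, summed, and passed through a ReLU. The sketch below is a generic, minimal NumPy illustration of that computation, not the authors' implementation; the function name and array shapes are assumptions for illustration.

```python
import numpy as np

def grad_cam(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Minimal Grad-CAM sketch (illustrative, not the paper's code).

    activations: (C, H, W) feature maps of the last conv layer for one image.
    gradients:   (C, H, W) gradient of the class score w.r.t. those maps.
    Returns an (H, W) heatmap scaled to [0, 1].
    """
    # alpha_c: global-average-pool the gradients to one weight per channel
    weights = gradients.mean(axis=(1, 2))
    # weighted sum over channels -> (H, W) localization map
    cam = np.tensordot(weights, activations, axes=1)
    # ReLU: keep only regions that positively support the class
    cam = np.maximum(cam, 0.0)
    if cam.max() > 0:
        cam = cam / cam.max()  # normalize for overlay on the radiograph
    return cam
```

In a real pipeline the activations and gradients would come from forward/backward hooks on a trained network, and the heatmap would be upsampled to the input resolution and overlaid on the X-ray or CT slice, which is the kind of visual explanation the study's participants assessed.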
doi_str_mv 10.1371/journal.pone.0308758
format Article
contributor Ieracitano, Cosimo (Editor)
publisher United States: Public Library of Science
identifier PMID: 39383147
fulltext fulltext
identifier ISSN: 1932-6203
ispartof PloS one, 2024-10, Vol.19 (10), p.e0308758
issn 1932-6203
1932-6203
language eng
recordid cdi_plos_journals_3114797740
source MEDLINE; DOAJ Directory of Open Access Journals; Public Library of Science (PLoS) Journals Open Access; EZB-FREE-00999 freely available EZB journals; PubMed Central; Free Full-Text Journals in Chemistry
subjects Algorithms
Artificial Intelligence
Automation
Betacoronavirus
Biology and Life Sciences
Chest
Classification
Computed tomography
Computer and Information Sciences
Coronavirus Infections - diagnostic imaging
COVID-19
COVID-19 - diagnostic imaging
Cybersecurity
Decision making
Deep Learning
Diagnostic systems
Evaluation
Explainable artificial intelligence
Female
Humans
Machine learning
Male
Medical examination
Medical imaging
Medical personnel
Medical research
Medicine and Health Sciences
Medicine, Experimental
Methods
Neural networks
Pandemics
Pneumonia
Pneumonia - diagnostic imaging
Radiography
Radiography, Thoracic - methods
Radiology
Radiology, Medical
Research and Analysis Methods
Respiratory diseases
SARS-CoV-2
Surveys
Technology application
Tomography, X-Ray Computed - methods
Usability
X-rays
title Evaluating Explainable Artificial Intelligence (XAI) techniques in chest radiology imaging through a human-centered Lens
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-18T18%3A05%3A42IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Evaluating%20Explainable%20Artificial%20Intelligence%20(XAI)%20techniques%20in%20chest%20radiology%20imaging%20through%20a%20human-centered%20Lens&rft.jtitle=PloS%20one&rft.au=E%20Ihongbe,%20Izegbua&rft.date=2024-10-09&rft.volume=19&rft.issue=10&rft.spage=e0308758&rft.pages=e0308758-&rft.issn=1932-6203&rft.eissn=1932-6203&rft_id=info:doi/10.1371/journal.pone.0308758&rft_dat=%3Cgale_plos_%3EA811706031%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3114797740&rft_id=info:pmid/39383147&rft_galeid=A811706031&rfr_iscdi=true