Explainable Knowledge Distillation for On-Device Chest X-Ray Classification

Automated multi-label chest X-ray (CXR) image classification has achieved substantial progress in clinical diagnosis through sophisticated deep learning approaches. However, most deep models have high computational demands, which makes them impractical for compact devices with limited computational resources. To overcome this problem, we propose a knowledge distillation (KD) strategy to create a compact deep learning model for real-time multi-label CXR image classification. We study several CNN and Transformer alternatives as the teacher for distilling knowledge to a smaller student. We then employ explainable artificial intelligence (XAI) to provide visual explanations for the model decisions improved by KD. Our results on three benchmark CXR datasets show that our KD strategy improves the performance of the compact student model, making it a feasible choice for many hardware-limited platforms. For instance, with DenseNet161 as the teacher network, EEEA-Net-C2 achieved an AUC of 83.7%, 87.1%, and 88.7% on the ChestX-ray14, CheXpert, and PadChest datasets, respectively, with only 4.7 million parameters and a computational cost of 0.3 billion FLOPs.
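
The abstract describes response-based KD from a large teacher (e.g., DenseNet161) to a compact student (EEEA-Net-C2) on multi-label CXR labels. The sketch below shows one standard formulation of such a loss, assuming PyTorch; the blend weight `alpha`, the temperature `T`, and the soft-target form are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal response-based KD sketch for multi-label classification (PyTorch).
# alpha, T, and the soft-target form are illustrative assumptions.
import torch
import torch.nn.functional as F

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            targets: torch.Tensor,
            alpha: float = 0.5,
            T: float = 4.0) -> torch.Tensor:
    """Blend hard-label BCE with a soft-target term from the teacher.

    For multi-label CXR tags each of the C outputs is an independent
    Bernoulli, so the soft-target term is BCE against the teacher's
    temperature-scaled sigmoid probabilities rather than a softmax KL.
    """
    # Hard-label term: standard multi-label BCE against ground truth.
    hard = F.binary_cross_entropy_with_logits(student_logits, targets)
    # Soft-target term: match the teacher's softened per-label probabilities.
    with torch.no_grad():
        soft_targets = torch.sigmoid(teacher_logits / T)
    soft = F.binary_cross_entropy_with_logits(student_logits / T, soft_targets)
    # T^2 rescaling keeps gradient magnitudes comparable across temperatures.
    return alpha * hard + (1.0 - alpha) * (T * T) * soft
```

In a training step the teacher is frozen and only the student updates, e.g. `loss = kd_loss(student(x), teacher(x), y)` with `y` a float tensor of 0/1 labels.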

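The abstract does not name the XAI method used for the visual explanations; Grad-CAM is a common choice for CXR saliency maps, so the sketch below uses it as an illustrative stand-in. The `grad_cam` helper, its arguments, and the choice of `target_layer` (typically the last convolutional block) are hypothetical.

```python
# Grad-CAM sketch for one class of a CNN student (PyTorch).
# An illustrative stand-in: the paper's exact XAI method is not named here.
import torch
import torch.nn.functional as F

def grad_cam(model, x, target_layer, class_idx):
    """Return an [H, W] heatmap for class_idx, upsampled to the input size."""
    feats, grads = {}, {}

    def fwd_hook(_, __, output):
        feats["a"] = output            # activations of the target conv layer

    def bwd_hook(_, grad_in, grad_out):
        grads["g"] = grad_out[0]       # gradients w.r.t. those activations

    h1 = target_layer.register_forward_hook(fwd_hook)
    h2 = target_layer.register_full_backward_hook(bwd_hook)
    try:
        logits = model(x)              # x: [1, 3, H, W]
        model.zero_grad()
        logits[0, class_idx].backward()
    finally:
        h1.remove()
        h2.remove()

    # Weight each channel by its spatially averaged gradient, then ReLU.
    weights = grads["g"].mean(dim=(2, 3), keepdim=True)   # [1, C, 1, 1]
    cam = F.relu((weights * feats["a"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=x.shape[2:], mode="bilinear",
                        align_corners=False)
    cam = cam - cam.min()
    return (cam / (cam.max() + 1e-8))[0, 0]               # normalize to [0, 1]
```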
Bibliographic details

Published in: IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2024-07, Vol. 21 (4), p. 846-856
Main authors: Termritthikun, Chakkrit; Umer, Ayaz; Suwanwimolkul, Suwichaya; Xia, Feng; Lee, Ivan
Format: Article
Language: English
Subjects: Algorithms; Artificial Intelligence; chest X-ray; Computational modeling; Computer architecture; Databases, Factual; Deep Learning; Diseases; explainable artificial intelligence; Humans; Image classification; Knowledge distillation; on-device; Radiographic Image Interpretation, Computer-Assisted - methods; Radiography, Thoracic - methods; Transformers
DOI: 10.1109/TCBB.2023.3272333
ISSN: 1545-5963
EISSN: 1557-9964
PMID: 37130250
Publisher: IEEE