An explainable artificial intelligence model for multiple lung diseases classification from chest X-ray images using fine-tuned transfer learning

Bibliographic Details
Published in: Decision Analytics Journal, 2024-09, Vol. 12, p. 100499, Article 100499
Authors: Mahamud, Eram; Fahad, Nafiz; Assaduzzaman, Md; Zain, S.M.; Goh, Kah Ong Michael; Morol, Md. Kishor
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Traditional deep learning models are often considered “black boxes” due to their lack of interpretability, which limits their therapeutic use despite their success in classification tasks. This study aims to improve the interpretability of diagnoses for COVID-19, pneumonia, and tuberculosis from X-ray images using an enhanced DenseNet201 model within a transfer learning framework. We incorporated Explainable Artificial Intelligence (XAI) techniques, including SHAP, LIME, Grad-CAM, and Grad-CAM++, to make the model’s decisions more understandable. To enhance image clarity and detail, we applied preprocessing methods such as Denoising Autoencoder, Contrast Limited Adaptive Histogram Equalization (CLAHE), and Gamma Correction. An ablation study was conducted to identify the optimal parameters for the proposed approach. Our model’s performance was compared with other transfer learning-based models like EfficientNetB0, InceptionV3, and LeNet using evaluation metrics. The model that included data augmentation techniques achieved the best results, with an accuracy of 99.20% and precision and recall of 99%. This demonstrates the critical role of data augmentation in improving model performance. SHAP and LIME provided significant insights into the model’s decision-making process, while Grad-CAM and Grad-CAM++ highlighted specific image features and regions influencing the model’s classifications. These techniques enhanced transparency and trust in AI-assisted diagnoses. Finally, we developed an Android-based system using the most effective model to support medical specialists in their decision-making process.

Highlights:
• Propose an enhanced DenseNet201 model integrating transfer learning and explainable artificial intelligence to classify multiple lung diseases from X-ray images.
• Employ explainable artificial intelligence methods and advanced image preprocessing to enhance diagnostic accuracy and model transparency.
• Show that the model outperformed others like EfficientNetB0, InceptionV3, and LeNet in accuracy, precision, and recall when using data augmentation.
• Develop a practical tool with an Android app to assist medical specialists in real-time decision-making.
• Bridge advanced computational techniques with practical needs in medical diagnostics to enhance quick and reliable lung disease diagnosis.
ISSN: 2772-6622
DOI: 10.1016/j.dajour.2024.100499
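
The abstract above outlines a pipeline of image preprocessing, DenseNet201 transfer learning, and post-hoc explanation. The short Python sketches below illustrate how such stages are commonly implemented; they are illustrative only, and every parameter value, layer choice, and function name in them is an assumption rather than the authors' reported configuration.

A minimal preprocessing sketch covering two of the steps named in the abstract, CLAHE and gamma correction, using OpenCV. The clip limit, tile grid size, and gamma value are assumed defaults, not the values selected in the paper's ablation study.

import cv2
import numpy as np

def preprocess_xray(path, clip_limit=2.0, tile_grid=(8, 8), gamma=1.2):
    """Load a chest X-ray, equalize local contrast with CLAHE, then gamma-correct it.
    All parameter defaults here are illustrative assumptions."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)

    # Contrast Limited Adaptive Histogram Equalization
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid)
    equalized = clahe.apply(gray)

    # Gamma correction via a lookup table: out = 255 * (in / 255) ** (1 / gamma)
    inv_gamma = 1.0 / gamma
    table = np.array([255.0 * (i / 255.0) ** inv_gamma for i in range(256)]).astype("uint8")
    return cv2.LUT(equalized, table)

A transfer-learning sketch that places a small classification head on an ImageNet-pretrained DenseNet201 in Keras. The input size, head layers, frozen backbone, optimizer, and three-class output (COVID-19, pneumonia, tuberculosis) are assumptions for illustration, not the enhanced architecture described in the paper.

import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.applications import DenseNet201

def build_model(num_classes=3, input_shape=(224, 224, 3)):
    # ImageNet-pretrained backbone, used here as a frozen feature extractor
    base = DenseNet201(include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False

    # Small classification head; layer sizes are illustrative assumptions
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(256, activation="relu")(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)

    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

A standard Grad-CAM sketch, one of the XAI methods listed in the abstract, written for a model built as above. The layer name "conv5_block32_concat" (the last dense-block output of Keras's DenseNet201) is an assumption about which layer to visualize, and the heatmap normalization is a common convention rather than the paper's exact procedure.

import numpy as np
import tensorflow as tf

def grad_cam(model, image, conv_layer_name="conv5_block32_concat"):
    """Compute a Grad-CAM heatmap for the model's top predicted class."""
    # Model that returns both the chosen conv feature maps and the predictions
    grad_model = tf.keras.Model(model.inputs,
                                [model.get_layer(conv_layer_name).output, model.output])

    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[np.newaxis, ...])  # add a batch dimension
        class_idx = tf.argmax(preds[0])
        class_score = preds[:, class_idx]

    # Channel weights: gradients of the class score, averaged over spatial positions
    grads = tape.gradient(class_score, conv_out)
    weights = tf.reduce_mean(grads, axis=(1, 2))

    # Weighted sum of the feature maps, rectified and normalized to [0, 1]
    cam = tf.reduce_sum(weights[:, tf.newaxis, tf.newaxis, :] * conv_out, axis=-1)[0]
    cam = tf.nn.relu(cam)
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()

The resulting heatmap can be resized to the input resolution and overlaid on the X-ray to indicate which regions most influenced the predicted class, which is the kind of visual evidence the abstract describes for Grad-CAM and Grad-CAM++.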