Lung Segmentation-Based Pulmonary Disease Classification Using Deep Neural Networks

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 125202-125214
Authors: Zaidi, S. Zainab Yousuf; Akram, M. Usman; Jameel, Amina; Alghamdi, Norah Saleh
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Interpreting chest x-rays (CXR) to find anomalies in the thoracic region is a tedious job and can consume a large amount of a radiologist's time when there are thousands of images to process. In such scenarios, Computer-Aided Diagnostic (CAD) systems can assist radiologists by performing the routine processing and presenting the information in a meaningful way, so that the radiologist can make more accurate decisions while spending less time and energy. This study proposes an accurate and efficient CNN-based pulmonary disease diagnosis system using CXR images. The proposed system exploits the capabilities of deep neural networks through a custom CNN architecture with additional layers and modified hyperparameters. The input CXR is first examined at the surface level to determine whether it is healthy or infected, and infected images are further processed for class-level label classification. The lung region is segmented from the entire input CXR image to reduce noise and increase processing efficiency by processing less overall information. The proposed model is evaluated on the benchmark split of the NIH chest x-ray dataset and achieves better segmentation and classification results than state-of-the-art approaches.
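
The abstract outlines a two-stage pipeline: segment the lung region, screen the masked image as healthy or infected, and assign a class-level label only to infected cases. Below is a minimal PyTorch sketch of that flow; `SmallCNN`, `diagnose`, the 0.5 mask threshold, and the -1 label for healthy images are illustrative assumptions for this sketch, not the authors' actual architecture or hyperparameters.

```python
import torch
import torch.nn as nn


class SmallCNN(nn.Module):
    """Generic CNN classifier, a stand-in for the paper's custom architecture."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))


def diagnose(cxr: torch.Tensor,
             segmenter: nn.Module,
             screen_net: nn.Module,
             disease_net: nn.Module,
             mask_threshold: float = 0.5) -> torch.Tensor:
    """Segmentation-then-classification pipeline for a batch of grayscale CXRs.

    cxr:         (N, 1, H, W) chest x-ray batch.
    segmenter:   any model emitting per-pixel lung logits (e.g. a U-Net).
    screen_net:  binary healthy-vs-infected classifier.
    disease_net: class-level classifier applied to infected images only.
    Returns a (N,) tensor: -1 for healthy, otherwise the predicted disease class.
    """
    with torch.no_grad():
        # 1. Segment the lung region and mask out the rest of the image,
        #    so the classifiers process less irrelevant information.
        lung_mask = (torch.sigmoid(segmenter(cxr)) > mask_threshold).float()
        lungs_only = cxr * lung_mask

        # 2. Surface-level screening: healthy (0) vs. infected (1).
        infected = screen_net(lungs_only).argmax(dim=1)

        # 3. Class-level labelling for the infected images.
        disease = disease_net(lungs_only).argmax(dim=1)

    labels = torch.full_like(infected, -1)   # healthy images keep label -1
    labels[infected == 1] = disease[infected == 1]
    return labels
```

In this sketch, `screen_net = SmallCNN(2)` and `disease_net = SmallCNN(num_diseases)` would instantiate the two classification stages; masking the image before classification mirrors the paper's motivation of reducing noise by restricting processing to the segmented lung region.
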
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3110904