Utilizing Generative Adversarial Network for Synthetic Image Generation to Address Imbalance Challenges in Chest X-Ray Image Classification
Saved in:
Published in: | Journal of ICT Research and Applications 2023-12, Vol.17 (3), p.373-384 |
---|---|
Main Authors: | , |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Full text |
Abstract: | Deep learning-based classifiers need large amounts of image data to train. Unfortunately, not all real-world cases are supported by large image datasets. One such case is the classification of pneumonia infections from chest X-ray images. This study proposes a way of synthesizing chest X-rays with abnormal conditions so that the synthesized images can be used for classification. A GAN-based technique can generate high-quality synthetic images that resemble the original images and thus provide a more balanced data distribution than other approaches. To indirectly evaluate the quality of our GAN-based synthetic images, we used CNN-based classification architectures on diverse datasets. Three scenarios examined the effect of the synthetic images on classification. Scenario-1: adding 90% of the synthesized images to the original images in the training dataset. Scenario-2: adding 50% of the synthesized images to the original images. Scenario-3: adding 10% of the synthesized images to the original images. The classification tests revealed significantly increased F1 scores in all scenarios. Our study also emphasizes the significance of addressing the problem of imbalanced collections of chest X-ray images and the capability of GANs to alleviate this issue. |
---|---|
ISSN: | 2337-5787 2338-5499 |
DOI: | 10.5614/itbj.ict.res.appl.2023.17.3.6 |
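
The abstract describes three scenarios in which GAN-synthesized chest X-rays are mixed into the training set at 90%, 50%, and 10%. A minimal sketch of how such mixing could be set up is shown below; the `build_training_set` helper, the (path, label) representation, and the interpretation of the percentages as fractions of the synthetic pool are assumptions for illustration, not the paper's actual pipeline.

```python
# Sketch of the three augmentation scenarios described in the abstract.
# The data representation and the reading of "90%/50%/10%" as fractions of the
# synthetic pool are assumptions; the paper's pipeline may differ.
import random
from typing import List, Tuple

Sample = Tuple[str, int]  # (image_path, class_label)

def build_training_set(
    original: List[Sample],    # real chest X-ray samples
    synthetic: List[Sample],   # GAN-generated samples
    ratio: float,              # fraction of the synthetic pool to add (0.9, 0.5, or 0.1)
    seed: int = 42,
) -> List[Sample]:
    """Return the original samples plus `ratio` of the synthesized samples, shuffled."""
    rng = random.Random(seed)
    n_extra = int(len(synthetic) * ratio)
    extra = rng.sample(synthetic, n_extra)
    combined = original + extra
    rng.shuffle(combined)
    return combined

# Scenario-1, -2, -3 from the abstract:
# scenarios = {name: build_training_set(original, synthetic, r)
#              for name, r in [("scenario_1", 0.9),
#                              ("scenario_2", 0.5),
#                              ("scenario_3", 0.1)]}
```

Each resulting training set would then be fed to a CNN classifier and compared on F1 score against training on the original, imbalanced data alone, which is the comparison the abstract reports.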