CycleGAN-based image translation from MRI to CT scans

Bibliographic Details
Published in: Journal of Physics: Conference Series, 2023-12, Vol. 2646 (1), p. 012016
Authors: Cai, Yingchao; Li, Mengxiao; Liu, Shiqi; Zhou, Changhao
Format: Article
Language: English
Online Access: Full text
Abstract
In recent years, the application of deep learning techniques to medical image analysis has shown promising results in improving the diagnosis and treatment of diseases. One such technique is CycleGAN, a variant of Generative Adversarial Networks (GANs) that enables unpaired image-to-image translation. This paper presents a CycleGAN-based approach for translating between MRI and CT scans, which can provide doctors with more diagnostic information and assist in the prediction and diagnosis of tumors. Our experiments are based on brain scan images collected from the Kaggle dataset, with no paired information available. The generator and discriminator models of the CycleGAN are trained with the Adam optimizer and a cycle consistency loss weight (λ) of 10. The total training time is about 12 days, and the model is trained for 75 epochs with a fixed learning rate of 0.0002. The results demonstrate the effectiveness of the proposed method, achieving high-quality image translation from MRI to CT scans. The advantages of CycleGAN in medical image analysis include its ability to handle unpaired data, perform cross-domain image translation, ensure cycle consistency, and generate diverse outputs. Future work can further explore the use of CycleGAN for other medical image analysis tasks and investigate how to optimize model performance.
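The key objective the abstract describes is the cycle-consistency loss with weight λ = 10: translating an image to the other modality and back should reconstruct the original. As a minimal sketch (using NumPy, with toy affine maps standing in for the actual generator networks, which are not specified here), the loss can be computed as:

```python
import numpy as np

LAMBDA_CYC = 10.0  # cycle-consistency weight (lambda) reported in the abstract

def cycle_consistency_loss(x_mri, y_ct, g_mri_to_ct, f_ct_to_mri, lam=LAMBDA_CYC):
    """L1 cycle loss: x -> G(x) -> F(G(x)) should reconstruct x,
    and y -> F(y) -> G(F(y)) should reconstruct y."""
    forward = np.mean(np.abs(f_ct_to_mri(g_mri_to_ct(x_mri)) - x_mri))
    backward = np.mean(np.abs(g_mri_to_ct(f_ct_to_mri(y_ct)) - y_ct))
    return lam * (forward + backward)

# Toy affine "generators" standing in for the CycleGAN networks;
# f is the exact inverse of g, so the cycle loss is numerically zero.
g = lambda x: 2.0 * x + 1.0    # hypothetical MRI -> CT map
f = lambda y: (y - 1.0) / 2.0  # hypothetical CT -> MRI map

rng = np.random.default_rng(0)
x_mri = rng.random((4, 64, 64))  # fake batch of MRI slices
y_ct = rng.random((4, 64, 64))   # fake batch of CT slices
loss = cycle_consistency_loss(x_mri, y_ct, g, f)
```

In the paper's actual setup, `g` and `f` would be convolutional generator networks trained jointly with two discriminators via the Adam optimizer (learning rate 0.0002), and this cycle term would be added to the adversarial losses.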
ISSN: 1742-6588 (print); 1742-6596 (online)
DOI: 10.1088/1742-6596/2646/1/012016