RADD-CycleGAN: unsupervised reconstruction of high-quality ultrasound image based on CycleGAN with residual attention and dual-domain discrimination


Detailed Description

Bibliographic Details
Published in: Physics in medicine & biology, 2024-12, Vol. 69 (24), p. 245018
Main authors: Si, Mateng, Wu, Musheng, Wang, Qing
Format: Article
Language: English
Online access: Full text
Description
Abstract: Plane wave (PW) imaging is fast but limited by poor image quality. Coherent PW compounding (CPWC) improves image quality but decreases the frame rate. In this study, we propose a modified CycleGAN model that combines a residual attention module with a space-frequency dual-domain discriminator, termed RADD-CycleGAN, to rapidly reconstruct high-quality ultrasound images. To enhance the ability to reconstruct image details, we design a hybrid dynamic and static channel-selection process followed by the frequency-domain discriminator. The low-quality images are generated by 3-angle CPWC, while the high-quality real images (ground truth) are generated by 75-angle CPWC. The training set includes unpaired images, whereas the images in the test set are paired to verify the validity and superiority of the proposed model. Finally, we design ablation and comparison experiments to evaluate the model's performance. Compared with the basic CycleGAN, our proposed method achieves better performance, with a 7.8% increase in the peak signal-to-noise ratio and a 22.2% increase in the structural similarity index measure. The experimental results show that our method achieves the best unsupervised reconstruction from low-quality images in comparison with several state-of-the-art methods.
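The abstract reports the model's gains in peak signal-to-noise ratio (PSNR) and structural similarity index measure (SSIM) on paired test images. As a minimal sketch of how these two metrics are defined (using a simplified single-window SSIM rather than the usual sliding-window variant; this is not the paper's evaluation code):

```python
import numpy as np

def psnr(ref, img, max_val=1.0):
    """Peak signal-to-noise ratio (dB) between a reference and a test image."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def global_ssim(ref, img, max_val=1.0):
    """Single-window (global) SSIM: a simplification that computes the
    SSIM formula once over the whole image instead of per local window."""
    c1 = (0.01 * max_val) ** 2  # standard stabilizing constants
    c2 = (0.03 * max_val) ** 2
    x, y = ref.astype(np.float64), img.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

With paired test data, the reconstructed image would be scored against its 75-angle CPWC ground truth, e.g. `psnr(gt, recon)` and `global_ssim(gt, recon)`.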
ISSN:0031-9155
1361-6560
DOI:10.1088/1361-6560/ad997f
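The abstract's space-frequency dual-domain discriminator implies a frequency-domain representation of each image alongside the spatial one. A hedged sketch of one plausible spectral input, a centered 2-D FFT log-magnitude map (the record does not specify the paper's actual transform):

```python
import numpy as np

def log_magnitude_spectrum(img):
    """Centered 2-D FFT log-magnitude of an image: one common way to
    expose frequency content to a spectral discriminator branch.
    (Assumption: the paper's exact frequency transform is not given here.)"""
    f = np.fft.fftshift(np.fft.fft2(img.astype(np.float64)))
    return np.log1p(np.abs(f))  # log1p compresses the large dynamic range
```

The log compression keeps the dominant low-frequency energy from swamping the high-frequency detail that distinguishes 3-angle from 75-angle compounding.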