NIRN: Self-supervised noisy image reconstruction network for real-world image denoising

Bibliographic Details
Published in: Applied intelligence (Dordrecht, Netherlands), 2022-11, Vol. 52 (14), p. 16683-16700
Main Authors: Li, Xiaopeng, Fan, Cien, Zhao, Chen, Zou, Lian, Tian, Sheng
Format: Article
Language: English
Subjects:
Online Access: Full text
Abstract: Existing image denoising methods for synthetic noise have made great progress. However, the distribution of real-world noise is more complicated, and it is difficult to obtain noise-free training images for deep learning. Although there have been a few attempts at training with only the input noisy images, they have not achieved satisfactory results in real-world image denoising. Based on various priors of noisy images, we propose a novel Noisy Image Reconstruction Network (NIRN) that achieves excellent performance with only a single input noisy image. The network is mainly composed of a clean image generator and a noise generator, which separate the input image into two latent layers: a noise-free layer and a noise layer. We constrain the two generators with a deep image prior and a noise prior, and train them adversarially under a reconstruction loss to exclude the possibility of overfitting. In addition, our method supports multi-frame image denoising, which makes full use of the randomness of noise across frames to obtain better results. Extensive experiments demonstrate the superiority of NIRN over the state of the art on both synthetic and real-world noise, in terms of both visual quality and quantitative metrics.
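
As an illustration of the decomposition described in the abstract, the following is a minimal PyTorch sketch (not the authors' code) in which two small generators reconstruct a single noisy image as the sum of a clean layer and a noise layer under a reconstruction loss. The architecture, the simple zero-mean noise regularizer, and names such as SmallConvNet and denoise_single_image are illustrative assumptions; the paper's actual generators, noise prior, and adversarial training are more elaborate.

import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    """Tiny convolutional generator mapping a fixed random code to an image (illustrative)."""
    def __init__(self, in_ch=32, out_ch=3, width=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, out_ch, 3, padding=1),
        )

    def forward(self, z):
        return self.body(z)

def denoise_single_image(noisy, steps=2000, lr=1e-3, device="cpu"):
    """noisy: (1, 3, H, W) tensor in [0, 1]; returns the estimated noise-free layer."""
    _, _, h, w = noisy.shape
    clean_gen = SmallConvNet().to(device)   # generates the noise-free layer
    noise_gen = SmallConvNet().to(device)   # generates the noise layer
    z_clean = torch.randn(1, 32, h, w, device=device)  # fixed random codes, in the
    z_noise = torch.randn(1, 32, h, w, device=device)  # style of deep-image-prior training
    opt = torch.optim.Adam(
        list(clean_gen.parameters()) + list(noise_gen.parameters()), lr=lr)
    noisy = noisy.to(device)
    for _ in range(steps):
        opt.zero_grad()
        clean = torch.sigmoid(clean_gen(z_clean))   # keep the clean layer in [0, 1]
        noise = noise_gen(z_noise)                  # residual noise layer
        # Reconstruction loss: the two latent layers must add back to the input.
        recon = ((clean + noise) - noisy).pow(2).mean()
        # Crude stand-in for a noise prior: push the noise layer toward zero mean.
        noise_reg = noise.mean().abs()
        (recon + 0.1 * noise_reg).backward()
        opt.step()
    return torch.sigmoid(clean_gen(z_clean)).detach()

Early stopping, the weighting of the regularizer, and the multi-frame extension (exploiting independent noise across frames) are omitted here for brevity.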
ISSN: 0924-669X; 1573-7497
DOI: 10.1007/s10489-022-03333-6