Accurate and Fast Image Denoising via Attention Guided Scaling

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2021, Vol. 30, pp. 6255-6265
Main authors: Zhang, Yulun; Li, Kunpeng; Li, Kai; Sun, Gan; Kong, Yu; Fu, Yun
Format: Article
Language: English
Description
Summary: Image denoising is a classical topic yet still a challenging problem, especially for removing noise from textured regions. Feature scaling (e.g., downscaling and upscaling) is a widely used practice in image denoising to enlarge the receptive field and save resources. However, such a common operation loses some visually informative details. To address these problems, we propose fast and accurate image denoising via attention guided scaling (AGS). We find that the main informative feature channels and visual primitives should remain similar across scaling. We therefore propose extracting global channel-wise attention to maintain the main channel information. Moreover, we propose collecting global descriptors over the entire spatial feature and then distributing them to local positions of the scaled feature, based on their specific needs. We further introduce AGS into adversarial training, resulting in a more powerful discriminator. Extensive experiments show the effectiveness of our proposed method, which clearly surpasses the state-of-the-art methods on the most popular synthetic and real-world denoising benchmarks, both quantitatively and visually. We further show that our network benefits other high-level vision applications and significantly improves their performance.
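The core idea in the abstract (extract global channel-wise attention, then rescale channels before downscaling so that informative channels survive the scaling) can be sketched as follows. This is a minimal illustrative sketch with NumPy, not the paper's actual AGS module: the function names are invented, and the sigmoid gate stands in for the learned attention layers the authors would use.

```python
import numpy as np

def channel_attention(feat):
    """Global channel-wise attention in the squeeze-and-excitation style:
    pool each channel to one descriptor, squash it to (0, 1), and rescale
    the channel. Hypothetical stand-in for the paper's learned module."""
    # feat: (C, H, W) feature map
    # squeeze: global average pool per channel -> (C,)
    descriptors = feat.mean(axis=(1, 2))
    # excitation: sigmoid gate (the real module would use learned layers)
    gates = 1.0 / (1.0 + np.exp(-descriptors))
    # rescale channels so the main informative channels are emphasized
    return feat * gates[:, None, None]

def downscale_with_attention(feat):
    """Apply channel attention, then downscale by 2x average pooling."""
    gated = channel_attention(feat)
    c, h, w = gated.shape
    # 2x2 average pooling over non-overlapping blocks
    return gated.reshape(c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))
```

The point of gating before pooling is that channel importance is decided from the full-resolution feature, so the subsequent downscaling discards less of the information carried by the dominant channels.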
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2021.3093396