State-of-art analysis of image denoising methods using convolutional neural networks
Published in: IET Image Processing, 2019-11, Vol. 13(13), pp. 2367-2380
Main authors:
Format: Article
Language: English
Abstract: Convolutional neural networks (CNNs) are deep neural networks that can be trained on large databases and show outstanding performance in tasks such as object classification, segmentation and image denoising. In the past few years, several image denoising techniques have been developed to improve the quality of an image. CNN-based image denoising models have shown better denoising performance than non-CNN methods such as block-matching and three-dimensional (3D) filtering, contemporary wavelet and Markov random field approaches, which had remained state-of-the-art for years. This study provides a comprehensive review of state-of-the-art image denoising methods using CNNs. The literature on different CNNs used for image restoration is reviewed, including residual learning based models (DnCNN-S, DnCNN-B, IDCNN), the non-locality reinforced network (NN3D), the fast and flexible network (FFDNet), the deep shrinkage CNN (SCNN), a model for mixed noise reduction, and the denoising prior driven network (PDNN). DnCNN-S and PDNN remove Gaussian noise of a fixed level, whereas DnCNN-B, IDCNN, NN3D and SCNN are used for blind Gaussian denoising. FFDNet handles spatially variant Gaussian noise. The performance of these CNN models is analysed on the BSD-68 and Set-12 datasets. PDNN gives the best results in terms of PSNR on both BSD-68 and Set-12.
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2019.0157
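
The abstract above mentions residual learning based denoisers (DnCNN-S, DnCNN-B) and evaluation by PSNR on BSD-68 and Set-12. As a rough illustration of the residual-learning idea, the sketch below builds a small DnCNN-style network in PyTorch that predicts the noise and subtracts it from the noisy input, together with a PSNR helper. The ResidualDenoiser class, the depth of 17 layers, the noise level of 0.1 and the choice of PyTorch are illustrative assumptions, not the exact configurations reviewed in the article.

```python
# Illustrative sketch only: a DnCNN-style residual denoiser and a PSNR helper.
# Layer counts, channel widths and noise level are assumptions for the demo,
# not the published architectures surveyed in the article.
import torch
import torch.nn as nn


class ResidualDenoiser(nn.Module):
    """DnCNN-style network: it predicts the noise residual, so the
    denoised image is (noisy input - predicted residual)."""

    def __init__(self, channels: int = 1, features: int = 64, depth: int = 17):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [
                nn.Conv2d(features, features, 3, padding=1, bias=False),
                nn.BatchNorm2d(features),
                nn.ReLU(inplace=True),
            ]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, noisy: torch.Tensor) -> torch.Tensor:
        residual = self.body(noisy)   # estimated noise
        return noisy - residual       # denoised image


def psnr(clean: torch.Tensor, denoised: torch.Tensor, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB, the metric used on BSD-68 / Set-12."""
    mse = torch.mean((clean - denoised) ** 2)
    return float(10.0 * torch.log10(max_val ** 2 / mse))


if __name__ == "__main__":
    torch.manual_seed(0)
    clean = torch.rand(1, 1, 64, 64)                # stand-in for a grayscale test image
    noisy = clean + 0.1 * torch.randn_like(clean)   # additive Gaussian noise, sigma = 0.1
    model = ResidualDenoiser()
    model.eval()
    with torch.no_grad():
        denoised = model(noisy)                     # untrained, so PSNR will be poor
    print(f"PSNR(noisy)    = {psnr(clean, noisy):.2f} dB")
    print(f"PSNR(denoised) = {psnr(clean, denoised):.2f} dB")
```

An actual comparison of the reviewed models would train such a network on noisy/clean patch pairs and then evaluate it on the BSD-68 and Set-12 test images; the untrained network here only demonstrates the residual-prediction data flow and the PSNR computation.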