A deep convolutional neural network for salt-and-pepper noise removal using selective convolutional blocks


Bibliographic Details
Published in: Applied Soft Computing, September 2023, Vol. 145, Article 110535, p. 110535
Authors: Rafiee, Ahmad Ali; Farhang, Mahmoud
Format: Article
Language: English
Online Access: Full text
Description
Abstract: In recent years, there has been an unprecedented upsurge in applying deep learning approaches, specifically convolutional neural networks (CNNs), to image denoising problems, owing to their superior performance. However, CNN-based denoisers have mostly targeted Gaussian noise, and CNNs remain largely unexploited for salt-and-pepper (SAP) noise reduction. In this paper, we propose a deep CNN model, SeConvNet, to suppress SAP noise in gray-scale and color images. To this end, we introduce a new selective convolutional (SeConv) block. SeConvNet is compared with state-of-the-art SAP denoising methods through extensive experiments on several common datasets. The results show that the proposed SeConvNet model effectively restores images corrupted by SAP noise and surpasses all its counterparts in terms of both quantitative criteria and visual quality, especially at high and very high noise densities.
Highlights:
• A convolutional neural network (SeConvNet) is proposed to remove salt-and-pepper noise.
• A new selective convolutional (SeConv) block is employed in SeConvNet.
• SeConvNet can denoise both gray-scale and color images.
• SeConvNet is highly effective at high and very high noise densities.
• SeConvNet outperforms other state-of-the-art salt-and-pepper denoising methods.
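For readers unfamiliar with the noise model, the following is a minimal NumPy sketch of how salt-and-pepper noise at a given density corrupts a gray-scale image, together with the simple 0/255 mask that a "selective" processing stage could restrict itself to. The function names (`add_sap_noise`, `noise_mask`) and the masking interpretation of "selective" are illustrative assumptions, not the paper's actual SeConv block.

```python
import numpy as np

def add_sap_noise(image, density, rng=None):
    """Corrupt a uint8 gray-scale image (values 0-255) with salt-and-pepper noise.

    A fraction `density` of pixels is set to 0 (pepper) or 255 (salt) with
    equal probability; all remaining pixels are left untouched.
    """
    rng = np.random.default_rng() if rng is None else rng
    noisy = image.copy()
    corrupted = rng.random(image.shape) < density   # which pixels to corrupt
    salt = rng.random(image.shape) < 0.5            # salt vs. pepper split
    noisy[corrupted & salt] = 255
    noisy[corrupted & ~salt] = 0
    return noisy

def noise_mask(noisy):
    """Mark pixels that look corrupted (exactly 0 or 255).

    A selective stage could update only these positions and pass the
    presumably noise-free pixels through unchanged; this is an assumed
    reading of "selective", not the published SeConv design.
    """
    return (noisy == 0) | (noisy == 255)
```

Note that at high noise densities this 0/255 test also flags genuinely black or white pixels, which is one reason learned restoration can outperform purely rule-based filtering.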
ISSN: 1568-4946
eISSN: 1872-9681
DOI: 10.1016/j.asoc.2023.110535