Single Image Defogging Based on Multi-Channel Convolutional MSRCR


Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 72492-72504
Main Authors: Zhang, Weidong; Dong, Lili; Pan, Xipeng; Zhou, Jingchun; Qin, Li; Xu, Wenhai
Format: Article
Language: English
Description
Abstract: To address image degradation in foggy weather, a single-image defogging method based on multi-scale retinex with color restoration (MSRCR) with multi-channel convolution (MC) is proposed. The defogging process consists of four key parts: estimation of the illumination component, guided filtering, reconstruction of the fog-free image, and a white-balance operation. First, multi-scale Gaussian kernels are employed to extract precise features for estimating the illumination component, after which the MSRCR method is applied to enhance the image's global contrast, detail, and color. Second, the smoothing constraints on both the illumination and reflectance components are enforced by applying the guided filter twice, so that the enhanced image satisfies the smoothing constraint and its noise is reduced. Third, the image enhanced by MSRCR and the image produced by the second guided-filter pass are fused by linear weighting to reconstruct the final fog-free image. Finally, to eliminate the influence of illumination on the color of the defogged image, the result is processed by white balance. Experimental results demonstrate that the proposed method outperforms state-of-the-art methods in both qualitative and quantitative comparisons.
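The core enhancement step described in the abstract can be sketched as follows. This is a minimal illustration of multi-scale retinex with color restoration, not the paper's implementation: the scale set, the color-restoration constants (`alpha`, `beta`), and the gain/offset (`G`, `b`) are commonly used default values assumed here, and the guided-filter fusion and white-balance stages are omitted.

```python
import numpy as np

def gaussian_blur(channel, sigma):
    """Separable Gaussian blur of a 2D array; one-scale illumination estimate."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    padded = np.pad(channel, radius, mode='edge')
    # Convolve rows, then columns; 'valid' restores the original shape.
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, rows)

def msrcr(img, sigmas=(15, 80, 250), alpha=125.0, beta=46.0, G=5.0, b=25.0):
    """MSRCR sketch for an RGB uint8 image; parameter defaults are assumptions."""
    img = img.astype(np.float64) + 1.0          # avoid log(0)
    msr = np.zeros_like(img)
    for sigma in sigmas:                        # multi-scale retinex: average of
        for c in range(3):                      # log(image) - log(blurred image)
            illum = gaussian_blur(img[..., c], sigma)
            msr[..., c] += (np.log(img[..., c]) - np.log(illum)) / len(sigmas)
    # Color restoration factor weights each channel by its share of intensity.
    crf = beta * (np.log(alpha * img) - np.log(img.sum(axis=2, keepdims=True)))
    out = G * (crf * msr + b)
    # Linear stretch to [0, 255] for display.
    out = (out - out.min()) / (out.max() - out.min() + 1e-12) * 255.0
    return out.astype(np.uint8)
```

In the paper's pipeline this output would then be guided-filtered, fused by linear weighting with the filtered result, and white-balanced; here only the retinex enhancement itself is shown.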
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2920403