EAA-Net: A novel edge assisted attention network for single image dehazing


Detailed Description

Bibliographic Details
Published in: Knowledge-Based Systems, 2021-09, Vol. 228, p. 107279, Article 107279
Authors: Wang, Chao; Shen, Hao-Zhen; Fan, Fan; Shao, Ming-Wen; Yang, Chuan-Sheng; Luo, Jian-Cheng; Deng, Liang-Jian
Format: Article
Language: English
Online access: Full text
Description
Abstract: Traditional dehazing convolutional neural networks (CNNs) learn the mapping only from hazy images to the corresponding haze-free ones, which usually leads to the loss of important features such as texture information. This paper proposes an effective edge assisted attention network (EAA-Net) for single image haze removal, aiming to preserve texture information and improve the overall quality of the dehazed results. The proposed EAA-Net mainly consists of three parts: a dehaze branch (DB), an edge branch (EB), and a feature fusion residual block (FFRB). To counter the loss of texture information during the dehazing process, the forward information of the EB is concatenated with the DB's information, and the fused result is passed to the FFRB. In addition, a multilevel information boost module is employed in the dehaze branch as a dehazing unit to enhance multilevel feature information for a better dehazing effect. Extensive experiments on benchmark datasets demonstrate that the proposed EAA-Net performs favorably against recent state-of-the-art approaches, with better preservation of detail and color and competitive results both quantitatively and qualitatively.
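
The abstract describes the branch fusion only at a high level. As a rough illustration of the idea, the following minimal PyTorch sketch shows one way edge-branch features might be concatenated with dehaze-branch features along the channel axis and passed through a residual fusion block; the FeatureFusionResidualBlock class, layer widths, and kernel sizes here are assumptions for illustration, not the paper's actual FFRB design.

import torch
import torch.nn as nn

class FeatureFusionResidualBlock(nn.Module):
    # Illustrative stand-in for the paper's FFRB: fuses concatenated
    # dehaze-branch and edge-branch features, then adds a residual
    # connection back to the dehaze-branch features.
    def __init__(self, channels):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),         # squeeze the concatenation back to `channels`
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),  # refine the fused features
        )

    def forward(self, dehaze_feat, edge_feat):
        fused = torch.cat([dehaze_feat, edge_feat], dim=1)  # concatenate along the channel dimension
        return dehaze_feat + self.fuse(fused)                # residual connection

# Toy usage with 64-channel feature maps from both branches.
db_feat = torch.randn(1, 64, 128, 128)   # dehaze-branch (DB) features
eb_feat = torch.randn(1, 64, 128, 128)   # edge-branch (EB) features
out = FeatureFusionResidualBlock(64)(db_feat, eb_feat)
print(out.shape)                         # torch.Size([1, 64, 128, 128])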
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2021.107279