Dual Projection Fusion for Reference-Based Image Super-Resolution
Saved in:
Published in: | Sensors (Basel, Switzerland), 2022-05, Vol.22 (11), p.4119 |
Main Authors: | , |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full text |
Abstract: | Reference-based image super-resolution (RefSR) methods have achieved performance superior to that of single image super-resolution (SISR) methods by transferring texture details from an additional high-resolution (HR) reference image to the low-resolution (LR) image. However, existing RefSR methods simply add or concatenate the transferred texture feature with the LR features, which cannot effectively fuse the information of these two independently extracted features. Therefore, this paper proposes a dual projection fusion for reference-based image super-resolution (DPFSR), which enables the network to focus on the differences between the two feature sources through inter-residual projection operations, ensuring that detailed information is effectively filled into the LR feature. Moreover, this paper also proposes a novel backbone, the deep channel attention connection network (DCACN), which extracts valuable high-frequency components from the LR space to further facilitate image reconstruction. Experimental results show that the proposed method achieves the best peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) performance compared with state-of-the-art (SOTA) SISR and RefSR methods, and visual results demonstrate that it recovers more natural and realistic texture details. |
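The abstract describes the fusion mechanism only at a high level. Below is a minimal, hypothetical PyTorch sketch of a dual-projection-style fusion block, written under the assumption that "inter-residual projection" means projecting the residuals between the reference texture feature and the LR feature in both directions before fusing them back into the LR feature; the module, layer, and parameter names are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of a dual-projection-style fusion block (not the authors' code).
# Assumption: "inter-residual projection" is read as projecting the residual between
# the reference texture feature and the LR feature in both directions, so the fusion
# focuses on the information that differs between the two feature sources.
import torch
import torch.nn as nn


class DualProjectionFusion(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Projection convolutions for the two residual directions (names are illustrative).
        self.proj_ref_to_lr = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.proj_lr_to_ref = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, lr_feat: torch.Tensor, ref_feat: torch.Tensor) -> torch.Tensor:
        # Residual ref - lr highlights texture detail missing from the LR feature.
        detail = self.act(self.proj_ref_to_lr(ref_feat - lr_feat))
        # Residual lr - ref highlights LR content absent from the reference feature.
        context = self.act(self.proj_lr_to_ref(lr_feat - ref_feat))
        # Fuse both projected residuals and add them back onto the LR feature.
        fused = self.fuse(torch.cat([detail, context], dim=1))
        return lr_feat + fused


if __name__ == "__main__":
    lr = torch.randn(1, 64, 40, 40)
    ref = torch.randn(1, 64, 40, 40)
    out = DualProjectionFusion(64)(lr, ref)
    print(out.shape)  # torch.Size([1, 64, 40, 40])
```

The residual-based formulation is one plausible reading of the abstract's claim that the network "focuses on the different information between feature sources"; the published DPFSR architecture may differ in detail.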
ISSN: | 1424-8220 |
DOI: | 10.3390/s22114119 |