TCSR: Lightweight Transformer and CNN Interaction Network for Image Super-Resolution
Published in: IEEE Access, 2024, Vol. 12, pp. 174782-174795
Main authors: , , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Convolutional neural networks (CNNs) have achieved impressive success in lightweight image super-resolution (SR), yet the local nature of their operations constrains SR performance. Recently, Transformers have attracted increasing attention in lightweight SR owing to their remarkable global feature extraction capacity. However, their high computational cost makes it challenging for lightweight SR methods to efficiently use Transformers to exploit global contextual information from shallow to intermediate layers. To address these issues, we propose a novel lightweight Transformer and CNN interaction network for image super-resolution (TCSR), which fully leverages the complementary strengths of Transformers and CNNs. Specifically, an efficient lightweight Transformer and CNN Interaction Block (TCIB) is designed to extract local and global features at various stages of the network, producing favorable hybrid features that significantly improve the quality of reconstructed images. We then construct a lightweight Reversed UNet (RUNet) to progressively aggregate the hybrid features and to better trade off reconstruction accuracy against efficiency. Furthermore, we introduce a Refinement module that uses global information to further refine edge and texture details. Experimental results on numerous benchmarks validate that the proposed TCSR achieves superior performance with fewer parameters and less computational overhead than state-of-the-art lightweight methods.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3476369
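To make the abstract's hybrid local-global feature idea concrete, below is a minimal PyTorch sketch of how a block might run a local convolutional branch alongside a global self-attention branch and fuse the two. This is an illustrative assumption only: the class name, branch designs, and fusion scheme here are not the paper's TCIB, RUNet, or Refinement module.

```python
# Minimal, illustrative sketch (not the authors' TCIB) of fusing a local CNN
# branch with a global self-attention branch. All names are assumptions.
import torch
import torch.nn as nn


class HybridInteractionBlock(nn.Module):
    """Toy hybrid block: local depthwise-conv branch + global attention branch."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        # Local branch: depthwise + pointwise convolutions capture neighborhood detail.
        self.local = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels),
            nn.Conv2d(channels, channels, 1),
            nn.GELU(),
        )
        # Global branch: multi-head self-attention over all spatial positions.
        # (Real lightweight SR models typically restrict attention to windows
        # or channels to keep the cost manageable.)
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        # Fusion: concatenate both branches and project back to the input width.
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local = self.local(x)
        # Flatten H*W into a token sequence for self-attention, then restore shape.
        tokens = self.norm(x.flatten(2).transpose(1, 2))       # (B, H*W, C)
        global_, _ = self.attn(tokens, tokens, tokens)
        global_ = global_.transpose(1, 2).reshape(b, c, h, w)
        # Residual connection around the fused hybrid features.
        return x + self.fuse(torch.cat([local, global_], dim=1))


if __name__ == "__main__":
    block = HybridInteractionBlock(channels=32)
    out = block(torch.randn(1, 32, 48, 48))
    print(out.shape)  # torch.Size([1, 32, 48, 48])
```

In a full SR network, several such hybrid blocks would typically be stacked before an upsampling head; how TCSR actually arranges and aggregates them (via RUNet and the Refinement module) is described in the paper itself.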