New architecture of deep recursive convolution networks for super-resolution
Published in: Knowledge-Based Systems, 2019-08, Vol. 178, pp. 98-110
Main authors: ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Most existing single-image super-resolution (SR) methods super-resolve the details directly, but when the upsampling factor is large, reconstructing high-frequency details is challenging. Recently, deep convolutional neural networks have made significant progress on SR. However, as network width and depth increase, the information available for reconstruction becomes weaker and the networks become harder to train. This paper proposes a novel deep recursive convolutional neural network architecture that reconstructs a high-resolution image from an original low-resolution (LR) image in a step-by-step manner. The architecture consists of three parts: an embedding network, cascaded fine extraction blocks, and a reconstruction network. Concretely, a wide convolution extracts rich features from the original LR image, cascaded fine extraction blocks extract further useful information step by step while removing redundant information, and a deconvolution operation restores the features to high resolution. The proposed network adopts a residual-feature learning scheme, and the Caffe framework is used for training. Experimental results show that the proposed method outperforms various other state-of-the-art methods.
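The abstract only outlines the three-part pipeline, so the following is a minimal illustrative sketch of that structure in PyTorch (the paper itself was trained with Caffe). The class names, the internals of `FineExtractionBlock`, and all channel counts, kernel sizes, and block counts are assumptions for illustration; only the overall flow (wide-convolution embedding, cascaded fine extraction blocks with residual-feature learning, deconvolution-based reconstruction) follows the description above.

```python
import torch
import torch.nn as nn

class FineExtractionBlock(nn.Module):
    """One cascaded fine-extraction block (hypothetical structure):
    two 3x3 convolutions with a local residual connection, intended to
    refine features step by step and suppress redundant information."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Residual-feature learning: add the block input back to its output.
        return self.relu(x + self.body(x))

class DRCNSketch(nn.Module):
    """Sketch of the three-part design described in the abstract:
    embedding network -> cascaded fine extraction blocks -> reconstruction
    via deconvolution. Widths, depths, and kernel sizes are assumed values,
    not the paper's."""
    def __init__(self, scale=2, channels=64, num_blocks=4):
        super().__init__()
        # Embedding: a "wide" convolution extracting features from the LR image.
        self.embed = nn.Sequential(
            nn.Conv2d(1, channels, 5, padding=2),  # 5x5 kernel (assumed width)
            nn.ReLU(inplace=True),
        )
        # Cascaded fine extraction blocks, applied step by step.
        self.blocks = nn.Sequential(
            *[FineExtractionBlock(channels) for _ in range(num_blocks)]
        )
        # Reconstruction: a deconvolution (transposed convolution) upsamples
        # the extracted features to the high-resolution output.
        self.reconstruct = nn.ConvTranspose2d(
            channels, 1, kernel_size=scale * 2, stride=scale, padding=scale // 2
        )

    def forward(self, lr):
        feats = self.embed(lr)
        feats = self.blocks(feats)
        return self.reconstruct(feats)

# Usage: super-resolve a single-channel 32x32 LR patch by a factor of 2.
model = DRCNSketch(scale=2)
hr = model(torch.randn(1, 1, 32, 32))  # -> shape (1, 1, 64, 64)
```

Training such a sketch would pair the output with an L1 or L2 reconstruction loss against the ground-truth HR image; the paper's actual block design and hyperparameters are given in the full text.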
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2019.04.021