CCR: Clustering and Collaborative Representation for Fast Single Image Super-Resolution


Bibliographic Details
Published in: IEEE Transactions on Multimedia, March 2016, Vol. 18 (3), pp. 405-417
Main Authors: Zhang, Yongbing; Zhang, Yulun; Zhang, Jian; Dai, Qionghai
Format: Article
Language: English
Description
Abstract: Clustering and collaborative representation (CCR) have recently been used in fast single image super-resolution (SR). In this paper, we propose an effective and fast single image SR algorithm that combines clustering and collaborative representation. In particular, we first cluster the feature space of low-resolution (LR) images into multiple LR feature subspaces and group the corresponding high-resolution (HR) feature subspaces. The local geometry learned during clustering is used to collect, for each cluster center, numerous neighboring LR and HR feature subsets from the whole feature spaces. Multiple projection matrices are then computed via collaborative representation to map the LR feature subspaces to the HR subspaces. For an arbitrary input LR feature, the desired HR output is estimated with the projection matrix whose LR cluster center is nearest to the input. Moreover, by learning statistical priors during clustering, our clustering-based SR algorithm further reduces the computational time of the reconstruction phase. Extensive experiments on commonly used datasets show that the proposed algorithm produces compelling SR results, both quantitatively and qualitatively, compared with many state-of-the-art methods.
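
To make the pipeline in the abstract concrete, below is a minimal sketch in Python (NumPy/scikit-learn) of per-cluster projection learning via collaborative representation, i.e., ridge regression with a closed-form solution, and nearest-center lookup at inference. Names such as lr_feats, hr_feats, n_clusters=64, and lam are illustrative assumptions, not the authors' settings, and unlike the paper this sketch fits each projection on the cluster's own members rather than on neighbor feature subsets gathered around each cluster center.

import numpy as np
from sklearn.cluster import KMeans

def train(lr_feats, hr_feats, n_clusters=64, lam=1e-3):
    # lr_feats: (N, d_lr) LR patch features; hr_feats: (N, d_hr) HR patch features.
    # Cluster the LR feature space, then learn one LR->HR projection per cluster.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(lr_feats)
    projections = []
    for c in range(n_clusters):
        idx = np.flatnonzero(km.labels_ == c)
        L = lr_feats[idx]            # (n_c, d_lr) LR features in cluster c
        H = hr_feats[idx]            # (n_c, d_hr) corresponding HR features
        # Collaborative representation / ridge regression closed form:
        # P_c = H^T L (L^T L + lam * I)^{-1}, mapping an LR feature to HR space.
        A = L.T @ L + lam * np.eye(L.shape[1])
        P = H.T @ L @ np.linalg.inv(A)   # (d_hr, d_lr)
        projections.append(P)
    return km, projections

def super_resolve(lr_feat, km, projections):
    # Pick the projection of the nearest LR cluster center and apply it.
    c = int(km.predict(lr_feat[None, :])[0])
    return projections[c] @ lr_feat

Because each projection matrix is precomputed offline, reconstruction reduces to one nearest-center search plus one matrix-vector product per feature, which is what makes this style of SR fast at test time.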
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2015.2512046