Cross Parallax Attention Network for Stereo Image Super-Resolution

Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2022, Vol. 24, pp. 202-216
Main authors: Chen, Canqiang, Qing, Chunmei, Xu, Xiangmin, Dickinson, Patrick
Format: Article
Language: English
Description
Abstract: Stereo super-resolution (SR) aims to enhance the spatial resolution of one camera view using additional information from the other. Previous deep-learning-based stereo SR methods improved SR performance effectively by employing this additional information, but they are unable to super-resolve stereo images with large disparities or different types of epipolar lines. Moreover, in these methods, one model can only super-resolve images of a particular view, and for one specific scale factor. This paper proposes a cross parallax attention stereo super-resolution network (CPASSRnet) which can perform stereo SR at multiple scale factors for both views with a single model. To overcome the difficulties of large disparity and different types of epipolar lines, a cross parallax attention module (CPAM) is presented, which captures the global correspondence of additional information for each view relative to the other. CPAM allows the two views to exchange additional information with each other according to the generated attention maps. Quantitative and qualitative comparisons with state-of-the-art methods illustrate the superiority of CPASSRnet. Ablation experiments demonstrate that the proposed components are effective, and noise tests verify the robustness of CPASSRnet.
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2021.3050092
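The cross-view attention idea described in the abstract can be sketched roughly as follows. This is a minimal, generic cross-attention illustration in NumPy, not the authors' actual CPAM: the function name, the single-head design, the flattened feature shapes, and the residual fusion are all assumptions for the sake of a self-contained example.

```python
import numpy as np

def cross_view_attention(feat_a, feat_b):
    """Let view A attend to every spatial position of view B.

    feat_a, feat_b: (H*W, C) flattened feature maps of the two stereo views.
    Returns view-A features augmented with information aggregated from view B.
    Generic cross-attention sketch; not the exact CPAM design from the paper.
    """
    c = feat_a.shape[1]
    # Similarity between every position in A and every position in B.
    # Because the attention is global over B's positions, it is not limited
    # to a fixed disparity range or a single epipolar-line direction.
    scores = feat_a @ feat_b.T / np.sqrt(c)          # (H*W, H*W)
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)          # softmax over B's positions
    # Aggregate view-B features according to the attention map.
    transferred = attn @ feat_b                      # (H*W, C)
    # Residual fusion: keep A's own features and add what B contributes.
    return feat_a + transferred

# Toy example: two 4x4 feature maps with 8 channels, flattened to (16, 8).
rng = np.random.default_rng(0)
left = rng.standard_normal((16, 8))
right = rng.standard_normal((16, 8))
out = cross_view_attention(left, right)
print(out.shape)  # (16, 8)
```

In the paper's setting this exchange is applied in both directions, so each view receives complementary detail from the other; the sketch above shows only one direction.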