Tensor Completion Via Collaborative Sparse and Low-Rank Transforms

Bibliographic Details
Published in: IEEE Transactions on Computational Imaging, 2021, Vol. 7, pp. 1289-1303
Main Authors: Li, Ben-Zheng; Zhao, Xi-Le; Wang, Jian-Li; Chen, Yong; Jiang, Tai-Xiang; Liu, Jun
Format: Article
Language: English
Description
Abstract: The transform-based tensor nuclear norm (TNN) methods have recently yielded promising results for tensor completion. The primary goal of these methods is to exploit the low-rank structure of frontal slices under a transform along the third mode. However, these methods typically neglect that a third-mode fiber of a tensor is a one-dimensional signal, which is sparse under some transforms. In this study, we suggest a collaborative sparse and low-rank transforms model, called CSLRT, for third-order tensor completion. We simultaneously exploit the sparsity of third-mode fibers and the low-rankness of frontal slices under learned transforms. In our work, the transformed sparsity is complementary to the transformed low-rankness; the two are organically combined and benefit from each other. Moreover, for tensors with limited correlation along the third mode (e.g., color images), we suggest a three-directional CSLRT (3DCSLRT) to fully explore the transformed sparsity of fibers and the transformed low-rankness of slices along all three modes. To tackle the proposed models, we develop multi-block proximal alternating minimization (PAM) algorithms and establish their theoretical guarantees. Extensive experimental results on real-world data demonstrate that the proposed methods outperform state-of-the-art competitors both qualitatively and quantitatively.
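For concreteness, the following is a minimal NumPy sketch of the two regularizers the abstract combines, assuming the FFT and DCT as illustrative stand-ins for the learned transforms; in the actual CSLRT model the transforms are learned and both terms are coupled inside a PAM completion solver, which this sketch does not implement.

```python
import numpy as np
from scipy.fft import dct

def transformed_tnn(X):
    """Transform-based TNN: sum of nuclear norms of the frontal
    slices of X after a transform along the third mode (here the
    FFT; the paper learns the transform instead)."""
    Xt = np.fft.fft(X, axis=2)
    return sum(
        np.linalg.svd(Xt[:, :, k], compute_uv=False).sum()
        for k in range(Xt.shape[2])
    )

def transformed_fiber_sparsity(X):
    """L1 norm of the mode-3 fibers after a mode-3 DCT: each fiber
    is a 1-D signal that becomes sparse under a suitable transform
    (again, the paper learns this transform)."""
    return np.abs(dct(X, axis=2, norm="ortho")).sum()

# Illustrative composite objective on a random tensor; `lam` is a
# hypothetical weight balancing the two transformed priors.
rng = np.random.default_rng(0)
X = rng.standard_normal((32, 32, 16))
lam = 0.1
print(transformed_tnn(X) + lam * transformed_fiber_sparsity(X))
```

Intuitively, the slice term captures correlations across the first two modes while the fiber term captures structure along the third, which is why the abstract describes the two priors as complementary.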
ISSN: 2573-0436, 2333-9403
DOI: 10.1109/TCI.2021.3126232