CoNoT: Coupled Nonlinear Transform-Based Low-Rank Tensor Representation for Multidimensional Image Completion
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-07, Vol. 35 (7), pp. 8969-8983
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Recently, the transform-based tensor nuclear norm (TNN) methods have shown promising performance and drawn increasing attention in tensor completion (TC) problems. The main idea of these methods is to exploit the low-rank structure of frontal slices of the tensor under the transform. However, the transforms in TNN methods usually treat all modes equally and do not consider the different traits of different modes (i.e., spatial and spectral/temporal modes). To address this problem, we suggest a new low-rank tensor representation based on the coupled nonlinear transform (called CoNoT) for a better low-rank approximation. Concretely, the spatial and spectral/temporal transforms in the CoNoT respectively exploit the different traits of different modes and are coupled together to boost the implicit low-rank structure. Here, we use a convolutional neural network (CNN) as the CoNoT, which can be learned solely from an observed multidimensional image in an unsupervised manner. Based on this low-rank tensor representation, we build a new multidimensional image completion model. Moreover, we also propose an enhanced version (called Ms-CoNoT) to further exploit the spatial multiscale nature of real-world data. Extensive experiments on real-world data substantiate the superiority of the proposed models against many state-of-the-art methods both qualitatively and quantitatively.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3217198
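To make the idea summarized in the abstract concrete, below is a minimal PyTorch sketch of a CoNoT-style objective. It is not the authors' implementation: the `CoupledNonlinearTransform` module, the network widths and kernel sizes, the joint Adam optimization in `complete`, and the weight `lam` are all assumptions made for this illustration, and the safeguards the paper would need against a trivial transform are omitted. The sketch couples a spatial 2-D CNN with a spectral 1-D CNN by composition and penalizes the nuclear norms of the frontal slices of the transformed tensor, as in transform-based TNN models.

```python
import torch
import torch.nn as nn


class CoupledNonlinearTransform(nn.Module):
    """Toy coupled nonlinear transform: a spatial 2-D CNN and a spectral
    1-D CNN composed in sequence, so the two mode-specific transforms are
    coupled. Widths and kernel sizes are illustrative guesses."""

    def __init__(self, bands: int, hidden: int = 32):
        super().__init__()
        # Spatial transform: 2-D convolutions over the H x W slices.
        self.spatial = nn.Sequential(
            nn.Conv2d(bands, hidden, 3, padding=1),
            nn.LeakyReLU(0.1),
            nn.Conv2d(hidden, bands, 3, padding=1),
        )
        # Spectral transform: 1-D convolutions along the band mode.
        self.spectral = nn.Sequential(
            nn.Conv1d(1, hidden, 3, padding=1),
            nn.LeakyReLU(0.1),
            nn.Conv1d(hidden, 1, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, w, b = x.shape                                    # (H, W, B) image
        y = self.spatial(x.permute(2, 0, 1).unsqueeze(0)).squeeze(0)
        y = y.permute(1, 2, 0).reshape(h * w, 1, b)          # pixels as batch
        return self.spectral(y).reshape(h, w, b)


def frontal_slice_nuclear_norm(t: torch.Tensor) -> torch.Tensor:
    """Sum of nuclear norms of the frontal slices t[:, :, k]: the
    low-rank surrogate that transform-based TNN models penalize."""
    return sum(torch.linalg.matrix_norm(t[:, :, k], ord="nuc")
               for k in range(t.shape[2]))


def complete(obs: torch.Tensor, mask: torch.Tensor,
             iters: int = 500, lam: float = 0.01, lr: float = 1e-3):
    """Fit the completed tensor and the transform jointly with Adam.
    A plain gradient-descent simplification of the paper's scheme; it
    omits safeguards against a degenerate (e.g., all-zero) transform."""
    x = obs.clone().requires_grad_(True)
    net = CoupledNonlinearTransform(obs.shape[2])
    opt = torch.optim.Adam([x] + list(net.parameters()), lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        fit = ((x - obs) * mask).pow(2).sum()           # observed entries only
        low_rank = frontal_slice_nuclear_norm(net(x))   # low-rank after transform
        (fit + lam * low_rank).backward()
        opt.step()
    return x.detach()
```

Under these assumptions, `complete(obs, mask)` takes a partially observed (H, W, B) tensor and a binary mask of the same shape and returns a completed estimate whose transformed frontal slices are encouraged to be low-rank; the Ms-CoNoT variant mentioned in the abstract would additionally apply such transforms at several spatial scales.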