Compressed Tensor Completion: A Robust Technique for Fast and Efficient Data Reconstruction in Wireless Sensor Networks

Bibliographic Details
Published in: IEEE Sensors Journal, 2022-06, Vol. 22 (11), pp. 10794-10807
Main authors: Sekar, K., Devi, K. Suganya, Srinivasan, P.
Format: Article
Language: English
Description
Abstract: In recent times, compressive sensing (CS) based data collection has become an essential technique for wireless sensor networks (WSNs) because of its low data-communication costs. This work develops a Compressed Tensor Completion (CTC) approach to recover the original signals effectively, and with better accuracy, from randomly projected sea surface temperature signals. Tensor learning approaches excel at identifying and extracting correlations in spatio-temporally compressed signals. The method first frames the data as a 3D tensor and then applies compressed matrix factorization using Randomized Singular Value Decomposition (RSVD). A general strategy defines the conditions between the matrix and tensor settings needed to obtain accurate signal measurements. The proposed approach is evaluated on Pacific sea surface temperature data and brain image data. The simulation results show that the proposed method yields higher accuracy at a lower sampling rate, minimizing data-transmission costs and extending sensor lifespan. In addition, CTC outperforms other approaches in classifying large-scale multi-dimensional sensor data.
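The abstract names Randomized Singular Value Decomposition (RSVD) as the factorization step applied to the matricized tensor. The sketch below illustrates the standard randomized-SVD technique (random Gaussian sketching, QR range finding, small SVD) applied to a mode-3 unfolding of a 3D tensor; it is a minimal illustration of the general RSVD method, not the authors' specific CTC algorithm, and the tensor dimensions, rank, and oversampling value are assumptions chosen for demonstration.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    """Approximate the top-`rank` singular triplets of A via a
    random Gaussian sketch (standard randomized-SVD recipe)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + oversample, min(m, n))
    # Sketch the range of A with a random test matrix.
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega
    # Orthonormal basis for the sketched range.
    Q, _ = np.linalg.qr(Y)
    # Project A onto the subspace and run a small dense SVD.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Hypothetical 3D sensor tensor (sites x sites x time), unfolded
# along the time mode, then factorized at low rank.
tensor = np.random.default_rng(1).standard_normal((20, 30, 50))
mat = tensor.reshape(20 * 30, 50)      # mode-3 unfolding
U, s, Vt = randomized_svd(mat, rank=5)
approx = U @ np.diag(s) @ Vt           # rank-5 approximation
print(approx.shape)                    # (600, 50)
```

Because the expensive SVD runs only on the small sketched matrix `B`, the cost scales with the target rank rather than the full signal dimension, which is what makes RSVD attractive for resource-constrained sensor-network data.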
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2022.3169226