Context-aware Pseudo-true Video Interpolation at 6G Edge

Published in: ACM Transactions on Multimedia Computing, Communications, and Applications, 2022-11, Vol. 18 (3s), pp. 1-17, Article 133
Authors: Li, Ran; Wei, Wei; Hao, Peinan; Su, Jian; Sun, Fengyuan
Format: Article
Language: English
Online access: Full text
Abstract: In the 6G network, numerous edge devices facilitate the low-latency transmission of video. However, with their limited processing and storage capabilities, edge devices cannot afford to reconstruct the vast amount of video data. Under the edge-computing conditions of the 6G network, this article fuses a self-similarity-based context feature into Frame Rate Up-Conversion (FRUC) to generate pseudo-true video sequences at a high frame rate; the core of the method is the extraction of a context layer for each video frame. First, we extract the patch centered at each pixel and use the self-similarity descriptor to generate a correlation surface. Then, the statistical expectation or skewness of the correlation surface is computed to represent the pixel's context feature. By attaching an expectation or a skewness to each pixel, the context layer is constructed and added to the video frame as a new channel. Guided by the context layer, we predict the motion vector field of the absent frame using bidirectional context matching and finally produce the interpolated frame. The experimental results show that, with the proposed FRUC algorithm deployed on edge devices, the output pseudo-true video sequences achieve satisfactory objective and subjective quality.
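
As a rough illustration of the pipeline the abstract describes, the sketch below builds a per-pixel correlation surface with a self-similarity descriptor, reduces it to an expectation or skewness, stacks the result onto the frame as a context channel, and runs a simple bidirectional block match on the augmented frames. All function names, patch and search sizes, and the exponential similarity mapping are assumptions made for demonstration, not details taken from the paper.

```python
# Illustrative sketch of the context-layer pipeline; parameters are assumed.
import numpy as np

def correlation_surface(frame, y, x, patch_radius=2, surface_radius=4,
                        bandwidth=25.0):
    """Self-similarity descriptor: compare the patch centered at (y, x)
    against every patch in a local neighborhood, producing a surface of
    similarity scores in [0, 1]."""
    r, s = patch_radius, surface_radius
    p = frame[y - r:y + r + 1, x - r:x + r + 1]
    surface = np.empty((2 * s + 1, 2 * s + 1))
    for dy in range(-s, s + 1):
        for dx in range(-s, s + 1):
            q = frame[y + dy - r:y + dy + r + 1, x + dx - r:x + dx + r + 1]
            ssd = np.sum((p - q) ** 2)
            # Map SSD to a similarity score; bandwidth sets the sensitivity.
            surface[dy + s, dx + s] = np.exp(-ssd / (p.size * bandwidth ** 2))
    return surface

def context_feature(surface, use_skewness=False):
    """Summarize the correlation surface by the sample expectation (mean)
    of its values, or by their skewness."""
    v = surface.ravel()
    mean = v.mean()
    if not use_skewness:
        return mean
    sd = v.std()
    return np.mean((v - mean) ** 3) / (sd ** 3 + 1e-12)

def context_layer(frame, patch_radius=2, surface_radius=4, use_skewness=False):
    """Attach one context value to each interior pixel and stack the result
    onto the (grayscale) frame as a second channel."""
    frame = np.asarray(frame, dtype=np.float64)
    h, w = frame.shape
    layer = np.zeros((h, w))
    m = patch_radius + surface_radius
    for y in range(m, h - m):
        for x in range(m, w - m):
            surf = correlation_surface(frame, y, x, patch_radius, surface_radius)
            layer[y, x] = context_feature(surf, use_skewness)
    return np.dstack([frame, layer])

def bidirectional_match(prev_aug, next_aug, block=8, search=4):
    """Toy bidirectional match on context-augmented frames: for each block
    of the absent middle frame, find the symmetric offset (dy, dx) that best
    aligns prev at -offset with next at +offset; both channels enter the cost."""
    h, w, _ = prev_aug.shape
    mvf = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y0, x0 = by * block, bx * block
            best, best_mv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    ya, xa = y0 - dy, x0 - dx   # backward endpoint in prev
                    yb, xb = y0 + dy, x0 + dx   # forward endpoint in next
                    if (min(ya, xa, yb, xb) < 0 or max(ya, yb) + block > h
                            or max(xa, xb) + block > w):
                        continue
                    cost = np.sum((prev_aug[ya:ya + block, xa:xa + block]
                                   - next_aug[yb:yb + block, xb:xb + block]) ** 2)
                    if cost < best:
                        best, best_mv = cost, (dy, dx)
            mvf[by, bx] = best_mv
    return mvf
```

An interpolated frame could then be synthesized by motion-compensating both neighboring frames along the estimated vectors and averaging the two predictions per block; practical FRUC pipelines add overlap handling and hole filling that this sketch omits.
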
ISSN: 1551-6857, 1551-6865
DOI: 10.1145/3555313