Video Motion Vector Recovery Method Using Decoding Partition Information
Published in: Journal of Display Technology, 2016-11, Vol. 12 (11), p. 1451-1463
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: This paper presents a novel motion vector recovery and error concealment algorithm that exploits encoding partition information for H.264/AVC. The motion vector for each missing pixel is derived from the motion vectors of available neighboring pixels, each contributing in inverse proportion to its distance from the missing pixel. A motion extrapolation step projects the encoding partition information from the reference frame into the current frame, producing different levels of overlap with the lost pixels; these overlap levels are used to estimate the partition layout inside the lost macroblock (MB). Finally, pixels assigned to the same estimated partition share the same motion vector, preserving the integrity of the estimated moving objects in the lost MB. The proposed pixel-based motion vector with partition (PMVP) method is compared with the state-of-the-art methods of Zhou, Lin, and Lie. In terms of total average, at packet loss rates of 3%, 7%, 16%, and 20%, PMVP outperforms Zhou's method by 0.88, 1.02, 1.05, and 1.01 dB, respectively; Lin's method by 0.22, 0.32, 0.35, and 0.33 dB, respectively; and Lie's method by 4.12, 4.98, 4.15, and 3.88 dB, respectively. The proposed PMVP therefore performs best on average among all the compared methods.
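As an illustration of the recovery step described in the abstract, the sketch below weights the motion vectors of available neighboring pixels inversely by their distance to a missing pixel. It is a minimal sketch under assumed names (`recover_pixel_mv`, `neighbor_mvs`), not the paper's implementation; the full PMVP method additionally forces pixels belonging to the same extrapolated partition to share one motion vector, which is omitted here.

```python
# Illustrative sketch only: inverse-distance weighting of neighboring motion
# vectors, as described in the abstract. Names are assumptions, not the
# paper's notation; the partition-sharing step of PMVP is not shown.
import math

def recover_pixel_mv(lost_pixel, neighbor_mvs):
    """Estimate the motion vector of a lost pixel.

    lost_pixel   -- (x, y) position of the missing pixel
    neighbor_mvs -- list of ((x, y), (mv_x, mv_y)) pairs from correctly
                    received neighboring pixels
    Each neighbor contributes in inverse proportion to its Euclidean
    distance from the lost pixel (closer neighbors weigh more).
    """
    if not neighbor_mvs:
        raise ValueError("need at least one neighboring motion vector")
    x0, y0 = lost_pixel
    weighted_x = weighted_y = total_weight = 0.0
    for (x, y), (mv_x, mv_y) in neighbor_mvs:
        dist = math.hypot(x - x0, y - y0)
        if dist == 0.0:
            return (mv_x, mv_y)        # co-located MV is already available
        w = 1.0 / dist                 # inverse-distance weight
        weighted_x += w * mv_x
        weighted_y += w * mv_y
        total_weight += w
    return (weighted_x / total_weight, weighted_y / total_weight)

# Example: a lost pixel at (8, 8) surrounded by three received neighbors.
print(recover_pixel_mv((8, 8), [((8, 0), (2.0, -1.0)),
                                ((0, 8), (1.0, 0.0)),
                                ((8, 15), (3.0, -2.0))]))
```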
ISSN: 1551-319X, 1558-9323
DOI: 10.1109/JDT.2016.2595640