New Bidirectional Motion Estimation Using Mesh-Based Frame Interpolation for Videoconferencing Applications
Format: Conference paper
Language: English
Abstract: To overcome these drawbacks, a new bidirectional motion estimation algorithm for videoconferencing applications at very low bit rates is proposed. The approach is based on spatio-temporal spline interpolation and proceeds as follows. Some frames in the original video sequence are deliberately skipped; these frames are then predicted at the decoder side using only the transmitted frames, with no additional information. The motion of each selected and decoded reference object in the scene is modeled by a rectangular deformable mesh grid, constructed by a coarse-to-fine quad-tree tiling algorithm. An optimization step adapts the mesh nodes to the object gradient, thereby minimizing the error of the reconstructed object in acceptable computational time. From these nodes, a temporal cubic spline interpolation predicts the mesh node positions of the moving object in the skipped frame, thereby reconstructing the meshed object, while the background of the current frame is likewise interpolated with a temporal cubic spline. The proposed approach is integrated into the H.264/AVC video coding standard. Simulation tests on videoconferencing sequences at very low bit rates show that the method gives favorable rate-distortion results compared to the H.264/AVC video coding standard, since no additional information is sent to the decoder.
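The temporal prediction step described in the abstract — fitting a cubic spline to a mesh node's positions in the transmitted frames and evaluating it at a skipped frame — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the frame indices and node coordinates are hypothetical, a standard natural cubic spline (interpolated per coordinate, in plain Python) stands in for whatever spline variant the paper uses, and the function name `natural_cubic_spline` is ours.

```python
def natural_cubic_spline(ts, ys):
    """Return an evaluator for the natural cubic spline through (ts, ys).

    ts: strictly increasing knot positions (e.g. frame indices of
        transmitted frames); ys: the node coordinate at each knot.
    """
    n = len(ts) - 1
    h = [ts[i + 1] - ts[i] for i in range(n)]

    # Tridiagonal solve (Thomas algorithm) for the spline's second-derivative
    # coefficients c[i], with natural boundary conditions c[0] = c[n] = 0.
    alpha = [0.0] * (n + 1)
    for i in range(1, n):
        alpha[i] = 3 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    l = [1.0] * (n + 1)
    mu = [0.0] * (n + 1)
    z = [0.0] * (n + 1)
    for i in range(1, n):
        l[i] = 2 * (ts[i + 1] - ts[i - 1]) - h[i - 1] * mu[i - 1]
        mu[i] = h[i] / l[i]
        z[i] = (alpha[i] - h[i - 1] * z[i - 1]) / l[i]

    # Back-substitution for the per-interval cubic coefficients.
    c = [0.0] * (n + 1)
    b = [0.0] * n
    d = [0.0] * n
    for j in range(n - 1, -1, -1):
        c[j] = z[j] - mu[j] * c[j + 1]
        b[j] = (ys[j + 1] - ys[j]) / h[j] - h[j] * (c[j + 1] + 2 * c[j]) / 3
        d[j] = (c[j + 1] - c[j]) / (3 * h[j])

    def eval_at(t):
        # Locate the interval containing t and evaluate its cubic piece.
        j = next(i for i in range(n) if t <= ts[i + 1] or i == n - 1)
        dt = t - ts[j]
        return ys[j] + b[j] * dt + c[j] * dt * dt + d[j] * dt ** 3

    return eval_at


# Hypothetical example: one mesh node's x-coordinate observed in transmitted
# frames 0, 2, 4, 6; predict its position in skipped frame 3.
frames = [0, 2, 4, 6]
xs = [10.0, 12.5, 16.0, 18.0]
spline_x = natural_cubic_spline(frames, xs)
x_at_3 = spline_x(3)  # lies between the neighboring knot values
```

In the scheme the abstract describes, the same interpolation would be applied independently to the x and y coordinates of every mesh node, so the skipped frame's deformed mesh is recovered at the decoder without transmitting any motion side information.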
ISSN: 1068-0314, 2375-0359
DOI: 10.1109/DCC.2008.104