Adaptive CU partition and early skip mode detection for H.266/VVC
Published in: Multimedia Tools and Applications, 2021-04, Vol. 80 (9), pp. 13957-13973
Main authors:
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: The Joint Video Exploration Team (JVET) has begun developing the next-generation video coding standard, H.266/Versatile Video Coding (H.266/VVC), based on H.265/High Efficiency Video Coding (H.265/HEVC) to provide higher compression performance. H.266/VVC supports a flexible quadtree with nested multi-type tree (QTMT) partition structure, including quadtree (QT), binary tree (BT), and ternary tree (TT) splits. In the QTMT splitting structure, coding unit (CU) sizes range from 128 to 4 for the luma component and from 64 to 2 for the chroma component. The introduction of small CU sizes, i.e., 2×N, leads to inefficient hardware implementation because it causes pipeline delays and requires processing 2×N pixel blocks in the hardware architecture. In addition, inter or bi-prediction of small CUs requires higher memory bandwidth than bi-prediction of an 8×8 CU in H.266/VVC. To address these issues, we introduce a fast method to accelerate CU partitioning and mode decision, consisting of an adaptive CU partition method and an early skip mode detection method. The proposed algorithm has two parts: (1) adaptively removing 2×N CUs by skipping the BT and TT splitting modes that produce them; (2) early skipping of bi-prediction or inter prediction for small CUs. Experimental results demonstrate that the proposed scheme saves 47% of coding time while maintaining coding performance.
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-020-10252-6
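The abstract describes two acceleration steps: pruning the BT/TT splits that would create 2×N CUs, and early-terminating inter or bi-prediction for small CUs. The C++ sketch below illustrates how such checks might be structured in an encoder's mode-decision loop. It is only a sketch under assumptions: the split-mode enum, the allowSmallCU flag standing in for the paper's adaptive decision, and the sample-count thresholds are illustrative and are not the authors' actual implementation.

```cpp
// Hypothetical split types mirroring the QTMT structure described in the abstract.
enum class SplitMode { QT, BT_HOR, BT_VER, TT_HOR, TT_VER };

// Returns true if applying `split` to a CU of size width x height would create
// a sub-CU with a dimension of 2 (a 2xN or Nx2 block). Binary splits halve one
// dimension; ternary splits produce quarter-size side partitions.
static bool producesWidth2CU(SplitMode split, int width, int height) {
    switch (split) {
        case SplitMode::BT_HOR: return (height / 2) <= 2;
        case SplitMode::BT_VER: return (width  / 2) <= 2;
        case SplitMode::TT_HOR: return (height / 4) <= 2;
        case SplitMode::TT_VER: return (width  / 4) <= 2;
        default:                return false;   // QT is handled separately
    }
}

// Part (1) of the scheme as the abstract describes it: adaptively remove 2xN CUs
// by skipping the BT/TT split modes that would generate them. `allowSmallCU`
// stands in for the adaptive, content-dependent decision, which the abstract
// does not spell out.
bool shouldTestSplit(SplitMode split, int width, int height, bool allowSmallCU) {
    if (!allowSmallCU && producesWidth2CU(split, width, height))
        return false;   // prune this split from the rate-distortion search
    return true;
}

// Part (2): early-skip inter / bi-prediction for small CUs to cap worst-case
// memory bandwidth (the abstract compares against the 8x8 bi-prediction case).
// The 64- and 16-sample thresholds are assumptions for illustration only.
bool shouldSkipInterSearch(int width, int height, bool biPrediction) {
    const int numSamples = width * height;
    if (biPrediction && numSamples < 64) return true;   // skip bi-pred for CUs smaller than 8x8
    if (numSamples <= 16)                return true;   // skip inter search entirely for tiny CUs
    return false;
}
```

In a real encoder these predicates would gate the recursive partition search and the motion-estimation stage, so that pruned splits and skipped prediction modes never enter the rate-distortion comparison, which is where the reported coding-time saving would come from.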