Illumination Estimation and Compensation of Low Frame Rate Video Sequences for Wavelet-Based Video Compression
Published in: IEEE Transactions on Image Processing, 2019-09, Vol. 28 (9), p. 4313-4327
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: In this paper, we are interested in the compression of image sets or video with considerable changes in illumination. We develop a framework to decompose frames into illumination fields and texture in order to achieve sparser representations of frames, which is beneficial for compression. Illumination variations or contrast ratio factors among frames are described by a full-resolution multiplicative field. First, we propose a Lifting-based Illumination Adaptive Transform (LIAT) framework which incorporates illumination compensation into temporal wavelet transforms. We estimate a full-resolution illumination field, accounting for its spatial sparsity, through a rate-distortion (R-D) driven framework. An affine mesh model is also developed as a point of comparison. We find the operational coding cost of the subband frames by modeling a typical t + 2D wavelet video coding system. While our general findings on R-D optimization are applicable to a range of coding frameworks, in this paper we report results based on JPEG 2000 coding tools. The experimental results highlight the benefits of the proposed R-D driven illumination estimation and compensation in comparison with alternative scalable coding methods and with non-scalable AVC and HEVC coding schemes employing weighted prediction.
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2019.2905756
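
The abstract describes a temporal lifting wavelet transform in which each frame pair is related by a full-resolution multiplicative illumination field. The sketch below is only a minimal illustration of that idea, assuming a single Haar-style predict/update lifting step and an already known illumination field; the function names, the particular predict/update operators, and the synthetic data are assumptions for illustration, not the paper's actual LIAT scheme or its R-D driven field estimation.

```python
import numpy as np

def illumination_compensated_haar_lift(x_even, x_odd, illum_field, eps=1e-6):
    """One temporal Haar-style lifting step with multiplicative illumination
    compensation between a pair of frames (illustrative sketch only)."""
    # Predict: the odd frame is predicted from the illumination-scaled even
    # frame; the residual becomes the temporal high-pass subband.
    highpass = x_odd - illum_field * x_even
    # Update: the low-pass subband adds back half of the inverse-compensated
    # high-pass, which keeps the lifting structure exactly invertible.
    lowpass = x_even + 0.5 * highpass / (illum_field + eps)
    return lowpass, highpass

def inverse_lift(lowpass, highpass, illum_field, eps=1e-6):
    """Exact inverse of the lifting step above (perfect reconstruction)."""
    x_even = lowpass - 0.5 * highpass / (illum_field + eps)
    x_odd = highpass + illum_field * x_even
    return x_even, x_odd

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame0 = rng.random((64, 64))
    # Synthetic next frame: the same texture under a smooth brightness ramp
    # (a stand-in for an estimated multiplicative illumination field).
    gain = 1.0 + 0.3 * np.linspace(0.0, 1.0, 64)[None, :]
    frame1 = gain * frame0

    lo, hi = illumination_compensated_haar_lift(frame0, frame1, gain)
    print("high-pass energy with compensation:   ", float(np.square(hi).sum()))
    print("high-pass energy without compensation:", float(np.square(frame1 - frame0).sum()))

    rec0, rec1 = inverse_lift(lo, hi, gain)
    assert np.allclose(rec0, frame0) and np.allclose(rec1, frame1)
```

When the illumination field matches the actual brightness change, the compensated high-pass subband has far less energy than a plain frame difference, which is the sparsity effect the abstract exploits for compression; the real method additionally estimates and codes the field itself under a rate-distortion criterion.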