Stereoscopic Image Quality Assessment by Considering Binocular Visual Mechanisms
Published in: IEEE Access, 2018-01, Vol. 6, pp. 51337-51347
Format: Article
Language: English
Online access: Full text
Abstract: With the rapid growth of 3-D image processing in the entertainment industry and in 3-D multimedia applications today, the technology for assessing the quality of stereoscopic images faces more challenging tasks than its 2-D counterpart, such as binocular combination, stereo matching, and binocular rivalry. In this paper, a novel stereoscopic image quality assessment method is proposed that jointly exploits binocular fusion and rivalry models. In the quality-aware feature extraction stage, multi-scale binocular combination and binocular rivalry energy responses are first generated from the reference and distorted stereopairs. In addition, since changes in luminance strongly affect the perceived quality of stereoscopic images, multi-scale visual features related to image quality are extracted from the luminance maps as additional binocular combination and rivalry features. The dissimilarity of the quality-aware features between the reference stereopair and its distorted version is then quantified. Finally, these dissimilarities are mapped to an objective score representing the perceptual quality of the stereoscopic image through a support vector regression pooling strategy. Experiments conducted on the LIVE 3-D databases demonstrate that the proposed method achieves 96.61% and 96.03% in terms of Pearson's linear correlation coefficient on Database Phase I and Phase II, respectively, outperforming most state-of-the-art methods.
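The abstract outlines a pipeline of quality-aware feature extraction, feature dissimilarity measurement, and support-vector-regression pooling. The sketch below only illustrates that last pooling step under simplifying assumptions: the `multiscale_dissimilarity` helper, the averaged left/right views used as a stand-in for a binocular combination map, and the synthetic stereopairs and scores are all hypothetical placeholders, not the authors' actual features or the LIVE 3-D data.

```python
import numpy as np
from scipy.ndimage import zoom
from sklearn.svm import SVR


def multiscale_dissimilarity(ref_pair, dist_pair, n_scales=3):
    """Per-scale dissimilarity between a reference and a distorted stereopair.

    ref_pair / dist_pair are (left, right) tuples of 2-D grayscale arrays.
    The average of the left and right views stands in for a binocular
    combination map, and the per-scale mean absolute difference stands in
    for the dissimilarity of quality-aware features described in the paper.
    """
    feats = []
    for s in range(n_scales):
        f = 1.0 / (2 ** s)
        ref_comb = 0.5 * (zoom(ref_pair[0], f) + zoom(ref_pair[1], f))
        dst_comb = 0.5 * (zoom(dist_pair[0], f) + zoom(dist_pair[1], f))
        feats.append(np.mean(np.abs(ref_comb - dst_comb)))
    return np.array(feats)


rng = np.random.default_rng(0)

# Hypothetical training set: each sample is a reference stereopair plus a
# distorted version (the reference corrupted by Gaussian noise of a known
# strength), with the noise strength reused as a stand-in subjective score.
X_train, y_train = [], []
for _ in range(40):
    left = rng.random((64, 64))
    right = rng.random((64, 64))
    sigma = rng.uniform(0.0, 0.3)
    dist = (left + sigma * rng.standard_normal((64, 64)),
            right + sigma * rng.standard_normal((64, 64)))
    X_train.append(multiscale_dissimilarity((left, right), dist))
    y_train.append(sigma)

# Pooling stage: map the dissimilarity features to an objective score with
# support vector regression, as the abstract describes.
model = SVR(kernel='rbf', C=1.0, epsilon=0.01)
model.fit(np.vstack(X_train), np.array(y_train))

# Score a new distorted stereopair.
left = rng.random((64, 64))
right = rng.random((64, 64))
dist = (left + 0.1 * rng.standard_normal((64, 64)),
        right + 0.1 * rng.standard_normal((64, 64)))
print(model.predict(multiscale_dissimilarity((left, right), dist).reshape(1, -1)))
```

In the actual method, the feature vector would instead collect multi-scale binocular combination and rivalry energy dissimilarities (including the luminance-based features), and the regressor would be trained against subjective scores from the LIVE 3-D Phase I and Phase II databases.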
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2018.2869525