Learning a No-reference Quality Predictor of Stereoscopic Images by Visual Binocular Properties


Bibliographic Details
Published in: IEEE Access, 2019-01, Vol. 7, p. 1-1
Authors: Fang, Yuming; Yan, Jiebin; Wang, Jiheng; Liu, Xuelin; Zhai, Guangtao; Le Callet, Patrick
Format: Article
Language: English
Online access: Full text
Description
Abstract: In this work, we develop a novel no-reference (NR) quality assessment metric for stereoscopic images based on monocular and binocular features, motivated by two visual perception properties of the human visual system (HVS): binocular rivalry and binocular integration. More specifically, we first compute normalized intensity feature maps of the right- and left-view images through local contrast normalization, and extract statistical intensity features from the histogram of each normalized intensity feature map to represent monocular features. We then compute the disparity map of the stereoscopic image, from which we extract a structure feature map based on the local binary pattern (LBP). We further extract statistical structure features and statistical depth features from histograms of the structure feature map and the disparity map to represent binocular features. Finally, we adopt support vector regression (SVR) to train a mapping function from the extracted monocular and binocular features to subjective quality scores. Comparison experiments are conducted on four large-scale stereoscopic image databases, and the results demonstrate the promising performance of the proposed method in stereoscopic image quality assessment.
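The pipeline described in the abstract can be sketched in Python. This is an illustrative reconstruction, not the authors' implementation: the normalization window, histogram bin counts, the simple 8-neighbor LBP, and the use of the left view's normalized map as a stand-in structure map are all assumptions made here for a runnable example.

```python
# Hedged sketch of the abstract's pipeline: local contrast normalization
# (divisive normalization over a Gaussian neighborhood) for monocular
# intensity features, a basic 8-neighbor LBP for structure features,
# disparity histograms for depth features, and SVR for regression.
# All parameter choices below are assumptions, not the paper's values.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.svm import SVR

def mscn(img, sigma=7 / 6, c=1e-3):
    """Local contrast normalization: subtract a local mean and divide by a
    local standard deviation, both estimated with a Gaussian filter."""
    img = np.asarray(img, dtype=np.float64)
    mu = gaussian_filter(img, sigma)
    var = gaussian_filter(img * img, sigma) - mu * mu
    return (img - mu) / (np.sqrt(np.clip(var, 0, None)) + c)

def lbp8(img):
    """Basic 8-neighbor local binary pattern (no rotation invariance):
    each pixel gets one bit per neighbor that is >= the center value."""
    center = img[1:-1, 1:-1]
    codes = np.zeros_like(center, dtype=np.uint8)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        nbr = img[1 + dy:img.shape[0] - 1 + dy,
                  1 + dx:img.shape[1] - 1 + dx]
        codes |= (nbr >= center).astype(np.uint8) << bit
    return codes

def hist_feat(x, bins, rng):
    """Normalized histogram used as a statistical feature vector."""
    h, _ = np.histogram(x, bins=bins, range=rng)
    return h / h.sum()

def features(left, right, disparity):
    """Concatenate monocular features (intensity histograms of the two
    normalized views) and binocular features (LBP structure histogram
    plus disparity histogram)."""
    mono = np.concatenate([hist_feat(mscn(v), 10, (-3, 3))
                           for v in (left, right)])
    structure = lbp8(mscn(left))  # assumed structure map for illustration
    binoc = np.concatenate([
        hist_feat(structure, 16, (0, 256)),
        hist_feat(disparity, 10, (disparity.min(), disparity.max() + 1e-6)),
    ])
    return np.concatenate([mono, binoc])

if __name__ == "__main__":  # toy demonstration with random images
    rng = np.random.default_rng(0)
    X = np.stack([features(rng.random((32, 32)), rng.random((32, 32)),
                           rng.random((32, 32))) for _ in range(8)])
    y = rng.random(8)  # stand-in for subjective quality scores
    score = SVR(kernel="rbf").fit(X, y).predict(X[:1])
```

In the actual method, the disparity map would come from a stereo matching step between the two views, and the regressor would be trained on subjective scores from the benchmark databases; here random arrays merely exercise the feature shapes.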
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2941112