Objective estimation of subjective image quality assessment using multi-parameter prediction

Published in: IET Image Processing, 2019-11, Vol. 13 (13), p. 2428-2435
Authors: Maksimović-Moićević, Sanja; Lukač, Željko; Temerinac, Miodrag
Format: Article
Language: English
Description
Abstract: Objective evaluation of subjective image quality assessment plays a significant role in various image processing applications, such as compression, interpolation and noise reduction. Subjective image quality assessment depends not only on objectively measurable artefacts, but also on image content and the kind of distortion. Thus, a multi-parameter prediction of the objective image quality assessment is proposed in this study. The prediction parameters are found by minimising the mean square error with respect to the known subjective image quality measure (DMOS). This approach includes the most commonly used image quality metrics (peak signal-to-noise ratio, multi-scale structural similarity image measure, feature similarity image measure, video quality measure) and two-dimensional image quality metrics (2D IQM). The proposed multi-parameter prediction has been verified on the LIVE test image database for compression, noise and blur distortions with available subjective image quality measures (DMOS). More reliable estimations are obtained using multi-parameter prediction than using any single measure. The best results are reached when an image content indicator is combined with the 2D IQM measure separately for each kind of distortion.
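The abstract describes fitting prediction parameters by minimising the mean square error against known DMOS scores. A minimal sketch of that idea, assuming (purely for illustration) that the predictor is a linear combination of the metric values; the metric matrix and DMOS values below are synthetic placeholders, not data from the LIVE database, and the weight vector is invented for the demonstration:

```python
import numpy as np

# Synthetic stand-in for per-image metric values.
# Columns: PSNR, MS-SSIM, FSIM, VQM, 2D-IQM (placeholder numbers).
rng = np.random.default_rng(0)
n_images = 50
metrics = rng.uniform(0.0, 1.0, size=(n_images, 5))

# Design matrix with an intercept column, so the model is
#   DMOS_hat = w0 + w1*PSNR + w2*MSSSIM + w3*FSIM + w4*VQM + w5*IQM2D
X = np.hstack([np.ones((n_images, 1)), metrics])

# Generate "ground truth" DMOS from known (invented) weights plus noise,
# so we can check that the fit recovers them.
true_w = np.array([10.0, -5.0, 20.0, 15.0, -8.0, 30.0])
dmos = X @ true_w + rng.normal(0.0, 0.5, n_images)

# Ordinary least squares minimises the mean square prediction error.
w, *_ = np.linalg.lstsq(X, dmos, rcond=None)
pred = X @ w
mse = np.mean((pred - dmos) ** 2)
print("fitted weights:", np.round(w, 2))
print(f"MSE: {mse:.4f}")
```

The same least-squares fit could be repeated per distortion type (compression, noise, blur), which is in the spirit of the paper's finding that distortion-specific parameters give the best results.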
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2018.6143