Image Manipulation Quality Assessment
Published in: IEEE Transactions on Circuits and Systems for Video Technology, 2024-11, pp. 1-1
Main authors:
Format: Article
Language: English
Online access: Order full text
Abstract: Image quality assessment (IQA) and its computational models play a vital role in modern computer vision applications. Research has traditionally focused on signal distortions arising during image compression and transmission and their impact on perceived image quality. However, little attention has been paid to image manipulation, in which an image is altered by various filters. Given the prevalence of image manipulation in real-life scenarios, it is critical to understand how humans perceive filter-altered images and to develop reliable IQA models capable of automatically assessing the quality of filtered images. In this paper, we build a new IQA database for filter-altered images, comprising 360 images manipulated by various filters. To ensure that the subjective IQA faithfully reflects human visual perception, we conduct a fully controlled psychovisual experiment. Building upon this ground truth, we propose a deep learning-based no-reference IQA (NR-IQA) model named IMQA that can accurately predict the perceived quality of filter-altered images. The model constructs an image filtering-aware module to learn discriminative features for filter-altered images and fuses these features with the representations generated by an image quality-aware module. Experimental results demonstrate the superior performance of the proposed IMQA model.
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2024.3504854
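The abstract above describes IMQA as a two-branch design: a filtering-aware module whose features are fused with those of a quality-aware module before a quality score is predicted. The record does not give the paper's actual backbones, feature dimensions, or fusion scheme, so the following is only a minimal PyTorch-style sketch of such a two-branch fusion NR-IQA model; the class name, ResNet-18 backbones, feature sizes, and regression head are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a two-branch no-reference IQA model in the spirit of
# IMQA: one branch learns filtering-aware features, the other quality-aware
# features, and the two are fused before regressing a quality score.
# All module names, backbones, and dimensions are assumptions for illustration.
import torch
import torch.nn as nn
import torchvision.models as models


class TwoBranchNRIQA(nn.Module):
    def __init__(self, feat_dim=512):
        super().__init__()
        # Filtering-aware branch: features that discriminate among filter types.
        self.filter_branch = models.resnet18(weights=None)
        self.filter_branch.fc = nn.Identity()
        # Quality-aware branch: generic perceptual-quality features.
        self.quality_branch = models.resnet18(weights=None)
        self.quality_branch.fc = nn.Identity()
        # Fuse the two feature vectors and regress a single quality score.
        self.regressor = nn.Sequential(
            nn.Linear(2 * feat_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, 1),
        )

    def forward(self, x):
        f_filter = self.filter_branch(x)    # (B, feat_dim)
        f_quality = self.quality_branch(x)  # (B, feat_dim)
        fused = torch.cat([f_filter, f_quality], dim=1)
        return self.regressor(fused).squeeze(1)  # one predicted score per image


if __name__ == "__main__":
    model = TwoBranchNRIQA()
    scores = model(torch.randn(2, 3, 224, 224))
    print(scores.shape)  # torch.Size([2])
```

The sketch fuses the two branches by simple concatenation; the paper may well use a different fusion mechanism (e.g., weighted or attention-based fusion), so this choice is only one common option shown for concreteness.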