Continuous and Overall Quality of Experience Evaluation for Streaming Video Based on Rich Features Exploration and Dual-Stage Attention
Saved in:
Published in: | IEEE transactions on circuits and systems for video technology 2024-11, Vol.34 (11), p.11709-11723 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | With the rapid development of streaming media technology, the Quality of Experience (QoE) of streaming videos has become crucial for optimizing video compression and transmission algorithms, such as adaptive bitrate (ABR). However, the complexity of human perceptual mechanisms, particularly in relation to temporal distortions, poses substantial challenges to effective QoE monitoring. In recent years, many efforts in video quality assessment (VQA) and video QoE evaluation have highlighted the influence of a broad spectrum of features, from Quality of Service (QoS) metrics to video content understanding, on viewer experience. On this basis, we believe that there is also a dynamic relationship among these features varying with the broadcasting content. Furthermore, research indicates a significant correlation between real-time and retrospective assessments of QoE for individual videos. In response to these insights, we introduce a novel approach leveraging a unified learnable network that incorporates dual-stage attention, the temporal and cross-feature attention, to accurately predict both continuous and overall QoE for streaming videos. The results of experiments conducted on several publicly available databases demonstrate the superiority of our proposed method over the state-of-the-art metrics. |
---|---|
ISSN: | 1051-8215 1558-2205 |
DOI: | 10.1109/TCSVT.2024.3418941 |
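The abstract's core idea, attending first across features within each time step and then across time steps, can be illustrated with a minimal NumPy sketch. This is not the authors' network: the self-attention formulation, the mean pooling used to produce the continuous and overall QoE scores, and all names below are illustrative assumptions for the dual-stage pattern only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Scaled dot-product self-attention over rows of x: (n, d) -> (n, d).
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores, axis=-1) @ x

def dual_stage_attention(feats):
    """feats: (T, F) array of F QoE-related features over T time steps.
    Stage 1 (cross-feature): each feature's time series is a token, so
    features are mixed according to their pairwise similarity.
    Stage 2 (temporal): each time step's feature vector is a token, so
    information flows across time. Pooling choices are assumptions."""
    xf = self_attention(feats.T).T      # cross-feature stage, still (T, F)
    xt = self_attention(xf)             # temporal stage, (T, F)
    continuous_qoe = xt.mean(axis=1)    # one score per time step, (T,)
    overall_qoe = continuous_qoe.mean() # retrospective (overall) score
    return continuous_qoe, overall_qoe

rng = np.random.default_rng(0)
feats = rng.standard_normal((30, 6))    # e.g. 30 time steps, 6 features
cont, overall = dual_stage_attention(feats)
```

In the paper the two stages are learned (with trainable projections) and the pooling is replaced by prediction heads for the continuous and overall scores; the sketch only shows the order in which the two attention axes are applied.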