Assessing Individual VR Sickness Through Deep Feature Fusion of VR Video and Physiological Response
Published in: IEEE Transactions on Circuits and Systems for Video Technology, 2022-05, Vol. 32 (5), pp. 2895-2907
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: VR sickness assessment for VR videos is in high demand in industry and research to address VR viewing safety issues. In particular, evaluating the VR sickness of individual viewers is difficult because of individual differences. To achieve this challenging goal, we focus on deep feature fusion of sickness-related information. In this paper, we propose a novel deep learning-based assessment framework that estimates the VR sickness of individual viewers from VR videos and the corresponding physiological responses. We design the content stimulus guider to imitate the process by which humans come to feel VR sickness: it extracts a deep stimulus feature from a VR video to reflect the sickness the video induces. In addition, we devise the physiological response guider to encode physiological responses acquired while humans experience VR videos. Each physiology sickness feature extractor (for EEG, ECG, and GSR) in the physiological response guider is designed to suit the characteristics of its signal. The extracted physiology sickness features are then fused into a deep physiology feature that comprehensively reflects individual deviations in VR sickness. Finally, the VR sickness predictor assesses individual VR sickness from the fusion of the deep stimulus feature and the deep physiology feature (a minimal sketch of this two-branch fusion follows the record below). To validate the proposed method extensively, we built two benchmark datasets containing 360-degree VR videos with physiological responses (EEG, ECG, and GSR) and SSQ scores. Experimental results show that the proposed method achieves meaningful correlations with human SSQ scores. Further, we validate the effectiveness of the proposed network designs through analyses of feature fusion and visualization.
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2021.3103544
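The abstract describes a two-branch pipeline: a content stimulus guider encodes the VR video, per-modality extractors encode EEG, ECG, and GSR, the physiology features are fused into a deep physiology feature, and a predictor regresses an individual SSQ score from both branches. The following is a minimal, hypothetical PyTorch sketch of that structure; every module name, layer shape, channel count, and the regression head are illustrative assumptions and do not reproduce the paper's actual network designs.

```python
# Hypothetical sketch of the two-branch fusion described in the abstract.
# Layer sizes, channel counts, and signal lengths are illustrative assumptions.
import torch
import torch.nn as nn


class ContentStimulusGuider(nn.Module):
    """Maps a VR video clip to a deep stimulus feature (assumed 3D-CNN backbone)."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, video):                # video: (B, 3, T, H, W)
        x = self.backbone(video).flatten(1)
        return self.proj(x)                  # deep stimulus feature: (B, feat_dim)


class PhysiologySicknessFeatureExtractor(nn.Module):
    """1D-CNN encoder for one physiological modality (EEG, ECG, or GSR)."""
    def __init__(self, in_ch, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, sig):                  # sig: (B, in_ch, L)
        return self.proj(self.net(sig).flatten(1))


class VRSicknessPredictor(nn.Module):
    """Fuses the stimulus feature with the fused physiology feature; regresses SSQ."""
    def __init__(self, stim_dim=256, phys_dim=64):
        super().__init__()
        self.stimulus = ContentStimulusGuider(stim_dim)
        # EEG assumed multi-channel; ECG and GSR assumed single-channel here.
        self.eeg = PhysiologySicknessFeatureExtractor(in_ch=8, feat_dim=phys_dim)
        self.ecg = PhysiologySicknessFeatureExtractor(in_ch=1, feat_dim=phys_dim)
        self.gsr = PhysiologySicknessFeatureExtractor(in_ch=1, feat_dim=phys_dim)
        # Late fusion of the three modalities into the deep physiology feature.
        self.phys_fusion = nn.Linear(3 * phys_dim, phys_dim)
        self.regressor = nn.Sequential(
            nn.Linear(stim_dim + phys_dim, 128), nn.ReLU(),
            nn.Linear(128, 1),               # predicted individual SSQ score
        )

    def forward(self, video, eeg, ecg, gsr):
        f_stim = self.stimulus(video)
        f_phys = self.phys_fusion(
            torch.cat([self.eeg(eeg), self.ecg(ecg), self.gsr(gsr)], dim=1))
        return self.regressor(torch.cat([f_stim, f_phys], dim=1))


# Smoke test with toy shapes: batch of 2, 16 frames of 64x64 video,
# and 512-sample physiological windows.
model = VRSicknessPredictor()
ssq = model(torch.randn(2, 3, 16, 64, 64),
            torch.randn(2, 8, 512),          # EEG
            torch.randn(2, 1, 512),          # ECG
            torch.randn(2, 1, 512))          # GSR
print(ssq.shape)                             # torch.Size([2, 1])
```

The sketch keeps the abstract's key design choice: each modality gets its own extractor sized to its signal (multi-channel EEG versus single-channel ECG/GSR), the physiology features are fused late into one deep physiology feature that can carry individual deviations, and only then is that feature combined with the content-driven stimulus feature for the final SSQ regression.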