Decoding dynamic affective responses to naturalistic videos with shared neural patterns

Bibliographic Details
Published in: NeuroImage (Orlando, Fla.), 2020-08, Vol. 216, Article 116618
Authors: Chan, Hang-Yee; Smidts, Ale; Schoots, Vincent C.; Sanfey, Alan G.; Boksem, Maarten A.S.
Format: Article
Language: English
Online access: Full text
Description
Summary: This study explored the feasibility of using shared neural patterns from brief affective episodes (viewing affective pictures) to decode extended, dynamic affective sequences in a naturalistic experience (watching movie trailers). Twenty-eight participants viewed pictures from the International Affective Picture System (IAPS) and, in a separate session, watched various movie trailers. We first located voxels at bilateral occipital cortex (LOC) responsive to affective picture categories by GLM analysis, then performed between-subject hyperalignment on the LOC voxels based on their responses during movie-trailer watching. After hyperalignment, we trained between-subject machine learning classifiers on the affective pictures and used these classifiers to decode the affective states of an out-of-sample participant both during picture viewing and during movie-trailer watching. Within participants, neural classifiers identified the valence and arousal categories of pictures and tracked self-reported valence and arousal during video watching. In aggregate, neural classifiers produced valence and arousal time series that tracked the dynamic ratings of the movie trailers obtained from a separate sample. Our findings provide further support for the possibility of using pre-trained neural representations to decode dynamic affective responses during a naturalistic experience.
Highlights:
• Previous studies extracted affective neural representations by inducing brief, isolated episodes of emotion.
• It is unclear whether these neural representations can capture dynamic affective changes under naturalistic conditions.
• We created neural classifiers of affect from picture viewing to decode dynamic responses during movie-trailer watching.
• Decoded time series of the videos correlated with in-sample summary ratings and out-of-sample continuous ratings.
• Findings show the possibility of decoding dynamic responses to a naturalistic experience with pre-trained neural classifiers.
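To make the analysis pipeline in the summary concrete (align LOC responses across participants, train affect classifiers on picture trials, then decode a left-out participant's movie-trailer run), here is a minimal sketch in Python. It is not the authors' code: the array names (movie_data, picture_data, picture_labels) are hypothetical, a single orthogonal-Procrustes alignment to one reference subject stands in for the full iterative hyperalignment procedure, and a linear SVM is used only as an illustrative classifier, since the exact alignment and classifier settings are not given in the abstract.

# Minimal sketch, not the published analysis. Assumes every subject's
# LOC data has the same number of voxels and is already preprocessed.
import numpy as np
from scipy.linalg import orthogonal_procrustes
from sklearn.svm import LinearSVC


def hyperalign_to_reference(movie_data, ref_idx=0):
    """Estimate one orthogonal transform per subject mapping that subject's
    movie-trailer responses (time points x voxels) onto a reference subject;
    a simplified, single-pass stand-in for iterative hyperalignment."""
    reference = movie_data[ref_idx]
    return [orthogonal_procrustes(subj, reference)[0] for subj in movie_data]


def decode_left_out_subject(picture_data, picture_labels, movie_data,
                            transforms, test_idx):
    """Train a between-subject affect classifier on the hyperaligned picture
    responses of all other subjects, then decode the left-out subject's
    picture trials and movie-trailer time series."""
    train_X = np.vstack([picture_data[i] @ transforms[i]
                         for i in range(len(picture_data)) if i != test_idx])
    train_y = np.concatenate([picture_labels[i]
                              for i in range(len(picture_labels)) if i != test_idx])
    clf = LinearSVC().fit(train_X, train_y)

    # Decode held-out picture trials (predicted valence/arousal category) ...
    picture_preds = clf.predict(picture_data[test_idx] @ transforms[test_idx])
    # ... and the movie-trailer run, yielding classifier evidence per time point.
    movie_evidence = clf.decision_function(movie_data[test_idx] @ transforms[test_idx])
    return picture_preds, movie_evidence

The decoded movie-trailer time series from such a sketch could then be correlated with continuous valence and arousal ratings collected from a separate sample, which is the aggregate-level comparison the summary reports; the lag or smoothing applied before that correlation is an analysis choice not specified here.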
ISSN: 1053-8119, 1095-9572
DOI: 10.1016/j.neuroimage.2020.116618