A Systematic Review on Multimodal Emotion Recognition: Building Blocks, Current State, Applications, and Challenges
Published in: IEEE Access, 2024, Vol. 12, p. 103976-104019
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Emotion recognition involves accurately interpreting human emotions from various sources and modalities, including questionnaires, verbal expressions, and physiological signals. With broad applications in affective computing, computational creativity, human-robot interaction, and market research, the field has seen a surge of interest in recent years. This paper presents a systematic review of multimodal emotion recognition (MER) techniques developed from 2014 to 2024, encompassing verbal and physiological signals, facial expressions, body gestures, and speech, as well as emerging methods such as sketch-based emotion recognition. The review explores various emotion models, distinguishing between emotions, feelings, sentiments, and moods, along with human emotional expression, categorized in both artistic and non-verbal ways. It also discusses the background of automated emotion recognition systems and introduces seven criteria for evaluating modalities, alongside a current-state analysis of MER drawn from the field's human-centric perspective. Following the PRISMA guidelines and carefully analyzing 45 selected articles, this review provides a comprehensive perspective on existing studies, datasets, technical approaches, identified gaps, and future directions in MER. It also highlights existing challenges and current applications of MER.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3430850