Towards emotion recognition from contextual information using machine learning
Published in: Journal of Ambient Intelligence and Humanized Computing, 2020-08, Vol. 11 (8), pp. 3187-3207
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Emotions influence cognitive processes that underlie human behavior. Whereas experiencing negative emotions may lead to the development of psychological disorders, experiencing positive emotions may improve creative thinking and promote cooperative behavior. The importance of human emotions has led to the development of automatic emotion recognition systems based on the analysis of speech waveforms, facial expressions, and physiological signals, as well as text data mining. However, emotions are associated with the context in which they are actually experienced; hence, this work focuses on emotion recognition from contextual information. In this paper, we present a study aimed at assessing the feasibility of automatically recognizing emotions from individuals’ contexts. In this study, 32 participants provided information through a mobile application about their emotions and the context (e.g., companions, activities, and locations) in which these emotions were experienced. We used machine learning techniques to build individual models, general models, and gender-specific models to automatically recognize participants’ emotions. The empirical results show that individuals’ emotions are highly related to their context and that automatic recognition of emotions in real-world situations is feasible using contextual data.
ISSN: 1868-5137, 1868-5145
DOI: 10.1007/s12652-019-01485-x
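The abstract describes three modeling configurations trained on self-reported contextual features: individual models, a general model pooled over all participants, and gender-specific models. It does not state which algorithms or feature encodings the authors used, so the sketch below is only a minimal illustration of that setup, assuming one-hot encoded categorical context columns (companion, activity, location), additional participant_id and gender columns, and a random forest classifier; the file name and all column names are hypothetical.

```python
# Minimal sketch of per-individual, general, and gender-specific emotion models
# trained on contextual features. Column names, the CSV file, and the choice of
# a random forest are illustrative assumptions, not the study's actual setup.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def train_emotion_model(df: pd.DataFrame) -> tuple[RandomForestClassifier, float]:
    """Train one contextual emotion classifier and return it with its test accuracy."""
    # One-hot encode the categorical context features.
    X = pd.get_dummies(df[["companion", "activity", "location"]])
    y = df["emotion"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    return model, accuracy_score(y_test, model.predict(X_test))


# Self-reported context/emotion records collected via the mobile application
# (hypothetical file name and schema).
reports = pd.read_csv("context_emotion_reports.csv")

# General model: pooled data from all participants.
general_model, general_acc = train_emotion_model(reports)

# Gender-specific models: one classifier per reported gender group.
gender_models = {
    gender: train_emotion_model(group)
    for gender, group in reports.groupby("gender")
}

# Individual models: one classifier per participant.
individual_models = {
    pid: train_emotion_model(group)
    for pid, group in reports.groupby("participant_id")
}
```

Splitting the data by participant or by gender before training is what distinguishes the three configurations; comparing their held-out accuracies is one way to gauge how person-specific the context-emotion relationship is.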