Convolutional neural network based emotion classification using electrodermal activity signals and time-frequency features

Bibliographic Details
Published in: Expert Systems with Applications, 2020-11, Vol. 159, p. 113571, Article 113571
Authors: Ganapathy, Nagarajan; Veeranki, Yedukondala Rao; Swaminathan, Ramakrishnan
Format: Article
Language: English
Online access: Full text
Description
Abstract:
•EDA-based emotion recognition is widely preferred, and these signals are highly complex.
•A convolutional neural network and the short-time Fourier transform are proposed to address this property.
•Thirty-eight features, a CNN model, and five classifiers are employed along with three learning algorithms.
•Representative and key features are learned using the CNN to distinguish various emotions.
•Experiments with a public-domain database show their usefulness.
In this work, an attempt has been made to classify emotional states using Electrodermal Activity (EDA) signals and Convolutional Neural Network (CNN) learned features. The EDA signals are obtained from the publicly available DEAP database and are decomposed into tonic and phasic components. The phasic component is subjected to the short-time Fourier transform. Thirty-eight features in the time, frequency, and time–frequency domains are extracted from the phasic signal. These extracted features are applied to a CNN to learn robust and prominent features. Five machine learning algorithms, namely linear discriminant analysis, multilayer perceptron, support vector machine, decision tree, and extreme learning machine, are used for classification. The results show that the proposed approach is able to classify emotional states along the arousal–valence dimensions. Classification using CNN-learned features is found to be better than classification using conventional features. The trained end-to-end CNN model is found to be accurate (F-measure = 79.30% and 71.41% for the arousal and valence dimensions, respectively) in classifying various emotional states. The proposed method is found to be robust in handling the dynamic variation of EDA signals across different emotional states. The results show that the proposed approach outperforms most state-of-the-art methods. Thus, the proposed method could be beneficial for analyzing emotional states in both normal and clinical conditions.
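A minimal Python sketch (not the authors' code) of the kind of pipeline the abstract describes: split the EDA signal into tonic and phasic parts, apply a short-time Fourier transform to the phasic component, and derive a few time-frequency features. The median-filter tonic estimate, the 128 Hz sampling rate, the STFT window length, and the three example features are assumptions for illustration; the paper uses thirty-eight features and a CNN, which are not reproduced here.

# Illustrative sketch only; parameters and feature choices are assumptions.
import numpy as np
from scipy.signal import stft, medfilt

def phasic_component(eda, fs=128):
    """Crude tonic/phasic split: a long median filter approximates the
    slowly varying tonic level; the residual is taken as the phasic part."""
    win = int(4 * fs) + 1                 # ~4 s smoothing window (odd length)
    tonic = medfilt(eda, kernel_size=win)
    return eda - tonic

def tf_features(phasic, fs=128, nperseg=256):
    """Example time-frequency features computed from the STFT magnitude."""
    f, t, Z = stft(phasic, fs=fs, nperseg=nperseg)
    power = np.abs(Z) ** 2
    total = power.sum() + 1e-12
    low_band_ratio = power[f <= 0.5].sum() / total   # most EDA energy is below 0.5 Hz
    p = power.mean(axis=1)
    p = p / (p.sum() + 1e-12)
    spectral_entropy = -(p * np.log2(p + 1e-12)).sum()
    spectral_centroid = (f * p).sum()
    return np.array([low_band_ratio, spectral_entropy, spectral_centroid])

if __name__ == "__main__":
    fs = 128                                            # assumed EDA sampling rate
    eda = np.cumsum(np.random.randn(60 * fs)) * 0.01 + 5.0   # surrogate 60 s signal
    feats = tf_features(phasic_component(eda, fs), fs)
    print("example time-frequency features:", feats)

In the paper, features of this kind (and the raw time-frequency representation) are fed to a CNN that learns discriminative features before classification with the five listed algorithms.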
ISSN:0957-4174
1873-6793
DOI:10.1016/j.eswa.2020.113571