TorchEEGEMO: A deep learning toolbox towards EEG-based emotion recognition
Saved in:
Published in: Expert systems with applications, 2024-09, Vol. 249, Article 123550
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: With the development of deep learning (DL), EEG-based emotion recognition has attracted increasing attention. Diverse DL algorithms have emerged that intelligently decode human emotion from EEG signals. However, the lack of a toolbox encapsulating these techniques hampers the further design, development, testing, implementation, and management of intelligent systems. To tackle this bottleneck, we propose a Python toolbox, TorchEEGEMO, which divides the workflow into five modules: datasets, transforms, model_selection, models, and trainers. Each module includes plug-and-play functions to construct and manage a stage in the workflow. Recognizing the frequent access to time windows of interest, we introduce a window-centric parallel input/output system that bolsters the efficiency of DL systems. Finally, we conduct extensive experiments to provide benchmark results for the supported modules. The experimental results demonstrate the versatility and applicability of TorchEEGEMO across various scenarios.
Highlights:
• The first deep learning toolbox towards EEG-based emotion recognition.
• A workflow that divides the recognition system into five plug-and-play modules.
• Built-in functions cover datasets, transformations, models, algorithms, and more.
• A novel window-centric EEG I/O to enhance system efficiency.
• Experiments demonstrate benchmark performance across various scenarios.
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2024.123550
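
The abstract describes a five-module workflow (datasets, transforms, model_selection, models, and trainers). The sketch below illustrates how such a pipeline might be assembled. It is a minimal example assuming the publicly documented API of the torcheeg package that this paper presents; the specific classes shown (DEAPDataset, KFoldGroupbyTrial, CCNN, ClassifierTrainer) and their arguments are assumptions that may differ across package versions, not code taken from the paper itself.

```python
# A minimal sketch of the five-module workflow described in the abstract,
# assuming the torcheeg API (class/argument names may vary by version).
from torch.utils.data import DataLoader

from torcheeg import transforms
from torcheeg.datasets import DEAPDataset
# In older torcheeg versions this constant lives under
# torcheeg.datasets.constants.emotion_recognition.deap instead.
from torcheeg.datasets.constants import DEAP_CHANNEL_LOCATION_DICT
from torcheeg.model_selection import KFoldGroupbyTrial
from torcheeg.models import CCNN
from torcheeg.trainers import ClassifierTrainer

# datasets + transforms: read DEAP recordings, convert each EEG window
# offline into band differential-entropy features arranged on a 9x9
# electrode grid, and binarize the valence rating into two classes.
dataset = DEAPDataset(
    root_path='./data_preprocessed_python',
    offline_transform=transforms.Compose([
        transforms.BandDifferentialEntropy(),
        transforms.ToGrid(DEAP_CHANNEL_LOCATION_DICT),
    ]),
    online_transform=transforms.ToTensor(),
    label_transform=transforms.Compose([
        transforms.Select('valence'),
        transforms.Binary(5.0),
    ]),
)

# model_selection: k-fold cross-validation grouped by trial, so that
# windows from one trial never appear in both train and validation sets.
k_fold = KFoldGroupbyTrial(n_splits=5, split_path='./split')

for i, (train_dataset, val_dataset) in enumerate(k_fold.split(dataset)):
    train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
    val_loader = DataLoader(val_dataset, batch_size=64, shuffle=False)

    # models + trainers: a CNN classifier trained for binary valence.
    model = CCNN(num_classes=2, in_channels=4, grid_size=(9, 9))
    trainer = ClassifierTrainer(model=model, num_classes=2)
    trainer.fit(train_loader, val_loader, max_epochs=50)
    trainer.test(val_loader)
```

Each stage maps onto one of the five modules named in the abstract, so any single stage (for example, swapping CCNN for another supported model) can be replaced without touching the rest of the pipeline, which is the plug-and-play property the highlights emphasize.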