Population-based evolutionary search for joint hyperparameter and architecture optimization in brain-computer interface


Bibliographic Details
Published in: Expert Systems with Applications, 2025-03, Vol. 264, p. 125832, Article 125832
Authors: Shin, Dong-Hee; Lee, Deok-Joong; Han, Ji-Wung; Son, Young-Han; Kam, Tae-Eui
Format: Article
Language: English
Online access: Full text
Description
Abstract: In recent years, deep learning (DL)-based models have become the de facto standard for motor imagery brain-computer interface (MI-BCI) systems due to their notable performance. However, these models often require an extensive hyperparameter optimization process to achieve optimal results. To tackle this challenge, recent studies have proposed various methods to automate this process. Despite promising results, these methods overlook architectural elements, which are crucial factors for MI-BCI system performance and are highly intertwined with hyperparameter settings. To overcome this limitation, we propose a joint optimization framework that uses a population-based evolutionary search to optimize both hyperparameters and architectures. Our framework adopts a two-stage optimization approach that alternates between hyperparameter and architecture optimization to effectively manage the complexity of the joint search process. Furthermore, we introduce a novel ensemble method that leverages diverse promising configurations to enhance generalization and robustness. Evaluations on two public MI-BCI datasets show that our framework consistently outperforms competing methods across a range of backbone models, demonstrating its effectiveness and versatility.

Highlights:
• We propose to optimize both hyperparameter values and neural network architectures.
• Population-based evolutionary search enables adaptive optimization in parallel.
• A two-stage optimization process effectively handles the complex joint search space.
• An ensemble method further improves robustness and generalization capabilities.
• Our method outperforms all competing methods across a range of backbone models.
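To make the high-level idea concrete, below is a minimal sketch of a two-stage, population-based evolutionary search that alternates between hyperparameter and architecture optimization and keeps an ensemble of top configurations. This is not the authors' implementation: the search spaces, the `evaluate` stand-in, the mutation scheme, and all function names are illustrative assumptions; in practice `evaluate` would train and validate a backbone MI-BCI model, and the ensemble selection in the paper is diversity-aware in ways the abstract does not detail.

```python
import random

# Hypothetical search spaces; the paper's actual per-backbone spaces are not given in the abstract.
HPARAM_SPACE = {"lr": [1e-4, 3e-4, 1e-3, 3e-3], "batch_size": [16, 32, 64], "dropout": [0.1, 0.25, 0.5]}
ARCH_SPACE = {"n_blocks": [2, 3, 4], "n_filters": [16, 32, 64], "kernel_size": [16, 32, 64]}

def evaluate(config):
    """Stand-in for training a backbone on MI-BCI data and returning validation accuracy."""
    rng = random.Random(str(sorted(config.items())))  # deterministic toy score per configuration
    return rng.random()

def sample(space):
    """Draw one random configuration from a discrete search space."""
    return {k: random.choice(v) for k, v in space.items()}

def mutate(config, space, rate=0.3):
    """Resample each gene with probability `rate` (simple evolutionary perturbation)."""
    return {k: (random.choice(space[k]) if random.random() < rate else v) for k, v in config.items()}

def evolve_stage(population, space, fixed_part, generations=3, keep=4):
    """Evolve one factor group (hyperparameters or architecture) while the other stays fixed."""
    for _ in range(generations):
        scored = sorted(population, key=lambda c: evaluate({**fixed_part, **c}), reverse=True)
        parents = scored[:keep]
        children = [mutate(random.choice(parents), space) for _ in range(len(population) - keep)]
        population = parents + children
    return population

def joint_search(pop_size=8, rounds=2, ensemble_size=3):
    hparams_pop = [sample(HPARAM_SPACE) for _ in range(pop_size)]
    arch_pop = [sample(ARCH_SPACE) for _ in range(pop_size)]
    best_h, best_a = hparams_pop[0], arch_pop[0]
    for _ in range(rounds):
        # Stage 1: optimize hyperparameters with the current best architecture fixed.
        hparams_pop = evolve_stage(hparams_pop, HPARAM_SPACE, best_a)
        best_h = max(hparams_pop, key=lambda h: evaluate({**best_a, **h}))
        # Stage 2: optimize the architecture with the current best hyperparameters fixed.
        arch_pop = evolve_stage(arch_pop, ARCH_SPACE, best_h)
        best_a = max(arch_pop, key=lambda a: evaluate({**best_h, **a}))
    # Ensemble: keep several promising configurations rather than a single winner.
    candidates = [{**a, **h} for a in arch_pop for h in hparams_pop]
    candidates.sort(key=evaluate, reverse=True)
    return candidates[:ensemble_size]

if __name__ == "__main__":
    for cfg in joint_search():
        print(cfg, round(evaluate(cfg), 3))
```

The alternation keeps each stage searching a smaller space while the two factor groups still influence each other across rounds, which is the rationale the abstract gives for the two-stage design over a single joint search.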
ISSN: 0957-4174
DOI: 10.1016/j.eswa.2024.125832