MI-BMInet: An Efficient Convolutional Neural Network for Motor Imagery Brain-Machine Interfaces With EEG Channel Selection
A brain-machine interface (BMI) based on motor imagery (MI) enables the control of devices using brain signals while the subject imagines performing a movement. It plays a key role in prosthesis control and motor rehabilitation. To improve user comfort, preserve data privacy, and reduce the system...
Published in: | IEEE sensors journal, 2024-03, Vol. 24 (6), p. 8835-8847 |
Main authors: | Wang, Xiaying; Hersche, Michael; Magno, Michele; Benini, Luca |
Format: | Article |
Language: | English (eng) |
Online access: | Order full text |
container_end_page | 8847 |
container_issue | 6 |
container_start_page | 8835 |
container_title | IEEE sensors journal |
container_volume | 24 |
creator | Wang, Xiaying; Hersche, Michael; Magno, Michele; Benini, Luca |
description | A brain-machine interface (BMI) based on motor imagery (MI) enables the control of devices using brain signals while the subject imagines performing a movement. It plays a key role in prosthesis control and motor rehabilitation. To improve user comfort, preserve data privacy, and reduce the system's latency, a new trend in wearable BMIs is to execute algorithms on low-power microcontroller units (MCUs) embedded on edge devices to process the electroencephalographic (EEG) data in real-time close to the sensors. However, most of the classification models presented in the literature are too resource-demanding for low-power MCUs. This article proposes an efficient convolutional neural network (CNN) for EEG-based MI classification that achieves comparable accuracy while being orders of magnitude less resource-demanding and significantly more energy-efficient than state-of-the-art (SoA) models. To further reduce the model complexity, we propose an automatic channel selection method based on spatial filters and quantize both weights and activations to 8-bit precision with negligible accuracy loss. Finally, we implement and evaluate the proposed models on leading-edge parallel ultralow-power (PULP) MCUs. The final two-class solution consumes as little as 30 μJ/inference with a runtime of 2.95 ms/inference and an accuracy of 82.51% while using 6.4× fewer EEG channels, becoming the new SoA for embedded MI-BMI and defining a new Pareto frontier in the three-way trade-off among accuracy, resource cost, and power usage. |
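The description above names two model-shrinking steps: channel selection driven by spatial filters, and 8-bit quantization of weights and activations. The sketch below is a minimal, hypothetical illustration of both ideas, not the authors' implementation: it assumes a generic CSP-style spatial filter and symmetric per-tensor weight quantization, and the function names (`csp_spatial_filters`, `select_channels`, `quantize_int8`) and parameters (`n_filters`, `n_keep`) are illustrative only.

```python
# Hedged sketch: ranks EEG channels by the energy they receive in CSP-style
# spatial filters, then quantizes a weight array to 8 bits. Assumed details
# (CSP filters, per-tensor symmetric quantization) may differ from the paper.
import numpy as np
from scipy.linalg import eigh


def csp_spatial_filters(X_a, X_b, n_filters=4):
    """X_a, X_b: trials of shape (n_trials, n_channels, n_samples) for two MI classes."""
    cov = lambda X: np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)
    Ca, Cb = cov(X_a), cov(X_b)
    # Generalized symmetric eigenproblem Ca w = lambda (Ca + Cb) w; eigenvalues ascending.
    _, W = eigh(Ca, Ca + Cb)
    half = n_filters // 2
    # Filters at both ends of the spectrum are the most class-discriminative.
    return np.concatenate([W[:, :half], W[:, -half:]], axis=1)  # (n_channels, n_filters)


def select_channels(W, n_keep=10):
    """Rank channels by the total absolute weight they receive across all filters."""
    importance = np.abs(W).sum(axis=1)          # one score per channel
    return np.argsort(importance)[::-1][:n_keep]


def quantize_int8(w):
    """Symmetric per-tensor 8-bit quantization of a weight array."""
    scale = max(np.max(np.abs(w)), 1e-12) / 127.0
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale                              # dequantize as q * scale
```

In the paper, quantization covers activations as well as weights and the resulting network runs on PULP MCUs; this sketch only shows the channel-ranking and weight-quantization ideas at a high level.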
doi_str_mv | 10.1109/JSEN.2024.3353146 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1530-437X |
ispartof | IEEE sensors journal, 2024-03, Vol.24 (6), p.8835-8847 |
issn | 1530-437X; 1558-1748 |
language | eng |
recordid | cdi_ieee_primary_10409134 |
source | IEEE Electronic Library (IEL) |
subjects | Accuracy; Algorithms; Artificial neural networks; Batteries; Brain; Brain modeling; Brain–computer interfaces; Classification; Computational modeling; Convolutional neural networks; convolutional neural networks (CNNs); edge computing; Electroencephalography; embedded systems; feature reduction; Hardware; Imagery; Inference; machine learning; Man-machine interfaces; motor imagery (MI); Neural networks; Power management; Prostheses; Spatial filtering; Task analysis; TinyML |
title | MI-BMInet: An Efficient Convolutional Neural Network for Motor Imagery Brain-Machine Interfaces With EEG Channel Selection |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-09T17%3A39%3A51IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=MI-BMInet:%20An%20Efficient%20Convolutional%20Neural%20Network%20for%20Motor%20Imagery%20Brain-Machine%20Interfaces%20With%20EEG%20Channel%20Selection&rft.jtitle=IEEE%20sensors%20journal&rft.au=Wang,%20Xiaying&rft.date=2024-03-15&rft.volume=24&rft.issue=6&rft.spage=8835&rft.epage=8847&rft.pages=8835-8847&rft.issn=1530-437X&rft.eissn=1558-1748&rft.coden=ISJEAZ&rft_id=info:doi/10.1109/JSEN.2024.3353146&rft_dat=%3Cproquest_RIE%3E2956384269%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2956384269&rft_id=info:pmid/&rft_ieee_id=10409134&rfr_iscdi=true |