Linking Attention-Based Multiscale CNN With Dynamical GCN for Driving Fatigue Detection

Electroencephalography (EEG) signals have been proven to be one of the most predictive and reliable indicators for estimating driving fatigue state. However, how to make full use of EEG data for driving fatigue detection remains a challenge. Many existing methods include a time-consuming manual process or tedious parameter tuning for feature extraction, which is inconvenient to train and implement. On the other hand, most models ignore or manually determine EEG connectivity features between different channels, thus failing to thoroughly exploit the intrinsic interchannel relations for classification. In this article, we introduce a new attention-based multiscale convolutional neural network-dynamical graph convolutional network (AMCNN-DGCN) model, aiming to overcome these two issues in a unified end-to-end model. AMCNN-DGCN starts with attention-based multiscale temporal convolutions that automatically learn frequency filters to extract salient patterns from raw EEG data. Subsequently, AMCNN-DGCN uses dynamical graph convolutional networks (DGCNs) to learn spatial filters, in which the adjacency matrix is adaptively determined in a data-driven way to exploit the intrinsic relationships between channels effectively. With this temporal-spatial structure, AMCNN-DGCN can capture highly discriminative features. To verify the effectiveness of AMCNN-DGCN, we set up a simulated fatigue driving environment to collect EEG signals from 29 healthy subjects (male/female = 17/12, age = 23.28±2.70 years) through a remote wireless cap with 24 channels. The results demonstrate that the proposed model outperforms six widely used competitive EEG models with a high accuracy of 95.65%. Finally, the critical brain regions and connections for driving fatigue detection were investigated through the dynamically learned adjacency matrix.

Detailed description

Saved in:
Bibliographic details
Published in: IEEE transactions on instrumentation and measurement 2021, Vol.70, p.1-11
Main authors: Wang, Hongtao, Xu, Linfeng, Bezerianos, Anastasios, Chen, Chuangquan, Zhang, Zhiguo
Format: Article
Language: eng
Subjects:
Online access: Order full text
container_end_page 11
container_issue
container_start_page 1
container_title IEEE transactions on instrumentation and measurement
container_volume 70
creator Wang, Hongtao
Xu, Linfeng
Bezerianos, Anastasios
Chen, Chuangquan
Zhang, Zhiguo
description Electroencephalography (EEG) signals have been proven to be one of the most predictive and reliable indicators for estimating driving fatigue state. However, how to make full use of EEG data for driving fatigue detection remains a challenge. Many existing methods include a time-consuming manual process or tedious parameter tuning for feature extraction, which is inconvenient to train and implement. On the other hand, most models ignore or manually determine EEG connectivity features between different channels, thus failing to thoroughly exploit the intrinsic interchannel relations for classification. In this article, we introduce a new attention-based multiscale convolutional neural network-dynamical graph convolutional network (AMCNN-DGCN) model, aiming to overcome these two issues in a unified end-to-end model. AMCNN-DGCN starts with attention-based multiscale temporal convolutions that automatically learn frequency filters to extract salient patterns from raw EEG data. Subsequently, AMCNN-DGCN uses dynamical graph convolutional networks (DGCNs) to learn spatial filters, in which the adjacency matrix is adaptively determined in a data-driven way to exploit the intrinsic relationships between channels effectively. With this temporal-spatial structure, AMCNN-DGCN can capture highly discriminative features. To verify the effectiveness of AMCNN-DGCN, we set up a simulated fatigue driving environment to collect EEG signals from 29 healthy subjects (male/female = 17/12, age = 23.28±2.70 years) through a remote wireless cap with 24 channels. The results demonstrate that the proposed model outperforms six widely used competitive EEG models with a high accuracy of 95.65%. Finally, the critical brain regions and connections for driving fatigue detection were investigated through the dynamically learned adjacency matrix.
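The spatial half of the pipeline described above, a graph convolution over an adjacency matrix that is derived from the data rather than fixed by hand, can be illustrated with a minimal numpy sketch. This is a generic GCN step with a similarity-based, row-normalized adjacency, offered only as an assumption-laden stand-in; the actual AMCNN-DGCN adjacency is learned end-to-end during training, and the function and variable names below are hypothetical.

```python
import numpy as np

def adaptive_adjacency(feats):
    """Data-driven adjacency: softmax over pairwise channel similarity.

    A stand-in for the learned adjacency in the paper; here it is simply
    computed from the current channel features rather than trained.
    """
    sim = feats @ feats.T                        # (C, C) channel-to-channel similarity
    sim = sim / np.sqrt(feats.shape[1])          # scale for numerical stability
    e = np.exp(sim - sim.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)      # each row sums to 1

def gcn_layer(x, adj, w):
    """One graph-convolution step: aggregate over channels via adj,
    then apply a shared linear map followed by ReLU."""
    return np.maximum(adj @ x @ w, 0.0)

rng = np.random.default_rng(0)
channels, feat_dim, out_dim = 24, 8, 4           # 24 EEG channels, as in the study
x = rng.standard_normal((channels, feat_dim))    # per-channel temporal features
w = rng.standard_normal((feat_dim, out_dim))     # shared projection weights

adj = adaptive_adjacency(x)                      # (24, 24), data-driven
y = gcn_layer(x, adj, w)                         # (24, 4) spatially filtered features
```

In the full model, `x` would come from the attention-based multiscale temporal convolutions, and the learned `adj` is what the authors inspect to identify critical brain regions and connections.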
doi_str_mv 10.1109/TIM.2020.3047502
format Article
identifier ISSN: 0018-9456
ispartof IEEE transactions on instrumentation and measurement, 2021, Vol.70, p.1-11
issn 0018-9456
1557-9662
language eng
source IEEE Electronic Library (IEL)
subjects Adaptation models
Artificial neural networks
Attention-based multiscale convolutional neural network (CNN)
Brain modeling
Channels
Convolution
Driver fatigue
driving fatigue
dynamical graph convolution network (GCN)
Electroencephalography
electroencephalography (EEG)
Fatigue
Feature extraction
Frequency filters
Model accuracy
Process parameters
Spatial filtering
spatiotemporal structure
Tuning
title Linking Attention-Based Multiscale CNN With Dynamical GCN for Driving Fatigue Detection