Decoding attention control and selection in visual spatial attention


Bibliographic details

Published in: Human brain mapping, 2020-10, Vol. 41 (14), p. 3900-3921
Authors: Hong, Xiangfei; Bo, Ke; Meyyappan, Sreenivasan; Tong, Shanbao; Ding, Mingzhou
Format: Article
Language: English
Online access: full text
Abstract

Event‐related potentials (ERPs) are used extensively to investigate the neural mechanisms of attention control and selection. The univariate ERP approach, however, has left important questions inadequately answered. We addressed two questions by applying multivariate pattern classification to multichannel ERPs in two cued visual spatial attention experiments (N = 56): (a) the impact of cueing strategies (instructional vs. probabilistic) on attention control and selection and (b) the neural and behavioral effects of individual differences. Following cue onset, decoding accuracy (cue left vs. cue right) began to rise above chance level earlier and remained higher in instructional cueing (~80 ms) than in probabilistic cueing (~160 ms), suggesting that unilateral attention focus leads to earlier and more distinct formation of the attention control set. A similar temporal sequence was also found for target‐related processing (cued target vs. uncued target), suggesting earlier and stronger attention selection under instructional cueing. Across the two experiments: (a) individuals with higher cue‐related decoding accuracy showed a higher magnitude of attentional modulation of target‐evoked N1 amplitude, suggesting that better formation of the anticipatory attentional state leads to stronger modulation of target processing, and (b) individuals with higher target‐related decoding accuracy showed faster reaction times (or larger cueing effects), suggesting that stronger selection of task‐relevant information leads to better behavioral performance. Taken together, multichannel ERPs combined with machine-learning decoding yield new insights into attention control and selection that complement the univariate ERP approach and, together with it, provide a more comprehensive methodology for the study of visual spatial attention.

Covert spatial attention can be decoded from multichannel event‐related potential patterns during both attention control (cue left vs. cue right) and attention selection (cued target vs. uncued target). Decoding accuracy (cue left vs. cue right) during attention control predicted the magnitude of attentional modulation of the target‐related N1 component. Decoding accuracy (cued target vs. uncued target) during attention selection predicted behavioral performance.
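The abstract describes time-resolved multivariate pattern classification: at each post-cue time point, a classifier is trained on the pattern of voltages across channels to discriminate cue left vs. cue right, and the cross-validated accuracy is compared against the 50% chance level. The authors' exact pipeline is not reproduced here; the following is a minimal sketch on synthetic data, assuming scikit-learn, with hypothetical epoch dimensions and an artificial condition-dependent pattern injected after a nominal "cue onset" time index.

```python
# Sketch of time-resolved decoding of cue direction from multichannel
# ERP epochs, on synthetic data (illustration only, not the paper's pipeline).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 120, 32, 50   # hypothetical epoch dimensions
y = rng.integers(0, 2, n_trials)              # 0 = cue left, 1 = cue right

# Synthetic epochs: Gaussian noise plus a condition-dependent spatial
# pattern that appears only after "cue onset" (time index 20), so decoding
# accuracy should rise above chance from that point on.
X = rng.standard_normal((n_trials, n_channels, n_times))
pattern = rng.standard_normal(n_channels)
X[:, :, 20:] += 0.8 * np.outer(2 * y - 1, pattern)[:, :, None]

# At each time point, classify cue direction from the channel pattern
# with 5-fold cross-validation; chance level is 0.5.
accuracy = np.array([
    cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])

print(accuracy[:20].mean())   # pre-onset window: near chance
print(accuracy[20:].mean())   # post-onset window: above chance
```

The time course of `accuracy` is the kind of curve on which an onset latency (the ~80 ms vs. ~160 ms contrast in the abstract) can be read off as the first time point at which accuracy reliably exceeds chance.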
DOI: 10.1002/hbm.25094
PMID: 32542852
Publisher: John Wiley &amp; Sons, Inc (Hoboken, USA)
ISSN: 1065-9471
EISSN: 1097-0193
Subjects: Accuracy; Attention; Auditory evoked potentials; decoding; Event-related potentials; Learning algorithms; Machine learning; Modulation; pattern classification; Questions; reaction time; selective attention; Visual perception; Visual stimuli