Multisensory integration of speech and gestures in a naturalistic paradigm
Published in: Human Brain Mapping, 2024-08, Vol. 45 (11), p. e26797
Authors: Matyjek, Magdalena; Kita, Sotaro; Torralba Cuello, Mireia; Soto Faraco, Salvador
Format: Article
Language: English
Online access: Full text
Abstract: Speech comprehension is crucial for human social interaction, relying on the integration of auditory and visual cues across various levels of representation. While research has extensively studied multisensory integration (MSI) using idealised, well-controlled stimuli, this process also needs to be understood in response to the complex, naturalistic stimuli encountered in everyday life. This study investigated behavioural and neural MSI in neurotypical adults experiencing audio-visual speech within a naturalistic, social context. Our novel paradigm incorporated a broader social situational context, complete words, and speech-supporting iconic gestures, allowing for context-based pragmatics and semantic priors. We investigated MSI in the presence of unimodal (auditory or visual) or complementary, bimodal speech signals. During audio-visual speech trials, compared to unimodal trials, participants recognised spoken words more accurately and showed a more pronounced suppression of alpha power, an indicator of heightened integration load. Importantly, at the neural level these effects surpassed the mere summation of unimodal responses, suggesting non-linear MSI mechanisms. Overall, our findings demonstrate that typically developing adults integrate audio-visual speech and gesture information to facilitate speech comprehension in noisy environments, highlighting the importance of studying MSI in ecologically valid contexts.
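The abstract's claim that neural audio-visual effects "surpassed mere summation of unimodal responses" refers to the additive-model criterion common in MSI research. A minimal sketch of that criterion, assuming the conventional formulation (the symbols R_AV, R_A, and R_V are illustrative, not the authors' notation):

```latex
% Additive-model criterion for non-linear multisensory integration:
% compare the bimodal response with the sum of the unimodal responses.
%   R_AV -- response to combined audio-visual stimulation
%   R_A  -- response to auditory-only stimulation
%   R_V  -- response to visual-only stimulation
\[
  \Delta_{\mathrm{MSI}} = R_{AV} - \left( R_{A} + R_{V} \right)
\]
% Purely linear summation predicts \Delta_MSI = 0; a reliable
% deviation from zero indicates non-linear integration.
```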
Key points: We explored brain mechanisms underlying multisensory integration (MSI) of audio-visual speech and gestures in an ecologically valid context. Audio-visual trials, compared with audio-only or visual-only trials, produced more accurate word recognition and increased alpha suppression, indicating heightened integration load. This contributes to our understanding of MSI in real-life contexts.
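Alpha suppression of the kind reported here is conventionally quantified as event-related desynchronisation (ERD), the percentage drop of alpha-band (roughly 8-13 Hz) power relative to a pre-stimulus baseline. A sketch of that standard measure, assuming the classic formulation rather than the authors' exact pipeline:

```latex
% Event-related desynchronisation (ERD) in the alpha band:
%   P_alpha(t)  -- alpha-band power at time t after stimulus onset
%   P_base      -- mean alpha-band power in the pre-stimulus window
\[
  \mathrm{ERD}_{\alpha}(t) =
    \frac{P_{\alpha}(t) - P_{\mathrm{base}}}{P_{\mathrm{base}}}
    \times 100\%
\]
% Negative values indicate alpha suppression; more negative values
% correspond to the stronger suppression seen in audio-visual trials.
```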
DOI: 10.1002/hbm.26797
PMID: 39041175
Publisher: John Wiley & Sons, Inc., Hoboken, USA
ISSN: 1065-9471
EISSN: 1097-0193
Sources: MEDLINE; DOAJ Directory of Open Access Journals; Wiley Online Library Journals Frontfile Complete; Elektronische Zeitschriftenbibliothek (Electronic Journals Library, freely accessible e-journals); Wiley-Blackwell Open Access Titles; PubMed Central
Subjects: Acoustic Stimulation; Adult; Adults; Audio data; audio-visual speech; Brain - physiology; Comprehension - physiology; Context; EEG; Electroencephalography; Female; Gestures; Humans; iconic gestures; Information processing; Male; multisensory integration; Photic Stimulation - methods; Semantics; Sensory integration; Social behavior; Social factors; Speech; Speech - physiology; Speech Perception - physiology; Validity; Visual Perception - physiology; Visual stimuli; Voice recognition; Words (language); Young Adult
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-15T18%3A26%3A40IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_pubme&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Multisensory%20integration%20of%20speech%20and%20gestures%20in%20a%20naturalistic%20paradigm&rft.jtitle=Human%20brain%20mapping&rft.au=Matyjek,%20Magdalena&rft.date=2024-08-01&rft.volume=45&rft.issue=11&rft.spage=e26797&rft.epage=n/a&rft.pages=e26797-n/a&rft.issn=1065-9471&rft.eissn=1097-0193&rft_id=info:doi/10.1002/hbm.26797&rft_dat=%3Cproquest_pubme%3E3083681189%3C/proquest_pubme%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3093051156&rft_id=info:pmid/39041175&rfr_iscdi=true