A visual or tactile signal makes auditory speech detection more efficient by reducing uncertainty


Detailed description

Bibliographic details
Published in: The European journal of neuroscience, 2014-04, Vol. 39 (8), p. 1323-1331
Authors: Tjan, Bosco S.; Chao, Ewen; Bernstein, Lynne E.
Format: Article
Language: English
Online access: Full text
Abstract: Acoustic speech is easier to detect in noise when the talker can be seen. This finding could be explained by integration of multisensory inputs or refinement of auditory processing from visual guidance. In two experiments, we studied two‐interval forced‐choice detection of an auditory ‘ba’ in acoustic noise, paired with various visual and tactile stimuli that were identically presented in the two observation intervals. Detection thresholds were reduced under the multisensory conditions vs. the auditory‐only condition, even though the visual and/or tactile stimuli alone could not inform the correct response. Results were analysed relative to an ideal observer for which intrinsic (internal) noise and efficiency were independent contributors to detection sensitivity. Across experiments, intrinsic noise was unaffected by the multisensory stimuli, arguing against the merging (integrating) of multisensory inputs into a unitary speech signal, but sampling efficiency was increased to varying degrees, supporting refinement of knowledge about the auditory stimulus. The steepness of the psychometric functions decreased with increasing sampling efficiency, suggesting that the ‘task‐irrelevant’ visual and tactile stimuli reduced uncertainty about the acoustic signal. Visible speech was not superior for enhancing auditory speech detection. Our results reject multisensory neuronal integration and speech‐specific neural processing as explanations for the enhanced auditory speech detection under noisy conditions. Instead, they support a more rudimentary form of multisensory interaction: the otherwise task‐irrelevant sensory systems inform the auditory system about when to listen.

When acoustic speech is buried in noise, a task‐irrelevant visual and/or vibrotactile stimulus can enhance its detectability. Within an ideal observer model, enhancement is attributable to reduced noise intrinsic to the perceptual system and/or improved statistical sampling efficiency. Experiments here support only improved efficiency via uncertainty reduction and offer no evidence for change in internal noise. This pattern of results argues against enhancement due to multisensory integration.
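The ideal-observer framework the abstract refers to is often written as E = (d'^2 / η)(N_ext + N_eq): the signal energy E needed to reach a criterion sensitivity d' grows with the external noise level N_ext plus an equivalent internal noise N_eq, and shrinks with sampling efficiency η. The paper's claim is that the visual/tactile cues raise η by reducing uncertainty about when the signal occurs, while leaving N_eq unchanged. The short Python simulation below is an illustrative sketch of that uncertainty account, not the authors' code or model: a two-interval forced-choice observer monitors M noisy channels and picks the interval with the larger maximum response. The channel counts, noise levels, and max-rule decision are assumptions chosen purely for illustration.

# Toy simulation of the uncertainty account of multisensory enhancement.
# A 2IFC observer monitors M channels per interval; each channel carries
# Gaussian external plus internal noise, and in the signal interval one
# channel additionally carries the signal. Reducing M (less uncertainty)
# lowers the detection threshold and flattens the psychometric function,
# while the internal noise sigma_int stays fixed across conditions.
import numpy as np

rng = np.random.default_rng(0)

def percent_correct(signal, M, sigma_ext=1.0, sigma_int=0.5, n_trials=20000):
    """Proportion correct in 2IFC for a max-of-M-channels observer.

    signal    -- signal amplitude added to one channel of the signal interval
    M         -- number of channels monitored (stimulus uncertainty)
    sigma_ext -- external (stimulus) noise s.d.
    sigma_int -- internal noise s.d., held constant across conditions
    """
    # Independent noise sources add in variance.
    sigma = np.hypot(sigma_ext, sigma_int)
    # Signal interval: channel 0 carries the signal (the observer does
    # not know which channel, hence the max rule below).
    sig_interval = rng.normal(0.0, sigma, (n_trials, M))
    sig_interval[:, 0] += signal
    # Noise interval: noise only.
    noise_interval = rng.normal(0.0, sigma, (n_trials, M))
    # Choose the interval whose largest channel response is bigger.
    correct = sig_interval.max(axis=1) > noise_interval.max(axis=1)
    return correct.mean()

amplitudes = np.linspace(0.0, 4.0, 9)
for M in (1, 4, 64):  # M = 1 is the fully cued, low-uncertainty observer
    pc = [percent_correct(a, M) for a in amplitudes]
    print(f"M={M:3d}: " + " ".join(f"{p:.2f}" for p in pc))

Running the sketch shows the qualitative signature the abstract describes: with M = 64 (high uncertainty, analogous to the auditory-only condition) proportion correct stays near chance until a comparatively large amplitude and then rises steeply, whereas with M = 1 the threshold is lower and the psychometric function shallower, all without any change to the internal-noise parameter.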
DOI: 10.1111/ejn.12471
PMID: 24400652
ISSN: 0953-816X
EISSN: 1460-9568
Source: MEDLINE; Wiley Online Library Journals Frontfile Complete
Subjects:
Adult
Biological and medical sciences
Female
Fundamental and applied biological sciences. Psychology
Humans
ideal-observer analysis
Male
Models, Neurological
multisensory enhancement
speech detection
Speech Perception - physiology
Touch
Touch Perception
Uncertainty
Vertebrates: nervous system and sense organs
Vision, Ocular
Visual Perception