What you see isn’t always what you get: Auditory word signals trump consciously perceived words in lexical access
• Face-to-face speech involves both auditory and visual components.
• We test when auditory–visual integration occurs relative to lexical access.
• Lexical access precedes integration if the auditory signal is a real word.
• However, AV-integration still occurs as long as the two signals are compatible.
• Thus, a comprehender can lexically access one word yet consciously perceive a different word.

Published in: Cognition, 2016-06, Vol. 151, p. 96-107
Main authors: Ostrand, Rachel; Blumstein, Sheila E.; Ferreira, Victor S.; Morgan, James L.
Format: Article
Language: English
Online access: Full text
Abstract: Human speech perception often includes both an auditory and a visual component. A conflict between these signals can result in the McGurk illusion, in which the listener perceives a fusion of the two streams, implying that information from both has been integrated. We report two experiments investigating whether auditory–visual integration of speech occurs before or after lexical access, and whether the visual signal influences lexical access at all. Subjects were presented with McGurk or Congruent primes and performed a lexical decision task on related or unrelated targets. Although subjects perceived the McGurk illusion, McGurk and Congruent primes with matching real-word auditory signals equivalently primed targets that were semantically related to the auditory signal, but not targets related to the McGurk percept. We conclude that the time course of auditory–visual integration depends on the lexicality of the auditory and visual input signals, and that listeners can lexically access one word and yet consciously perceive another.
DOI: 10.1016/j.cognition.2016.02.019
Publisher: Elsevier B.V. (Netherlands)
PMID: 27011021
ISSN: 0010-0277
EISSN: 1873-7838
Source: MEDLINE; Elsevier ScienceDirect Journals
Subjects: Acoustic Stimulation - methods; Adolescent; Adult; Auditory Perception - physiology; Auditory–visual integration; Female; Humans; Lexical access; Male; McGurk effect; Multisensory perception; Photic Stimulation - methods; Pilot Projects; Random Allocation; Reaction Time - physiology; Speech Perception - physiology; Visual Perception - physiology; Young Adult