The sound of motion in spoken language: Visual information conveyed by acoustic properties of speech

Language is generally viewed as conveying information through symbols whose form is arbitrarily related to their meaning. This arbitrary relation is often assumed to also characterize the mental representations underlying language comprehension. We explore the idea that visuo-spatial information can be analogically conveyed through acoustic properties of speech and that such information is integrated into an analog perceptual representation as a natural part of comprehension. Listeners heard sentences describing objects, spoken at varying speaking rates. After each sentence, participants saw a picture of an object and judged whether it had been mentioned in the sentence. Participants were faster to recognize the object when motion implied by speaking rate matched the motion implied by the picture. Results suggest that visuo-spatial referential information can be analogically conveyed and represented.

Bibliographic details
Published in: Cognition, 2007-12, Vol. 105 (3), p. 681-690
Main authors: Shintel, Hadas; Nusbaum, Howard C.
Format: Article
Language: English
Online access: Full text
description Language is generally viewed as conveying information through symbols whose form is arbitrarily related to their meaning. This arbitrary relation is often assumed to also characterize the mental representations underlying language comprehension. We explore the idea that visuo-spatial information can be analogically conveyed through acoustic properties of speech and that such information is integrated into an analog perceptual representation as a natural part of comprehension. Listeners heard sentences describing objects, spoken at varying speaking rates. After each sentence, participants saw a picture of an object and judged whether it had been mentioned in the sentence. Participants were faster to recognize the object when motion implied by speaking rate matched the motion implied by the picture. Results suggest that visuo-spatial referential information can be analogically conveyed and represented.
doi 10.1016/j.cognition.2006.11.005
publisher Elsevier B.V., Amsterdam
pmid 17196190
coden CGTNAU
ericid EJ776925
identifier ISSN: 0010-0277
identifier EISSN: 1873-7838
source MEDLINE; Elsevier ScienceDirect Journals
subjects Acoustics
Biological and medical sciences
Cognition
Cognitive Processes
Fundamental and applied biological sciences. Psychology
Humans
Language
Linguistics
Listening Comprehension
Motion
Object
Oral Language
Perceptual representations
Pictorial Stimuli
Production and perception of spoken language
Prosody
Psychology. Psychoanalysis. Psychiatry
Psychology. Psychophysiology
Sentences
Sound
Spatial Ability
Speech
Speech Acoustics
Speech Communication
Speech Perception
Spoken language comprehension
Task Analysis
Visual Perception