Tracking the Costs of Clear and Loud Speech: Interactions Between Speech Motor Control and Concurrent Visuomotor Tracking
Purpose: Prior work has demonstrated that competing tasks impact habitual speech production. The purpose of this investigation was to quantify the extent to which clear and loud speech are affected by concurrent performance of an attention-demanding task. Method: Speech kinematics and acoustics were collected while participants spoke using habitual, loud, and clear speech styles. The styles were performed in isolation and while performing a secondary tracking task. Results: Compared to the habitual style, speakers exhibited expected increases in lip aperture range of motion and speech intensity for the clear and loud styles. During concurrent visuomotor tracking, there was a decrease in lip aperture range of motion and speech intensity for the habitual style. Tracking performance during habitual speech did not differ from single-task tracking. For loud and clear speech, speakers retained the gains in speech intensity and range of motion, respectively, while concurrently tracking. A reduction in tracking performance was observed during concurrent loud and clear speech, compared to tracking alone. Conclusions: These data suggest that loud and clear speech may help to mitigate motor interference associated with concurrent performance of an attention-demanding task. Additionally, reductions in tracking accuracy observed during concurrent loud and clear speech may suggest that these higher effort speaking styles require greater attentional resources than habitual speech.
Saved in:
Published in: | Journal of speech, language, and hearing research, 2021-06, Vol.64, p.2182-2195 |
---|---|
Main authors: | Whitfield, Jason A; Holdosh, Serena R; Kriegel, Zoe; Sullivan, Lauren E; Fullenkamp, Adam M |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 2195 |
---|---|
container_issue | |
container_start_page | 2182 |
container_title | Journal of speech, language, and hearing research |
container_volume | 64 |
creator | Whitfield, Jason A; Holdosh, Serena R; Kriegel, Zoe; Sullivan, Lauren E; Fullenkamp, Adam M |
description | Purpose: Prior work has demonstrated that competing tasks impact habitual speech production. The purpose of this investigation was to quantify the extent to which clear and loud speech are affected by concurrent performance of an attention-demanding task. Method: Speech kinematics and acoustics were collected while participants spoke using habitual, loud, and clear speech styles. The styles were performed in isolation and while performing a secondary tracking task. Results: Compared to the habitual style, speakers exhibited expected increases in lip aperture range of motion and speech intensity for the clear and loud styles. During concurrent visuomotor tracking, there was a decrease in lip aperture range of motion and speech intensity for the habitual style. Tracking performance during habitual speech did not differ from single-task tracking. For loud and clear speech, speakers retained the gains in speech intensity and range of motion, respectively, while concurrently tracking. A reduction in tracking performance was observed during concurrent loud and clear speech, compared to tracking alone. Conclusions: These data suggest that loud and clear speech may help to mitigate motor interference associated with concurrent performance of an attention-demanding task. Additionally, reductions in tracking accuracy observed during concurrent loud and clear speech may suggest that these higher effort speaking styles require greater attentional resources than habitual speech. |
doi_str_mv | 10.1044/2020_JSLHR-20-00264 |
format | Article |
fullrecord | Raw ProQuest source record (XML), duplicating the bibliographic fields listed above and below. Additional details recoverable from the record: peer reviewed; publisher: Rockville: American Speech-Language-Hearing Association; rights: Copyright American Speech-Language-Hearing Association Jun 2021; sources: Education Source, Alma/SFX Local Collection. |
fulltext | fulltext |
identifier | EISSN: 1558-9102 |
ispartof | Journal of speech, language, and hearing research, 2021-06, Vol.64, p.2182-2195 |
issn | 1558-9102 |
language | eng |
recordid | cdi_proquest_journals_2563495760 |
source | Education Source; Alma/SFX Local Collection |
subjects | Acoustics; Articulation (Speech); Attention; Communication; Direct Instruction; Dysarthria; Hearing Impairments; Hearing loss; Kinematics; Learning transfer; Lips; Motion; Noise; Range of motion; Sound intensity; Speaking; Speech; Speech motor control; Speech production; Speech styles; Teaching Methods |
title | Tracking the Costs of Clear and Loud Speech: Interactions Between Speech Motor Control and Concurrent Visuomotor Tracking |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T12%3A03%3A35IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Tracking%20the%20Costs%20of%20Clear%20and%20Loud%20Speech:%20Interactions%20Between%20Speech%20Motor%20Control%20and%20Concurrent%20Visuomotor%20Tracking&rft.jtitle=Journal%20of%20speech,%20language,%20and%20hearing%20research&rft.au=Whitfield,%20Jason%20A&rft.date=2021-06-01&rft.volume=64&rft.spage=2182&rft.epage=2195&rft.pages=2182-2195&rft.eissn=1558-9102&rft_id=info:doi/10.m44/2020_JSLHR-20-00264&rft_dat=%3Cproquest%3E2563495760%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2563495760&rft_id=info:pmid/&rfr_iscdi=true |