Supra-normal skills in processing of visuo-auditory prosodic information by cochlear-implanted deaf patients

• Cochlear-implanted (CI) patients present deficits in processing auditory and visual prosodic cues.
• CI patients are much more proficient than controls at fusing visual and auditory prosodic cues.
• CI patients develop specific oculomotor strategies for faces while processing prosodic information.
• CI patients present supra-normal skills for multisensory integration of speech-related information.

Bibliographic details
Published in: Hearing Research, 2021-10, Vol. 410, Article 108330, p. 108330
Authors: Lasfargues-Delannoy, Anne; Strelnikov, Kuzma; Deguine, Olivier; Marx, Mathieu; Barone, Pascal
Format: Article
Language: English
Subjects: Acoustic Stimulation; Cochlear implant; Cochlear Implantation; Cochlear Implants; Cognitive science; Deafness - diagnosis; Deafness - surgery; Eye-tracking; Humans; Multisensory integration; Prosody; Psychology; Speech Perception
ISSN: 0378-5955; EISSN: 1878-5891
DOI: 10.1016/j.heares.2021.108330
PMID: 34492444
Online access: Full text

Abstract
Cochlear-implanted (CI) adults with acquired deafness are known to depend on multisensory integration (MSI) skills for speech comprehension, fusing speech-reading with their deficient auditory perception. However, little is known about how CI patients perceive prosodic information related to speech content. Our study aimed to identify how CI patients use MSI between visual and auditory information to process the paralinguistic prosodic content of multimodal speech, and which visual strategies they employ. A psychophysics assessment was developed in which CI patients and normal-hearing (NH) controls had to distinguish a question from a statement. The controls were separated into two age groups (young and age-matched) to dissociate any effect of aging. In addition, the oculomotor strategies used when facing a speaker in this prosodic decision task were recorded with an eye-tracking device and compared across groups. The study confirmed that prosodic processing is multisensory, and it revealed that CI patients showed significant supra-normal audiovisual integration of prosodic information compared with hearing controls, irrespective of age: the visuo-auditory gain of CI patients was more than three times larger than that observed in hearing controls. Furthermore, CI participants performed better in the visuo-auditory condition through a specific oculomotor exploration of the face: they fixated the mouth region significantly more than young NH participants, who fixated the eyes, whereas the age-matched controls showed an intermediate pattern with fixations split equally between the eyes and the mouth. In conclusion, our study demonstrates that CI patients have supra-normal MSI skills when integrating visual and auditory linguistic prosodic information, and that they develop a specific adaptive strategy that contributes directly to the comprehension of speech content.
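For readers who want to see what a "visuo-auditory gain" of this kind looks like numerically, the sketch below illustrates one common way multisensory gain is quantified: bimodal (audiovisual) accuracy compared against the best unisensory accuracy. The record does not state which formula the authors used, so the functions `multisensory_gain` and `relative_gain` and the example scores are illustrative assumptions, not the study's method or data.

```python
# Minimal sketch of a visuo-auditory gain computation (assumed convention:
# gain measured relative to the best unisensory score). The numbers below
# are illustrative placeholders, not data from the study.

def multisensory_gain(auditory: float, visual: float, audiovisual: float) -> float:
    """Absolute audiovisual gain over the best unisensory accuracy (proportions in [0, 1])."""
    return audiovisual - max(auditory, visual)

def relative_gain(auditory: float, visual: float, audiovisual: float) -> float:
    """Gain normalised by the room left for improvement above the best unisensory score."""
    best_unisensory = max(auditory, visual)
    return (audiovisual - best_unisensory) / (1.0 - best_unisensory)

if __name__ == "__main__":
    # Hypothetical accuracies for a question-vs-statement decision task.
    ci = {"auditory": 0.70, "visual": 0.60, "audiovisual": 0.88}
    nh = {"auditory": 0.92, "visual": 0.58, "audiovisual": 0.94}

    for label, scores in (("CI", ci), ("NH", nh)):
        print(label,
              f"absolute gain = {multisensory_gain(**scores):.2f}",
              f"relative gain = {relative_gain(**scores):.2f}")
```

Under these assumed numbers the CI group's relative gain would be several times that of the NH group, which is the kind of comparison the abstract's "more than three times larger" statement refers to; the actual values are reported in the article itself.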