Do you hear what I see? An audio-visual paradigm to assess emotional egocentricity bias


Bibliographic Details

Published in: Cognition and emotion, 2020-06, Vol. 34 (4), p. 756-770
Main authors: von Mohr, Mariana; Finotti, Gianluca; Ambroziak, Klaudia B.; Tsakiris, Manos
Format: Article
Language: English
Online access: Full text
description We often use our own emotions to understand other people's emotions. However, emotional egocentric biases (EEB), namely the tendency to use one's own emotional state when relating to others' emotions, may hinder this process, especially when emotions are incongruent. We capitalised on the classic EEB task to develop a new version that is easier to implement and control. Unlike the original EEB task that relies on a combination of private (e.g. touch) and public (e.g. vision) sensory information, our EEB task (AV-EEB) used audio-visual stimuli to evoke congruent/incongruent emotions in participants. Auditory and visual signals are both public, in that they can be shared among individuals, and make the task easier to implement and control. We provide lab-based and online validations of the AV-EEB, and demonstrate a positive relationship between EEB and social negative potency. This new, easily implemented version of the EEB task can accelerate the investigation of egocentricity biases in several research areas.
doi 10.1080/02699931.2019.1683516
publisher Routledge (Taylor &amp; Francis), England
pmid 31672095
issn 0269-9931
eissn 1464-0600
source Applied Social Sciences Index & Abstracts (ASSIA); MEDLINE; Business Source Complete
subjects Acoustic Stimulation - methods
Adult
alexithymia
Bias
body awareness
Egocentrism
Emotional egocentricity bias
Emotions
Female
Humans
Male
Photic Stimulation - methods
Psychological Tests
Sensory integration
social reward
Visual stimuli
Young Adult