Coordinating spatial referencing using shared gaze

Detailed description

Bibliographic details
Published in: Psychonomic Bulletin & Review, 2010-10, Vol. 17 (5), p. 718-724
Main authors: Neider, Mark B.; Chen, Xin; Dickinson, Christopher A.; Brennan, Susan E.; Zelinsky, Gregory J.
Format: Article
Language: English
Subjects:
Online access: Full text
description To better understand the problem of referencing a location in space under time pressure, we had two remotely located partners (A, B) attempt to locate and reach consensus on a sniper target, which appeared randomly in the windows of buildings in a pseudorealistic city scene. The partners were able to communicate using speech alone (shared voice), gaze cursors alone (shared gaze), or both. In the shared-gaze conditions, a gaze cursor representing Partner A’s eye position was superimposed over Partner B’s search display and vice versa. Spatial referencing times (for both partners to find and agree on targets) were faster with shared gaze than with speech, with this benefit due primarily to faster consensus (less time needed for one partner to locate the target after it was located by the other partner). These results suggest that sharing gaze can be more efficient than speaking when people collaborate on tasks requiring the rapid communication of spatial information. Supplemental materials for this article may be downloaded from http://pbr.psychonomic-journals.org/content/supplemental.
doi 10.3758/PBR.17.5.718
format Article
publisher New York: Springer New York
pmid 21037172
rights Psychonomic Society, Inc. 2010; 2015 INIST-CNRS; Copyright Springer Science & Business Media Oct 2010
identifier ISSN: 1069-9384; EISSN: 1531-5320
ispartof Psychonomic bulletin & review, 2010-10, Vol.17 (5), p.718-724
language eng
source MEDLINE; SpringerNature Journals; EZB-FREE-00999 freely available EZB journals
subjects Behavioral Science and Psychology
Biological and medical sciences
Brief Reports
Cognitive Psychology
Experimental psychology
Fixation, Ocular - physiology
Fundamental and applied biological sciences. Psychology
Humans
Interpersonal Relations
Nonverbal communication
Perception
Photic Stimulation
Psychology
Psychology. Psychoanalysis. Psychiatry
Psychology. Psychophysiology
Psychomotor Performance - physiology
Reaction Time - physiology
Sensory perception
Social interactions. Communication. Group processes
Social psychology
Space Perception - physiology
Speech
Theory
Vision
title Coordinating spatial referencing using shared gaze