How Do People Find Pairs?
Humans continuously scan their visual environment for relevant information. Such visual search behavior has typically been studied with tasks in which the search goal is constant and well-defined, requiring relatively little interplay between memory and orienting. Here we studied a situation in whic...
Saved in:
Published in: | Journal of experimental psychology. General 2023-08, Vol.152 (8), p.2190-2204 |
---|---|
Main authors: | Li, Aoqi ; Chen, Zhenzhong ; Wolfe, Jeremy M. ; Olivers, Christian N. L. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 2204 |
---|---|
container_issue | 8 |
container_start_page | 2190 |
container_title | Journal of experimental psychology. General |
container_volume | 152 |
creator | Li, Aoqi ; Chen, Zhenzhong ; Wolfe, Jeremy M. ; Olivers, Christian N. L. |
description | Humans continuously scan their visual environment for relevant information. Such visual search behavior has typically been studied with tasks in which the search goal is constant and well-defined, requiring relatively little interplay between memory and orienting. Here we studied a situation in which the target is not known in advance, and instead, memory needs to be dynamically updated during the actual search. Observers compared two simultaneously presented arrays of objects for any matching pair of items, a task that requires continuous comparisons between what is seen now and what was seen a few moments ago. To manipulate the balance between memorizing and scanning, we ran two versions of the task. In an eye-tracking version, the objects were continuously available and could be scanned with relative ease. The results suggested that observers preferred scanning over memorizing. In a mouse-tracking version, perceptual availability was limited, and scanning was slowed. Now observers substantially increased their memory use. Thus, the results revealed a flexible and dynamic interplay between memory and perception. The findings aid in further bridging the research fields of attention and memory.
Public Significance Statement
Human observers can search their environment on the basis of abstract rules that demand a dynamic interplay between memory and perception, but this ability has received little investigation. This study uses the tools of the visual search literature to reveal how humans perform a task based on such an abstract rule; in this case, "find any matching pair of objects." The results demonstrate a continuous and flexible trade-off between internal memorizing and external perceptual sampling, depending on the balance of costs associated with these processes. |
doi_str_mv | 10.1037/xge0001390 |
format | Article |
fullrecord | PMID: 36951742 ; DOI: 10.1037/xge0001390 ; ISSN: 0096-3445 ; EISSN: 1939-2222 ; Publisher: American Psychological Association, United States ; Contributor: Brown-Schmidt, Sarah ; Rights: 2023 American Psychological Association ; ORCID: 0000-0001-6373-6638 |
fulltext | fulltext |
identifier | ISSN: 0096-3445 |
ispartof | Journal of experimental psychology. General, 2023-08, Vol.152 (8), p.2190-2204 |
issn | 0096-3445 ; 1939-2222 |
language | eng |
recordid | cdi_proquest_miscellaneous_2790050530 |
source | APA PsycARTICLES |
subjects | Attention ; Female ; Human ; Long Term Memory ; Male ; Memory ; Observers ; Short Term Memory ; Visual Memory ; Visual perception ; Visual Search ; Visual task performance |
title | How Do People Find Pairs? |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-11T17%3A36%3A43IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=How%20Do%20People%20Find%20Pairs?&rft.jtitle=Journal%20of%20experimental%20psychology.%20General&rft.au=Li,%20Aoqi&rft.date=2023-08-01&rft.volume=152&rft.issue=8&rft.spage=2190&rft.epage=2204&rft.pages=2190-2204&rft.issn=0096-3445&rft.eissn=1939-2222&rft_id=info:doi/10.1037/xge0001390&rft_dat=%3Cproquest_cross%3E2789791047%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2789791047&rft_id=info:pmid/36951742&rfr_iscdi=true |