Attentional Priority Is Determined by Predicted Feature Distributions
| Published in: | Journal of experimental psychology. Human perception and performance 2022-11, Vol.48 (11), p.1201-1212 |
|---|---|
| Main authors: | , |
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online access: | Full text |
Abstract: Visual attention is often characterized as being guided by precise memories for target objects. However, real-world search targets have dynamic features that vary over time, meaning that observers must predict how the target could look based on how features are expected to change. Despite its importance, little is known about how target feature predictions influence feature-based attention, or how these predictions are represented in the target template. In Experiment 1 (N = 60 university students), we show observers readily track the statistics of target features over time and adapt attentional priority to predictions about the distribution of target features. In Experiments 2a and 2b (N = 480 university students), we show that these predictions are encoded into the target template as a distribution of likelihoods over possible target features, which are independent of memory precision for the cued item. These results provide a novel demonstration of how observers represent predicted feature distributions when target features are uncertain and show that these predictions are used to set attentional priority during visual search.
Public Significance Statement
Theories of attention and working memory posit that when we engage in complex cognitive tasks, our performance is determined by how precisely we remember task-relevant information. However, in the real world, properties of objects change over time, creating uncertainty about many aspects of the task. There is currently a gap in our understanding of how cognitive systems overcome this uncertainty when engaging in common behaviors like visual search. In two studies we show that when searching for target objects, observers readily learn the distribution of possible target features and leverage this information to make predictions about which features will best guide attention in the upcoming search. Further, we show that these predictions are distinct from memory, and uniquely influence attention when search targets are uncertain. These results help advance theories of attention and working memory by explaining how we use learning and prediction to overcome uncertainty in the environment.
ISSN: 0096-1523, 1939-1277
DOI: 10.1037/xhp0001041