Modeling Rational Adaptation of Visual Search to Hierarchical Structures
Saved in:
Main author(s):
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Efficient attention deployment in visual search is limited by human visual memory, yet this limitation can be offset by exploiting the environment's structure. This paper introduces a computational cognitive model that simulates how the human visual system uses visual hierarchies to prevent refixations in sequential attention deployment. The model adopts computational rationality, positing behaviors as adaptations to cognitive constraints and environmental structures. In contrast to earlier models that predict search performance for hierarchical information, our model does not include predefined assumptions about particular search strategies. Instead, its search strategy emerges from adaptation to the environment through reinforcement learning algorithms. In an experiment with human participants, we test the model's prediction that structured environments reduce visual search times compared to random tasks. Our model's predictions correspond well with human search performance across various set sizes for both structured and unstructured visual layouts. Our work improves understanding of the adaptive nature of visual search in hierarchically structured environments and informs the design of optimized search spaces.
DOI: 10.48550/arxiv.2409.08967