VAM: A neuro-cognitive model for visual attention control of segmentation, object recognition, and space-based motor action

Bibliographic Details

Published in: Visual Cognition, 1995-06, Vol. 2 (2-3), p. 331-376
Author: Schneider, Werner X.
Format: Article
Language: English
Online access: Full text
Description

Abstract: This paper introduces a new neuro-cognitive Visual Attention Model, called VAM. It is a model of visual attention control of segmentation, object recognition, and space-based motor action. VAM is concerned with two main functions of visual attention, namely "selection-for-object-recognition" and "selection-for-space-based-motor-action". The attentional control processes that perform these two functions restructure the results of stimulus-driven and local perceptual grouping and segregation processes, the "visual chunks", in such a way that one visual chunk is globally segmented and implemented as an "object token". This attentional segmentation solves the "inter- and intra-object-binding problem". It can be controlled by higher-level visual modules of the what-pathway (e.g. V4/IT) and/or the where-pathway (e.g. PPC) that contain relatively invariant "type-level" information (e.g. an alphabet of shape primitives, colors with constancy, locations for space-based motor actions). What-based attentional control succeeds only if there is exactly one object in the visual scene whose type-level features match the intended target object description. Otherwise, where-based attention is required, which can serially scan one object location after another.
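The selection logic summarized in the abstract can be read as a simple control procedure. The Python sketch below is not taken from the paper; the names (VisualChunk, select_target), the representation of type-level features as sets, and the serial scan order are illustrative assumptions. It only shows how what-based selection might fall back to a serial, where-based scan of object locations when the target description does not single out a unique object.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class VisualChunk:
        """A stimulus-driven perceptual group: a location plus type-level features.
        Illustrative stand-in for VAM's 'visual chunks'; not the paper's notation."""
        location: tuple       # spatial position, used by the where-pathway
        features: frozenset   # invariant type-level features (shape primitives, colors)

    def select_target(chunks: List[VisualChunk],
                      target_features: frozenset) -> Optional[VisualChunk]:
        """Toy reading of the two control routes for selecting one 'object token'.

        What-based route: if exactly one chunk's type-level features match the
        intended target description, it is selected directly.
        Where-based route: otherwise, candidate locations are scanned serially
        and each chunk's features are checked in turn.
        """
        # What-based control (type-level match, what-pathway, e.g. V4/IT)
        matches = [c for c in chunks if target_features <= c.features]
        if len(matches) == 1:
            return matches[0]      # unique match: what-based selection succeeds

        # Where-based control (serial scan of locations, where-pathway, e.g. PPC)
        for chunk in sorted(chunks, key=lambda c: c.location):  # scan order assumed
            if target_features <= chunk.features:
                return chunk       # first scanned location whose chunk matches
        return None                # no chunk matches the target description

The scan order here is arbitrary; the abstract only states that where-based attention can scan one object location after another, not how the scan sequence is chosen.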
ISSN: 1350-6285
EISSN: 1464-0716
DOI: 10.1080/13506289508401737