Near-term memory in programming: a simulation-based analysis


Full description

Bibliographic details
Published in: International journal of human-computer studies, 2001-02, Vol. 54 (2), p. 189-210
Main author: ALTMANN, ERIK M.
Format: Article
Language: English
Online access: Full text
Description
Summary: Near-term memory (NTM) is proposed as a construct for analysing the memory that experts build up and use as they solve a problem in their domain of expertise. Large amounts of information are processed in such situations, and any particular detail could become important later, so performance is facilitated by maintaining long-term memory access to as much detail as possible. Precise analysis of such memory is difficult to achieve with experimentation or observation alone, so computational simulation is used as the analytical method. A computational process model grounded in cognitive theory (Soar) is constructed to fit extensive fine-grained behavioral data from an expert programmer. The model's structures and processes are then inspected for insights into NTM. Structurally, the model's NTM consists of fine-grain perceptual, semantic, and episodic items whose availability is tied to cues from the encoding context. Quantitatively, much more detail enters NTM than is ever retrieved, but when retrieval does occur it can change the course of behavior. To illustrate applications of the construct, the model is used to examine how a cluttered interface might impose cognitive costs by increasing retrieval demands on memory.
ISSN: 1071-5819
eISSN: 1095-9300
DOI: 10.1006/ijhc.2000.0407