Adaptive cognitive maps for curved surfaces in the 3D world

Bibliographic Details
Published in: Cognition, 2022-08, Vol. 225, Article 105126
Authors: Kim, Misun; Doeller, Christian F.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Terrains in a 3D world can be undulating. Yet, most prior research has investigated spatial representations exclusively on flat surfaces, leaving the 2D cognitive map as the dominant model in the field. Here, we investigated whether humans represent a curved surface by building a dimension-reduced, flattened 2D map or a full 3D map. Participants learned the locations of objects positioned on a flat and a curved surface in a virtual environment by driving on the concave side of the surface (Experiment 1), driving and looking vertically (Experiment 2), or flying (Experiment 3). Subsequently, they were asked to retrieve either the path distance or the 3D Euclidean distance between the objects. Path distance estimation was good overall, but we found a significant underestimation bias for path distance on the curved surface, suggesting an influence of potential 3D shortcuts, even though participants had only driven on the surface. Euclidean distance estimation was better when participants were exposed more to the global 3D structure of the environment by looking and flying. These results suggest that the representation of a 2D manifold embedded in a 3D world is neither purely 2D nor 3D; rather, it is flexible and depends on behavioral experience and demands.
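To make the contrast between the two distance measures concrete, the sketch below compares the path (geodesic) distance along a curved surface with the straight-line 3D Euclidean distance between the same two points. The circular-arc geometry, radius, and angle are illustrative assumptions, not the surface used in the experiments; the point is only the geometric fact that the 3D "shortcut" (the chord) is always shorter than the path along the surface, so a pull toward it would appear as an underestimation of path distance.

```python
# Illustrative sketch (not from the paper): path distance vs. 3D Euclidean
# distance for two points on a curved surface, approximated here by a
# circular arc of radius r. All numbers are arbitrary assumptions.
import numpy as np

def arc_vs_chord(radius: float, angle_rad: float) -> tuple[float, float]:
    """Return (path_distance, euclidean_distance) for two points on a
    circular arc separated by the central angle `angle_rad`."""
    path = radius * angle_rad                       # geodesic along the surface
    euclid = 2.0 * radius * np.sin(angle_rad / 2)   # straight-line chord in 3D
    return path, euclid

if __name__ == "__main__":
    # Two objects a quarter-circle apart on a surface of radius 50 (arbitrary units).
    path, euclid = arc_vs_chord(radius=50.0, angle_rad=np.pi / 2)
    print(f"path distance      = {path:.1f}")    # ~78.5
    print(f"Euclidean distance = {euclid:.1f}")  # ~70.7, the shorter 3D shortcut
```

The chord is shorter than its arc for any nonzero central angle, so any mixing of the 3D shortcut into a path-distance judgment can only bias the estimate downward, which matches the direction of the bias reported in the abstract.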
ISSN: 0010-0277
eISSN: 1873-7838
DOI: 10.1016/j.cognition.2022.105126