Interplay of tactile and motor information in constructing spatial self-perception
| Published in | Current Biology 2022-03, Vol. 32 (6), p. 1301-1309.e3 |
|---|---|
| Main authors | , , , , |
| Format | Article |
| Language | English |
| Subjects | |
| Online access | Full text |
Abstract: During active movement, there is normally a tight relation between the motor command and the sensory representation of the resulting spatial displacement of the body. Indeed, some theories of space perception emphasize the topographic layout of sensory receptor surfaces, while others emphasize implicit spatial information provided by the intensity of motor command signals. To identify which has the primary role in spatial perception, we developed experiments based on everyday self-touch, in which the right hand strokes the left arm. We used a robot-mediated form of self-touch to decouple the spatial extent of active or passive right-hand movements from their tactile consequences. Participants made active movements of the right hand between unpredictable, haptically defined start and stop positions, or the hand was passively moved between the same positions. These movements caused a stroking tactile motion by a brush along the left forearm, with minimal delay, but with an unpredictable spatial gain factor. Participants judged the spatial extent of either the right hand's movement or of the resulting tactile stimulation to their left forearm. Across five experiments, we found that movement extent strongly interfered with tactile extent perception, and vice versa. Crucially, interference in both directions was stronger during active than passive movements. Thus, voluntary motor commands produced stronger integration of the multiple sensorimotor signals underpinning the perception of personal space. Our results prompt a reappraisal of classical theories that reduce space perception to motor command information.
Highlights:
• We used two coupled robots to manipulate spatial perceptions during self-touch
• We found strong interference of movement on tactile spatial percepts and vice versa
• This bidirectional interference was greater for active than for passive movements
• Active self-touch plays a key role in spatial coherence of bodily self-awareness
In brief: Tactile and motor signals are tightly linked in self-touch behaviors, making their separate contributions to perception hard to identify. Cataldo et al. use two linked robots to decouple motor from sensory components. Their results show that voluntary motor commands bind spatial information from movement and touch into a coherent spatial percept.
ISSN: 0960-9822, 1879-0445
DOI: 10.1016/j.cub.2022.01.047