Coding and use of tactile signals from the fingertips in object manipulation tasks
Published in: Nature Reviews Neuroscience, May 2009, Vol. 10 (5), pp. 345-359
Main authors: Johansson; Flanagan
Format: Article
Language: English
Online access: Full text
Summary: Everyday object manipulation tasks require the brain to interpret the signals from tactile afferents in the hands. Johansson and Flanagan describe our current understanding of this process, showing how tactile signals are used to control and refine manipulations.
Key Points
Object manipulation tasks comprise sequentially organized action phases that are generally delineated by distinct mechanical contact events representing task subgoals. To achieve these subgoals, the brain selects and implements action-phase controllers that use sensory predictions and afferent signals to tailor motor output in anticipation of requirements imposed by objects' physical properties.
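To make this phase structure concrete, here is a minimal Python sketch of such a task decomposition, using the familiar grasp-lift-replace example; the phase names, events and code structure are illustrative assumptions, not an implementation from the article.

```python
# Toy sketch (not from the article): an object-lifting task as a sequence
# of action phases, each terminated by a mechanical contact event that
# represents a task subgoal and gates the next action-phase controller.

ACTION_PHASES = [
    # (phase, contact event that marks its subgoal / completion)
    ("reach",   "digits contact the object"),
    ("load",    "grip and load forces reach lift-off level"),
    ("lift",    "object lifts off the support surface"),
    ("hold",    "transport/hold completed"),
    ("replace", "object contacts the support surface"),
    ("unload",  "digits release the object"),
]

def run_task() -> None:
    """Step through the phases; each subgoal event ends one controller
    and hands control to the next."""
    for phase, event in ACTION_PHASES:
        print(f"phase '{phase}' runs until event: {event}")

run_task()
```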
Crucial control operations are centred on events that mark transitions between action phases. At these events, the CNS both receives and makes predictions about sensory information from multiple sources. Mismatches between predicted and actual sensory outcomes can be used to quickly and flexibly launch corrective actions as required.
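The sketch below illustrates this mismatch principle in Python under stated assumptions: a predicted contact event (its timing and associated force) is compared with the actual sensed outcome, and a deviation beyond tolerance triggers a corrective action. The event fields, tolerance values and correction message are hypothetical placeholders, not the article's model.

```python
# Toy comparator for predicted vs. actual sensory outcomes of a contact
# event. All names, fields and tolerances are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ContactEvent:
    time_ms: float   # when the contact event occurs (ms into the phase)
    force_n: float   # contact force associated with the event (newtons)

def mismatch(predicted: ContactEvent, actual: ContactEvent,
             time_tol_ms: float = 30.0, force_tol_n: float = 0.2) -> bool:
    """True if the actual outcome deviates from the prediction."""
    return (abs(actual.time_ms - predicted.time_ms) > time_tol_ms or
            abs(actual.force_n - predicted.force_n) > force_tol_n)

def end_of_phase(predicted: ContactEvent, actual: ContactEvent) -> str:
    """Launch a corrective action on mismatch, else continue the sequence."""
    if mismatch(predicted, actual):
        return "corrective action: adjust fingertip forces, update prediction"
    return "proceed to next action phase"

# Example: the object is heavier than predicted, so lift-off occurs late.
print(end_of_phase(ContactEvent(120.0, 2.0), ContactEvent(185.0, 2.0)))
```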
Signals from tactile afferents provide rich information about both the timing and the physical nature of contact events. In addition, they encode information related to object properties, including the shape and texture of contacted surfaces and the frictional conditions between these surfaces and the skin.
A central question is how tactile afferent information is encoded and processed by the brain for the rapid detection and analysis of contact events. Recent evidence suggests that the relative timing of spikes in ensembles of tactile afferents provides such information fast enough to account for the speed with which tactile signals are used in object manipulation tasks.
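As one way to picture how relative spike timing could be read out quickly, the Python sketch below decodes a stimulus from the rank order of first-spike latencies across a small afferent ensemble. The template-matching scheme, ensemble size and latency values are invented for the example; the source only states that relative first-spike timing carries the information fast enough.

```python
# Illustrative rank-order readout of first-spike latencies (assumptions,
# not the article's model): classify a stimulus by comparing the observed
# first-spike order across afferents against stored template orders.
import numpy as np

def rank_order(first_spike_times: np.ndarray) -> np.ndarray:
    """Return afferent indices sorted by first-spike latency."""
    return np.argsort(first_spike_times)

def decode(observed: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Pick the template whose rank order best matches the observation,
    scored by (negative) squared disagreement between per-afferent ranks."""
    obs_ranks = np.argsort(rank_order(observed))
    best, best_score = "", -np.inf
    for label, tmpl in templates.items():
        tmpl_ranks = np.argsort(rank_order(tmpl))
        score = -np.sum((obs_ranks - tmpl_ranks) ** 2)
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical first-spike latencies (ms) for 5 afferents under two
# surface shapes; the observed pattern is a noisy version of "flat".
templates = {
    "flat":   np.array([12.0, 15.0, 11.0, 20.0, 17.0]),
    "curved": np.array([18.0, 11.0, 16.0, 12.0, 14.0]),
}
observed = np.array([13.0, 16.0, 12.0, 21.0, 18.0])
print(decode(observed, templates))  # -> "flat"
```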
Contact events in manipulation can also be represented in the visual and auditory modalities and this enables the brain to simultaneously evaluate sensory predictions in different modalities. Multimodal representations of subgoal events also provide an opportunity for the brain to learn and uphold sensorimotor correlations that can be exploited by action-phase controllers.
A current challenge is to learn how the brain implements the control operations that support object manipulations, such as processes involved in detecting sensory mismatches, triggering corrective actions, and creating, recruiting and linking different action-phase controllers during task progression. The signal processing in somatosensory pathways for dynamic context-specific decoding of tactile afferent messages needs to be better understood.
ISSN: 1471-003X (print), 1471-0048 (electronic)
DOI: 10.1038/nrn2621