The roles of vision and proprioception in spatial tuning of sensory attenuation

Full Description

Saved in:
Bibliographic Details
Published in: Experimental Brain Research, 2025, Vol. 243 (1), p. 42
Main Authors: Fritz, Clara; Bayer, Manuel; Zimmermann, Eckart
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: When we touch ourselves, the pressure appears weaker than when someone else touches us, an effect known as sensory attenuation. Sensory attenuation is spatially tuned and occurs only if the positions of the touching and the touched body part spatially coincide. Here, we ask about the contribution of visual and proprioceptive signals to determining self-touch. Using a 3D arm model in a virtual reality environment, we dissociated the visual from the proprioceptive arm signal. When a virtual arm was visible, indicating self-touch, we found that sensory attenuation generalized across different locations. When no virtual arm was visible, sensory attenuation was strongest when subjects pointed to the position where they felt their arm to be located. We conclude that the spatial tuning of tactile attenuation depends on which signal determines the occurrence of self-touch. When observers can see their hand, the visual signal dominates the proprioceptive one, determining self-touch in a single visual snapshot. When only the proprioceptive signal is available, the positions of the touching and the touched body part must be estimated separately and then compared to determine whether they overlap in anatomical space.
ISSN: 0014-4819
eISSN: 1432-1106
DOI: 10.1007/s00221-024-06982-w