Optical Lace for Synthetic Afferent Neural Networks

Bibliographic Details
Published in: Science Robotics, 2019-09, Vol. 4 (34)
Main authors: Xu, Artemis, Mishra, A K, Bai, H, Aubin, C A, Zullo, L, Shepherd, R F
Format: Article
Language: English
Online access: Full text
Description
Abstract: While vision dominates sensing in robots, animals with limited vision deftly navigate their environment using other forms of perception, such as touch. Efforts have been made to apply artificial skins with tactile sensing to robots to enable similarly sophisticated mobile and manipulative skills. However, the ability to functionally mimic the afferent sensory neural network, required for distributed sensing and communication networks throughout the body, is still missing. This limitation is partially due to the lack of cointegration of the mechanosensors in the body of the robot. In this paper, lacings of stretchable optical fibers distributed throughout 3D-printed elastomer frameworks create a cointegrated body, sensing, and communication network. This soft, functional structure can localize deformation with sub-millimeter positional accuracy (error = 0.71 mm) and sub-Newton force resolution (~0.3 N).
ISSN: 2470-9476
DOI: 10.1126/scirobotics.aaw6304
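
To make the abstract's sensing claim concrete, the following is a minimal, hypothetical sketch (in Python, not taken from the paper) of how deformation might be localized from an array of intensity readings like those produced by an optical-lace sensor: the attenuation at each photodetector weights an estimate of the contact position, and the total attenuation is mapped to force through an assumed linear calibration. The detector positions, intensity values, and the newtons_per_unit_loss constant are illustrative assumptions.

# Minimal sketch: localize a press from photodetector intensity loss.
# Assumptions (not from the paper): each fiber's detector reports an
# intensity that drops under local deformation; detector positions along
# the lace and the force calibration constant are known.
import numpy as np

def localize_press(baseline, readings, positions_mm, newtons_per_unit_loss=1.0):
    """Estimate contact position (mm) and force (N) from intensity loss."""
    loss = np.clip(np.asarray(baseline) - np.asarray(readings), 0.0, None)
    total = loss.sum()
    if total == 0.0:
        return None, 0.0                      # no deformation detected
    # Weighted centroid of attenuation approximates the contact point.
    position = float(np.dot(loss, positions_mm) / total)
    force = float(newtons_per_unit_loss * total)  # assumed linear force model
    return position, force

# Example: a press centered near the detector at 12 mm.
baseline  = [1.00, 1.00, 1.00, 1.00]
readings  = [0.98, 0.80, 0.92, 1.00]
positions = np.array([4.0, 12.0, 20.0, 28.0])
print(localize_press(baseline, readings, positions))
# Prints the estimated position (mm) and force (N), roughly (13.6, 0.3).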