Proprioception and Exteroception of a Soft Robotic Finger Using Neuromorphic Vision-Based Sensing



Bibliographic Details
Published in: Soft Robotics, 2023-06, Vol. 10 (3), pp. 467-481
Main Authors: Faris, Omar; Muthusamy, Rajkumar; Renda, Federico; Hussain, Irfan; Gan, Dongming; Seneviratne, Lakmal; Zweiri, Yahya
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Equipping soft robotic grippers with sensing and perception capabilities faces significant challenges due to their high compliance and flexibility, which limit their ability to interact successfully with the environment. In this work, we propose a sensorized soft robotic finger with an embedded marker pattern that integrates a high-speed neuromorphic event-based camera to enable finger proprioception and exteroception. A learning-based approach involving a convolutional neural network is developed to process event-based heat maps and achieve specific sensing tasks. The feasibility of the sensing approach for proprioception is demonstrated by showing its ability to predict the two-dimensional deformation of three points located on the finger structure, whereas the exteroception capability is assessed in a slip detection task that can classify slip heat maps at a temporal resolution of 2 ms. Our results show that our proposed approach can enable complete sensorization of the finger for both proprioception and exteroception using a single camera without negatively affecting the finger compliance. Using such a sensorized finger in robotic grippers may provide safe, adaptive, and precise grasping for handling a wide category of objects.
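The abstract describes a convolutional neural network that maps event-based heat maps either to the two-dimensional deformation of three points on the finger (proprioception) or to a slip/no-slip label (exteroception). The PyTorch sketch below illustrates one possible shape of such a network; the input resolution (64 x 64), layer widths, and the two-head design are assumptions made for illustration and are not specified in this record.

# Minimal sketch (assumed architecture) of a CNN that maps a neuromorphic
# event heat map to (a) the 2-D deformation of three finger points and
# (b) a slip / no-slip classification, as outlined in the abstract.
# Input size and layer widths are illustrative assumptions, not the paper's.
import torch
import torch.nn as nn


class EventHeatMapNet(nn.Module):
    def __init__(self, in_size: int = 64):
        super().__init__()
        # Shared convolutional backbone over the single-channel heat map.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
        )
        feat_dim = 32 * (in_size // 4) ** 2
        # Proprioception head: (x, y) deformation for three marker points.
        self.deformation_head = nn.Linear(feat_dim, 6)
        # Exteroception head: slip vs. no-slip logits for each 2 ms heat map.
        self.slip_head = nn.Linear(feat_dim, 2)

    def forward(self, heat_map: torch.Tensor):
        feats = self.backbone(heat_map)
        return self.deformation_head(feats), self.slip_head(feats)


# Usage: one 64x64 heat map accumulated from a 2 ms window of camera events.
model = EventHeatMapNet()
deformation, slip_logits = model(torch.rand(1, 1, 64, 64))
print(deformation.shape, slip_logits.shape)  # torch.Size([1, 6]) torch.Size([1, 2])

In this sketch a single shared backbone feeds both output heads; whether the paper uses one network for both tasks or separate networks is not stated in the abstract.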
ISSN: 2169-5172, 2169-5180
DOI: 10.1089/soro.2022.0030