Leveraging Tactile Sensors for Low Latency Embedded Smart Hands for Prosthetic and Robotic Applications
Published in: IEEE Transactions on Instrumentation and Measurement, 2022, Vol. 71, p. 1-14
Main Authors: , , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Tactile sensing is a crucial perception mode for robots and for human amputees who need to control a prosthetic device. Today, robotic and prosthetic systems still lack accurate tactile sensing, mainly because existing tactile technologies have limited spatial and temporal resolution and are either expensive or not scalable. In this article, we present the design and implementation of a hardware-software embedded system called SmartHand. It is specifically designed to enable the acquisition and real-time processing of high-resolution tactile information from a hand-shaped multi-sensor array for prosthetic and robotic applications. During data collection, our system delivers a throughput of 100 frames per second, 13.7× higher than previous related work. This has allowed the collection of a new tactile dataset consisting of 340 000 frames recorded while interacting with 16 everyday objects over five different sessions. Together with the empty hand, the dataset comprises a total of 17 classes. We propose a compact yet accurate convolutional neural network that requires one order of magnitude less memory and 15.6× fewer computations than related work without degrading classification accuracy. The top-1 and top-3 cross-validation accuracies on the collected dataset are 98.86% and 99.83%, respectively. We further analyze the inter-session variability and obtain a best top-3 leave-one-out validation accuracy of 77.84%. We deploy the trained model on a high-performance ARM Cortex-M7 microcontroller, achieving an inference time of only 100 ms and minimizing response latency. The overall measured power consumption is 505 mW. Finally, we fabricate a new control sensor and perform additional experiments analyzing sensor degradation and slip detection. This work is a step forward in giving robotic and prosthetic devices a sense of touch by demonstrating the practicality of a smart embedded system that couples a scalable tactile sensor with embedded tiny machine learning.
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2022.3165828
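
The abstract above describes a compact convolutional neural network that classifies tactile frames into 17 classes (16 everyday objects plus the empty hand) and runs on an ARM Cortex-M7. The following is a minimal PyTorch sketch of a network in that spirit; the single-channel 32×32 input resolution and the specific layer sizes are illustrative assumptions, not the published SmartHand architecture (only the 17-class output comes from the abstract).

```python
# Illustrative sketch only: a small CNN of the general kind the abstract
# describes (low memory footprint, 17-class tactile-frame classifier).
# ASSUMPTIONS: 1x32x32 input frames and these layer widths are chosen for
# illustration; the paper's actual sensor resolution and architecture may differ.
import torch
import torch.nn as nn

class TinyTactileCNN(nn.Module):
    def __init__(self, num_classes: int = 17):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 1x32x32 -> 8x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 8x16x16
            nn.Conv2d(8, 16, kernel_size=3, padding=1),   # -> 16x16x16
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x8x8
        )
        self.classifier = nn.Linear(16 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = TinyTactileCNN()
    frame = torch.randn(1, 1, 32, 32)  # one synthetic tactile frame
    logits = model(frame)
    print(logits.shape)  # torch.Size([1, 17])
    # Parameter count, as a rough proxy for the "order of magnitude less
    # memory" claim in the abstract.
    print(sum(p.numel() for p in model.parameters()))
```

For on-device inference of the kind reported (100 ms on a Cortex-M7 at 505 mW overall), a model like this would typically be quantized to 8-bit integers and executed with an embedded inference library such as CMSIS-NN or TensorFlow Lite for Microcontrollers; the record does not state which toolchain the authors used. Note also that the reported 100 frames/s acquisition corresponds to 10 ms per frame, so with a 100 ms inference the classifier would, if run back to back, process roughly every tenth frame.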