Fabrication and Characterization of a Soft and Stretchable Capacitive Strain Sensor for Hand Gesture Recognition


Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE Sensors Journal, 2024-11, p. 1-1
Main Authors: Tchantchane, Rayane; Zhou, Hao; Zhang, Shen; Alici, Gursel
Format: Article
Language: English
Description
Abstract: In line with recent progress in soft robotics, human-machine interfaces, and wearable sensors, there has been an increasing need for flexible and stretchable strain sensors, especially high-performance, low-cost capacitive strain sensors. Our sensor, based on a multi-walled carbon nanotube/Ecoflex composite, conforms to curved and irregular surfaces to detect and respond to mechanical deformations, including tensile and bending modes, while maintaining exceptional flexibility, stretchability (230%), and comfort without interfering with hand movements. It exhibits low hysteresis (maximum hysteresis error ≤2.5%) and high sensitivity, with a gauge factor of 0.80 at 100% elongation and 0.147 at 90° bending. The sensor also demonstrates durability under cyclic loading, enduring over 1,000 bending cycles. Using machine learning classifiers, including Random Forest, Linear Discriminant Analysis, and Logistic Regression, the strain sensor can recognize various finger angles from five subjects with accuracies of 99%, 98%, and 97%, respectively. These results demonstrate its promise for enhancing human-machine interaction and wearable technology, paving the way for future research on flexible sensing systems capable of real-time, precise gesture recognition.
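To make the sensitivity figures concrete: for a capacitive strain sensor, the gauge factor is commonly defined as the relative capacitance change per unit strain, GF = (ΔC/C0)/ε. The sketch below illustrates that relationship with hypothetical capacitance values; the paper does not report these numbers, only the resulting gauge factor of 0.80 at 100% elongation.

```python
def gauge_factor(c0: float, c: float, strain: float) -> float:
    """Gauge factor of a capacitive strain sensor:
    relative capacitance change (C - C0)/C0 divided by applied strain ε."""
    return ((c - c0) / c0) / strain

# Hypothetical example: an unstrained capacitance of 10 pF and a gauge
# factor of 0.80 imply roughly 18 pF at 100% elongation (ε = 1.0).
c0 = 10.0                        # pF, unstrained capacitance (assumed)
strain = 1.0                     # 100% elongation
c = c0 * (1 + 0.80 * strain)     # capacitance implied by GF = 0.80
print(round(gauge_factor(c0, c, strain), 2))  # → 0.8
```

In practice the measured ΔC would come from a capacitance-to-digital readout; the numbers above only demonstrate the definition used to report sensitivity.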
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2024.3493160