An Ego-Vision System for Hand Grasp Analysis



Bibliographic Details
Published in: IEEE Transactions on Human-Machine Systems, 2017-08, Vol. 47 (4), pp. 524-535
Main Authors: Minjie Cai, Kitani, Kris M., Sato, Yoichi
Format: Article
Language: English
Online Access: Order full text
Description
Summary: This paper presents an egocentric vision (ego-vision) system for hand grasp analysis in unstructured environments. Our goal is to automatically recognize hand grasp types and to discover the visual structures of hand grasps using a wearable camera. In the proposed system, free hand-object interactions are recorded from a first-person viewing perspective. State-of-the-art computer vision techniques are used to detect hands and extract hand-based features. A new feature representation that incorporates hand tracking information is also proposed. Grasp classifiers are then trained to discriminate among different grasp types from a predefined grasp taxonomy. Based on the trained grasp classifiers, visual structures of hand grasps are learned with an iterative grasp clustering method. In experiments, grasp recognition performance is evaluated in both laboratory and real-world scenarios, where the system achieves best classification accuracies of 92% and 59%, respectively. The experiments also verify the system's generality across different tasks and users. Analysis of the real-world scenario shows that it is possible to automatically learn intuitive visual grasp structures that are consistent with expert-designed grasp taxonomies.
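
The abstract describes a pipeline of hand detection, hand-based feature extraction, grasp classification against a predefined taxonomy, and iterative grasp clustering. The sketch below is not the authors' implementation; it is a minimal Python illustration, using scikit-learn, of how a confusion-driven iterative clustering of grasp classes could look. The synthetic features, the plain linear SVM, and the merge threshold are all assumptions standing in for the paper's actual features, classifiers, and stopping criterion.

# Illustrative sketch only (assumed names and parameters, not the paper's code):
# iteratively merge the grasp classes a classifier confuses most, as a stand-in
# for the iterative grasp clustering described in the abstract.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Synthetic stand-in for hand-based features (the paper extracts features from
# detected hand regions and hand tracking information in egocentric video).
n_types, per_type, dim = 8, 60, 64
X = np.vstack([rng.normal(loc=i, scale=2.0, size=(per_type, dim))
               for i in range(n_types)])
y = np.repeat(np.arange(n_types), per_type)

clusters = [[t] for t in range(n_types)]          # start: one cluster per grasp type

def relabel(labels, clusters):
    # Map original grasp-type ids onto current cluster indices.
    lookup = {t: ci for ci, members in enumerate(clusters) for t in members}
    return np.array([lookup[t] for t in labels])

while len(clusters) > 2:
    yc = relabel(y, clusters)
    pred = cross_val_predict(LinearSVC(dual=False), X, yc, cv=5)
    cm = confusion_matrix(yc, pred).astype(float)
    cm /= cm.sum(axis=1, keepdims=True)           # row-normalize to confusion rates
    np.fill_diagonal(cm, 0.0)
    sim = (cm + cm.T) / 2.0                       # symmetric confusion as visual similarity
    i, j = np.unravel_index(np.argmax(sim), sim.shape)
    if sim[i, j] < 0.15:                          # assumed stopping threshold
        break
    clusters[i] += clusters[j]                    # merge the two most-confused clusters
    del clusters[j]

print("Learned grasp clusters (by original type id):", clusters)

The merged clusters play the role of the learned visual grasp structures; the paper's own method operates on trained grasp classifiers over its predefined taxonomy, and the confusion-driven merge above is only one plausible reading of "iterative grasp clustering".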
ISSN: 2168-2291, 2168-2305
DOI: 10.1109/THMS.2017.2681423