Direction-oriented human motion recognition with prior estimation of directions


Bibliographic Details
Main Authors: Eftakhar, S. M. A.; Tan, Joo Kooi; Kim, Hyoungseop; Ishikawa, S.
Format: Conference Proceedings
Language: English
Description
Summary: With the advent of state-of-the-art technologies, the need for man-machine interaction systems is growing steadily. Among the many applications of such systems, one of the most promising in the field of computer vision is the understanding and interpretation of human motion or behavior in a scene. Direction-oriented motion capture of a person performing a task is an important issue in developing a human motion recognition system, since an intelligent system should also incorporate directional information. We propose a direction-oriented motion recognition approach that exploits directional information through prior estimation of the motion direction. This reduces the processing time of the system by excluding unnecessary searches for the most similar motions. In this approach, motions are clustered by direction within the feature space in order to simplify direction estimation. Each motion is converted into individual templates, namely a Motion History Image (MHI) and an Exclusive-OR (XOR) image, by extracting distinguishable features from the video clips containing the motions. A Structured Motion Database (SMoDB) is developed to match an unlabeled motion against the pre-stored motions. Experiments conducted on an Avatar dataset show significant improvements in the results.
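The two template types named in the summary can be sketched briefly. The snippet below is a minimal illustration of the standard MHI recurrence (each moving pixel is set to a timestamp value tau and decays linearly over time) and a silhouette XOR, not the authors' exact implementation; the frame-difference threshold, decay step, and function names are assumptions for illustration.

```python
import numpy as np

def motion_history_image(frames, tau=30, diff_thresh=25):
    """Build a Motion History Image from a sequence of grayscale frames.

    frames: iterable of 2-D uint8 arrays of identical shape.
    tau: temporal window; a pixel fades back to 0 over tau frames.
    diff_thresh: frame-difference threshold (assumed value).
    """
    frames = [np.asarray(f, dtype=np.int16) for f in frames]
    mhi = np.zeros_like(frames[0], dtype=np.float32)
    for prev, cur in zip(frames, frames[1:]):
        # Pixels whose intensity changed enough are treated as moving.
        moving = np.abs(cur - prev) > diff_thresh
        # Moving pixels are stamped with tau; others decay by one step.
        mhi = np.where(moving, float(tau), np.maximum(mhi - 1.0, 0.0))
    return mhi

def xor_image(frame_a, frame_b, thresh=128):
    """Exclusive-OR of two binarized silhouettes (assumed formulation)."""
    a = np.asarray(frame_a) > thresh
    b = np.asarray(frame_b) > thresh
    return np.logical_xor(a, b).astype(np.uint8)
```

In an MHI, recent motion appears bright and older motion dim, so a single image summarizes where and how a motion unfolded; the XOR image instead isolates the regions that differ between two silhouettes.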
ISSN: 1553-572X
DOI: 10.1109/IECON.2011.6120002