Function to Flatten Gesture Data for Specific Feature Selection Methods to Improve Classification

Bibliographic Details

Published in: Traitement du signal 2021-08, Vol. 38 (4), p. 929-935
Authors: Cervantes Salgado, Marilu; Pinto Elias, Raul; Magadan Salazar, Andrea
Format: Article
Language: English
Online access: Full text
Abstract: Gestures are pieces of information characterized by multiple, chronologically linked samples of differing lengths. These characteristics make classifying this type of data a challenging task. We studied the effects of flattening gesture data and propose a function that represents gestures in a flat format while taking into account the sense of temporal evolution they possess. The function's main goal is to compare gestures within a class in order to spot differences. The function is described step by step, and its outcome is then used as input to two feature selection methods (Bayesian network / Markov blanket, and the Logical Combinatorial approach to Pattern Recognition). With the subsets obtained, we then trained Hidden Markov Model classifiers. We found that, when our methodology is applied to gesture data, the attribute subsets obtained through feature selection classify with accuracies of 0.88 and 0.87, against a maximum of 0.90. The maximum accuracy was obtained from an exhaustive classification exercise we performed in order to provide a baseline for comparison. These findings suggest that our methodology can be applied to raw data (gesture data or any chronologically linked data) without the need for experts to transform the data (i.e., feature extraction).
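The abstract describes the methodology only at a high level, so the following is a minimal sketch of the general idea rather than the authors' published function: variable-length, chronologically ordered gesture samples are mapped onto fixed-length flat vectors whose positions preserve temporal order, so that flat-format feature selection can be applied afterwards. The function name `flatten_gesture`, the linear resampling, and the `target_len` parameter are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def flatten_gesture(frames: np.ndarray, target_len: int = 20) -> np.ndarray:
    """Flatten one variable-length gesture into a fixed-length vector.

    frames: array of shape (n_samples, n_features), in chronological order.
    The time axis is resampled to `target_len` steps by linear interpolation,
    and the steps are concatenated so temporal order survives in the flat
    vector. (Illustrative sketch only, not the paper's function.)
    """
    n_samples, n_features = frames.shape
    src = np.linspace(0.0, 1.0, n_samples)   # original time positions
    dst = np.linspace(0.0, 1.0, target_len)  # common, fixed time grid
    resampled = np.column_stack(
        [np.interp(dst, src, frames[:, j]) for j in range(n_features)]
    )
    return resampled.ravel()  # shape: (target_len * n_features,)

# Gestures of different lengths map to flat vectors of equal length.
g1 = np.random.rand(35, 3)  # 35 time steps, 3 features per step
g2 = np.random.rand(50, 3)  # 50 time steps, same feature space
assert flatten_gesture(g1).shape == flatten_gesture(g2).shape == (60,)
```

Once all gestures share one flat shape, each vector position can be treated as a single attribute, which is the form that feature selection methods such as Markov blanket filtering operate on.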
ISSN: 0765-0019, 1958-5608
DOI: 10.18280/ts.380402