SHREC 2024: Recognition of dynamic hand motions molding clay

Bibliographic Details
Published in: Computers & Graphics, 2024-10, Vol. 123, p. 104012, Article 104012
Main Authors: Veldhuijzen, Ben, Veltkamp, Remco C., Ikne, Omar, Allaert, Benjamin, Wannous, Hazem, Emporio, Marco, Giachetti, Andrea, LaViola, Joseph J., He, Ruiwen, Benhabiles, Halim, Cabani, Adnane, Fleury, Anthony, Hammoudi, Karim, Gavalas, Konstantinos, Vlachos, Christoforos, Papanikolaou, Athanasios, Romanelis, Ioannis, Fotis, Vlassis, Arvanitis, Gerasimos, Moustakas, Konstantinos, Hanik, Martin, Nava-Yazdani, Esfandiar, von Tycowicz, Christoph
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Gesture recognition enables novel interactions across a range of techniques and applications, such as Mixed Reality and Virtual Reality environments. Despite recent advances in gesture recognition from skeletal data, it remains unclear how well state-of-the-art techniques perform on precise, two-handed motions. This paper presents the results of the SHREC 2024 contest, organized to evaluate methods for recognizing highly similar hand motions from the skeletal spatial coordinate data of both hands. The task is to recognize 7 motion classes given frame-by-frame spatial coordinates. The skeletal data was captured with a Vicon system and pre-processed into a common coordinate system using Blender and Vicon Shogun Post. We created a small, novel dataset with a wide variety of sequence durations in frames. The paper reports the contest results, presenting the techniques developed by the 5 participating research groups for this challenging task and comparing them to our baseline method.
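
To make the task setup concrete, the following is a minimal sketch, assuming a hypothetical per-sequence array layout of (frames, hands, joints, xyz) and a simple nearest-centroid classifier. The joint count, data shapes, and classifier are illustrative assumptions only, not the contest's actual data format or baseline method.

    # Minimal sketch of the SHREC 2024 task setup: classifying a variable-length
    # sequence of two-hand skeletal coordinates into one of 7 motion classes.
    # Joint count, array shapes, and the nearest-centroid classifier are
    # illustrative assumptions, not the contest's actual format or baseline.
    import numpy as np

    NUM_CLASSES = 7   # motion classes defined by the contest
    NUM_HANDS = 2     # both hands are tracked
    NUM_JOINTS = 21   # assumed joints per hand (typical hand-skeleton layout)

    def summarize(sequence: np.ndarray) -> np.ndarray:
        """Collapse a (frames, hands, joints, 3) array of xyz coordinates into
        a fixed-length feature vector, so sequences with very different
        durations (a noted property of the dataset) become comparable."""
        mean = sequence.mean(axis=0)   # average pose over time
        std = sequence.std(axis=0)     # per-joint motion spread over time
        return np.concatenate([mean.ravel(), std.ravel()])

    def fit_centroids(sequences, labels):
        """Nearest-centroid classifier: one mean feature vector per class."""
        feats = np.stack([summarize(s) for s in sequences])
        labels = np.array(labels)
        return {c: feats[labels == c].mean(axis=0) for c in range(NUM_CLASSES)}

    def predict(centroids, sequence):
        f = summarize(sequence)
        return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic stand-in data: 35 sequences with varying frame counts.
        train = [rng.normal(size=(rng.integers(40, 400), NUM_HANDS, NUM_JOINTS, 3))
                 for _ in range(35)]
        train_labels = [i % NUM_CLASSES for i in range(35)]
        centroids = fit_centroids(train, train_labels)
        print(predict(centroids, train[0]))

The fixed-length summary step is only one way to handle the dataset's wide spread of sequence durations; the methods submitted to the contest would instead use learned temporal models over the frame-by-frame coordinates.
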
ISSN: 0097-8493
DOI: 10.1016/j.cag.2024.104012