American sign language recognition and training method with recurrent neural network


Bibliographic Details
Published in: Expert Systems with Applications, 2021-04, Vol. 167, p. 114403, Article 114403
Main authors: Lee, C.K.M., Ng, Kam K.H., Chen, Chun-Hsien, Lau, H.C.W., Chung, S.Y., Tsoi, Tiffany
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract:
•An American Sign Language recognition model was developed using Leap Motion.
•An LSTM-RNN combined with a kNN method was proposed for recognising the 26 alphabet signs.
•The 3D motion of hand gestures and 30 relevant features were extracted.
•A recognition rate of 99.44% accuracy was obtained for the 26 alphabet signs.
Though American sign language (ASL) has gained recognition from American society, few ASL applications have been developed for educational purposes, and applications designed with real-time sign recognition systems are also lacking. The Leap Motion controller facilitates real-time and accurate recognition of ASL signs, which opens an opportunity to design a learning application with a real-time sign recognition system that seeks to improve the effectiveness of ASL learning. This project proposes an ASL learning application prototype: a whack-a-mole game with an embedded real-time sign recognition system. Since the ASL alphabet contains both static and dynamic signs (J, Z), a Long Short-Term Memory Recurrent Neural Network combined with the k-Nearest-Neighbour method is adopted as the classification method, because it can handle sequences of input. Characteristics such as sphere radius, angles between fingers and distances between finger positions are extracted as input for the classification model. The model is trained with 2600 samples, 100 for each alphabet sign. The experimental results show that recognition of the 26 ASL alphabet signs achieves an average accuracy of 99.44%, and 91.82% under 5-fold cross-validation, using the Leap Motion controller.
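To illustrate the sequence-classification setup described in the abstract, the Python sketch below trains an LSTM classifier over per-frame feature vectors (30 features, 26 letter classes, 2600 samples with 100 per letter, as stated above). It is a minimal sketch only: the sequence length, hidden size, training settings and the random placeholder data are assumptions, the Leap Motion feature extraction is not reproduced, and the paper's additional k-Nearest-Neighbour step is omitted.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 26     # A-Z, including the dynamic signs J and Z
NUM_FEATURES = 30    # e.g. sphere radius, inter-finger angles, fingertip distances
SEQ_LEN = 40         # assumed number of Leap Motion frames per sample

# LSTM over per-frame feature vectors, followed by a softmax over the 26 letters.
model = keras.Sequential([
    layers.Input(shape=(SEQ_LEN, NUM_FEATURES)),
    layers.LSTM(64),                                   # hidden size is an assumption
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 2600 samples, 100 per letter, as stated in the abstract (random placeholder data).
X = np.random.rand(2600, SEQ_LEN, NUM_FEATURES).astype("float32")
y = np.repeat(np.arange(NUM_CLASSES), 100)
perm = np.random.permutation(len(y))                   # shuffle so all letters appear in each split
model.fit(X[perm], y[perm], epochs=5, validation_split=0.2)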
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2020.114403