American sign language recognition and training method with recurrent neural network


Bibliographic Details

Published in: Expert Systems with Applications, 2021-04, Vol. 167, p. 114403, Article 114403
Authors: Lee, C.K.M.; Ng, Kam K.H.; Chen, Chun-Hsien; Lau, H.C.W.; Chung, S.Y.; Tsoi, Tiffany
Format: Article
Language: English
Publisher: Elsevier Ltd, New York
Abstract:
• An American Sign Language (ASL) recognition model was developed using the Leap Motion controller.
• An LSTM-RNN with a kNN method was proposed for recognising the 26 ASL alphabets.
• The 3D motion of hand gestures and 30 relevant features were extracted.
• A recognition rate of 99.44% was obtained for the 26 alphabets.

Though American Sign Language (ASL) has gained recognition from American society, few ASL applications have been developed for educational purposes, and those designed with real-time sign recognition systems are also lacking. The Leap Motion controller facilitates real-time and accurate recognition of ASL signs. It offers an opportunity to design a learning application with a real-time sign recognition system that seeks to improve the effectiveness of ASL learning. This project proposes an ASL learning application prototype: a whack-a-mole game with a real-time sign recognition system embedded. Since ASL alphabets contain both static and dynamic signs (J, Z), a Long Short-Term Memory Recurrent Neural Network with a k-Nearest-Neighbour method is adopted as the classification method, since it can handle sequences of input. Characteristics such as sphere radius, angles between fingers, and distances between finger positions are extracted as input for the classification model. The model is trained with 2600 samples, 100 per alphabet. The experimental results revealed that recognition of the 26 ASL alphabets achieved an average accuracy of 99.44%, and 91.82% under 5-fold cross-validation, with the use of the Leap Motion controller.
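The per-frame features named in the abstract (sphere radius, angles between fingers, distances between finger positions) can be sketched from 3D fingertip coordinates as below. This is a minimal illustration, not the authors' implementation: the fingertip and palm values are made-up placeholders standing in for Leap Motion controller output, and the exact feature definitions are assumptions.

```python
import numpy as np

# Hypothetical 3D fingertip positions for one frame (5 fingers x 3 coords);
# values are illustrative only, not real Leap Motion data.
fingertips = np.array([
    [0.0, 10.0, 0.0],   # thumb
    [2.0, 12.0, 0.5],   # index
    [4.0, 12.5, 0.5],   # middle
    [6.0, 12.0, 0.5],   # ring
    [8.0, 11.0, 0.5],   # little
])
palm = np.array([4.0, 5.0, 0.0])  # hypothetical palm centre

def pairwise_distances(points):
    """Euclidean distance between every pair of fingertip positions."""
    diffs = points[:, None, :] - points[None, :, :]
    return np.linalg.norm(diffs, axis=-1)

def finger_angles(points, origin):
    """Angles (radians) between adjacent finger direction vectors from the palm."""
    vecs = points - origin
    vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    cosines = np.sum(vecs[:-1] * vecs[1:], axis=1)
    return np.arccos(np.clip(cosines, -1.0, 1.0))

def sphere_radius(points, origin):
    """Radius of a palm-centred sphere enclosing the fingertips,
    a rough proxy for hand openness."""
    return np.linalg.norm(points - origin, axis=1).max()

dists = pairwise_distances(fingertips)    # 5x5 symmetric matrix
angles = finger_angles(fingertips, palm)  # 4 adjacent-finger angles
radius = sphere_radius(fingertips, palm)  # scalar

# One per-frame feature vector: 10 unique distances + 4 angles + 1 radius.
# A time-ordered sequence of such vectors would form the LSTM input
# described in the abstract.
features = np.concatenate([dists[np.triu_indices(5, k=1)], angles, [radius]])
print(features.shape)  # → (15,)
```

For dynamic signs such as J and Z, each sample would be a sequence of these 15-dimensional frames rather than a single vector, which is what motivates the sequence-handling LSTM-RNN classifier.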
DOI: 10.1016/j.eswa.2020.114403
ISSN: 0957-4174
EISSN: 1873-6793
Source: Access via ScienceDirect (Elsevier)
Subjects:
Alphabets
American sign language
Classification
Controllers
Leap motion controller
Learning
Learning application
Neural networks
Real time
Recognition
Recurrent neural networks
Sign language
Sign recognition system