Dynamic Gesture Recognition in the Internet of Things

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 23713-23724
Main authors: Li, Gongfa; Wu, Hao; Jiang, Guozhang; Xu, Shuang; Liu, Honghai
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 23724
container_issue
container_start_page 23713
container_title IEEE access
container_volume 7
creator Li, Gongfa
Wu, Hao
Jiang, Guozhang
Xu, Shuang
Liu, Honghai
description Gesture recognition based on computer vision has gradually become a hot research direction in human-computer interaction, and human-computer interaction is in turn an important direction in Internet of Things (IoT) technology, so gesture-based interaction is an area of continuing IoT research. In recent years, Kinect sensor-based methods have been widely used for gesture recognition because they can separate gestures from complex backgrounds, are relatively insensitive to illumination, and can accurately track and locate hand motion. However, Kinect-based recognition of complex gesture movements still needs improvement; in particular, the recognition rate for dynamic gestures remains low, which hinders the development of human-computer interaction under IoT technology. Addressing these problems, this paper analyzes Kinect-based gesture recognition in detail and proposes a dynamic gesture recognition method based on a hidden Markov model (HMM) and Dempster-Shafer (D-S) evidence theory. Building on the standard HMM, the tangent angle of the palm trajectory and the gesture shape at different moments are used as features of the complex motion gesture, the dimensionality of the trajectory tangent features is reduced by quantizing them into a finite set of codes, and the HMM parameters are then trained on the resulting observation sequences. Finally, D-S evidence theory is used to combine the individual judgments, dynamic gesture recognition is carried out, and a better recognition result is obtained, laying a good foundation for human-computer interaction under IoT technology.
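The description above names the two key technical steps only in prose: quantizing palm-trajectory tangent angles into a discrete observation sequence for an HMM, and fusing separate sources of evidence with Dempster's rule from D-S evidence theory. The short Python sketch below illustrates both steps in their generic form; it is a minimal illustration, not the authors' implementation, and the codebook size, the singleton-only evidence frame, and the toy trajectory are assumptions made here for demonstration.

```python
import numpy as np

def tangent_angle_codes(trajectory, n_codes=12):
    """Quantize the tangent angle of a palm trajectory into discrete codewords.

    trajectory : (T, 2) array of palm positions (x, y) over time.
    n_codes    : size of the quantization codebook (an assumed value; the paper
                 reduces the tangent-angle dimension by quantization but the
                 exact codebook size is not given in this record).
    Returns an integer observation sequence usable by a discrete HMM.
    """
    diffs = np.diff(trajectory, axis=0)              # frame-to-frame motion vectors
    angles = np.arctan2(diffs[:, 1], diffs[:, 0])    # tangent angle in (-pi, pi]
    bins = ((angles + np.pi) / (2 * np.pi) * n_codes).astype(int)
    return np.clip(bins, 0, n_codes - 1)             # codeword sequence

def dempster_combine(m1, m2):
    """Combine two basic probability assignments over singleton gesture classes
    using Dempster's rule (singleton-only version, ignoring compound sets)."""
    classes = set(m1) | set(m2)
    conflict = sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)
    k = 1.0 - conflict                               # normalization factor
    if k <= 0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {c: m1.get(c, 0.0) * m2.get(c, 0.0) / k for c in classes}

if __name__ == "__main__":
    # Toy palm trajectory (a rough quarter circle) and two evidence sources,
    # e.g. per-class scores from two recognizers rescaled to mass functions.
    t = np.linspace(0, np.pi / 2, 20)
    palm = np.stack([np.cos(t), np.sin(t)], axis=1)
    print("observation codes:", tangent_angle_codes(palm))

    m_hmm = {"circle": 0.6, "swipe": 0.3, "wave": 0.1}
    m_shape = {"circle": 0.5, "swipe": 0.4, "wave": 0.1}
    print("combined masses:", dempster_combine(m_hmm, m_shape))
```

In a full pipeline along the lines sketched in the abstract, the codeword sequences would be used to train one discrete HMM per gesture class (for example with Baum-Welch), and the per-model likelihoods would supply the mass functions that Dempster's rule then fuses into the final recognition decision.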
doi_str_mv 10.1109/ACCESS.2018.2887223
format Article
fulltext fulltext
identifier ISSN: 2169-3536
ispartof IEEE access, 2019, Vol.7, p.23713-23724
issn 2169-3536
2169-3536
language eng
recordid cdi_ieee_primary_8580553
source IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals
subjects Combinatorial analysis
Computer vision
D-S evidence theory
Dynamics
Feature extraction
Gesture recognition
hidden Markov model (HMM)
Hidden Markov models
Human-computer interaction
Internet of Things
Internet of Things (IoT)
Tracking
Trajectories
Trajectory
title Dynamic Gesture Recognition in the Internet of Things