3D human hand motion recognition system
The results of design and investigation of a human gesture recognition system based on a Kinect sensor are presented in this paper. In the presented research, the Kinect device is used as a 3D data scanner; the 3D coordinates are therefore calculated directly from depth images. The system's hardware description and the computation method for 3D human gesture identification are presented in this study. Ten specific single-hand motion gestures, repeated several times by seven different people, were recorded and used in the experimentation. Gesture recognition and interpretation are performed with a trained neural classifier in two ways: in the first, single-hand motion gestures are captured in free 3D space, while in the second, the person's 3D head coordinates are used as reference points for the recorded hand gestures. This approach provides easy adaptation and flexibility for gesture interpretation. The structure of the classifier was estimated through a trial-and-error approach.
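The record itself contains no code, but a minimal sketch of the approach the abstract describes (3D hand coordinates taken from Kinect depth data, normalized against the person's head position, and fed to a trained neural classifier) might look as follows in Python. Everything here, from the function name and the fixed-length resampling step to the scikit-learn MLP and its layer size, is an assumption for illustration rather than the authors' implementation; the paper's actual classifier structure was found by trial and error and is not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier


def head_relative_features(hand_xyz, head_xyz, n_samples=20):
    """Turn a recorded 3D hand trajectory into a fixed-length feature vector,
    expressed relative to the tracked head position (hypothetical helper)."""
    hand_xyz = np.asarray(hand_xyz, dtype=float)           # (T, 3) hand coordinates from depth data
    rel = hand_xyz - np.asarray(head_xyz, dtype=float)     # use the head as the reference point
    # Resample the trajectory to n_samples points so every gesture, regardless of
    # duration, produces an input vector of the same length for the classifier.
    idx = np.linspace(0, len(rel) - 1, n_samples)
    resampled = np.stack(
        [np.interp(idx, np.arange(len(rel)), rel[:, d]) for d in range(3)], axis=1
    )
    return resampled.ravel()                                # shape: (n_samples * 3,)


# Toy usage with synthetic trajectories standing in for the recorded gestures:
# ten gesture classes and seven "people", mirroring the experiment's setup.
rng = np.random.default_rng(0)
X, y = [], []
for label in range(10):
    for _ in range(7):
        trajectory = rng.normal(size=(35, 3)).cumsum(axis=0)   # fake hand path
        head = rng.normal(size=3)
        X.append(head_relative_features(trajectory, head))
        y.append(label)

# A small MLP stands in for the paper's neural classifier; the layer size here
# is an arbitrary placeholder, not the topology reported in the paper.
clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000, random_state=0)
clf.fit(np.array(X), np.array(y))
```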
Saved in:
Main authors: | Rimkus, Kestas; Bukis, Audrius; Lipnickas, Arunas; Sinkevicius, Saulius |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | 3D gesture recognition; Artificial neural networks; Biological neural networks; Cameras; Gesture recognition; Kinect; neural network; Robot sensing systems; Training |
Online access: | Order full text |
container_end_page | 183 |
---|---|
container_issue | |
container_start_page | 180 |
container_title | 2013 6th International Conference on Human System Interactions (HSI) |
container_volume | |
creator | Rimkus, Kestas; Bukis, Audrius; Lipnickas, Arunas; Sinkevicius, Saulius |
description | The results of design and investigation of a human gesture recognition system based on a Kinect sensor are presented in this paper. In the presented research, the Kinect device is used as a 3D data scanner; the 3D coordinates are therefore calculated directly from depth images. The system's hardware description and the computation method for 3D human gesture identification are presented in this study. Ten specific single-hand motion gestures, repeated several times by seven different people, were recorded and used in the experimentation. Gesture recognition and interpretation are performed with a trained neural classifier in two ways: in the first, single-hand motion gestures are captured in free 3D space, while in the second, the person's 3D head coordinates are used as reference points for the recorded hand gestures. This approach provides easy adaptation and flexibility for gesture interpretation. The structure of the classifier was estimated through a trial-and-error approach. |
doi_str_mv | 10.1109/HSI.2013.6577820 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2158-2246 |
ispartof | 2013 6th International Conference on Human System Interactions (HSI), 2013, p.180-183 |
issn | 2158-2246 |
language | eng |
recordid | cdi_ieee_primary_6577820 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | 3D gesture recognition; Artificial neural networks; Biological neural networks; Cameras; Gesture recognition; Kinect; neural network; Robot sensing systems; Training |
title | 3D human hand motion recognition system |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-18T04%3A48%3A30IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=3D%20human%20hand%20motion%20recognition%20system&rft.btitle=2013%206th%20International%20Conference%20on%20Human%20System%20Interactions%20(HSI)&rft.au=Rimkus,%20Kestas&rft.date=2013-06&rft.spage=180&rft.epage=183&rft.pages=180-183&rft.issn=2158-2246&rft.isbn=9781467356350&rft.isbn_list=1467356352&rft_id=info:doi/10.1109/HSI.2013.6577820&rft_dat=%3Cieee_6IE%3E6577820%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&rft.eisbn=9781467356367&rft.eisbn_list=1467356379&rft.eisbn_list=1467356360&rft.eisbn_list=9781467356374&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=6577820&rfr_iscdi=true |