A Combination Model of Shifting Joint Angle Changes With 3D-Deep Convolutional Neural Network to Recognize Human Activity
Saved in:
Published in: | IEEE transactions on neural systems and rehabilitation engineering 2024, Vol.32, p.1078-1089 |
---|---|
Main authors: | Rahayu, Endang Sri; Yuniarno, Eko Mulyanto; Purnama, I. Ketut Eddy; Purnomo, Mauridhi Hery |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 1089 |
---|---|
container_issue | |
container_start_page | 1078 |
container_title | IEEE transactions on neural systems and rehabilitation engineering |
container_volume | 32 |
creator | Rahayu, Endang Sri; Yuniarno, Eko Mulyanto; Purnama, I. Ketut Eddy; Purnomo, Mauridhi Hery |
description | Research in human activity recognition is of considerable interest because of its potential applications, for example in medical rehabilitation. Advancing the field has become increasingly necessary to enable efficient detection of, and response to, a wide range of movements. Current recognition methods rely on calculating changes in joint distance to classify activity patterns, so a different approach is required to identify the direction of movement and thereby distinguish activities with similar joint distance changes but different motion directions, such as sitting and standing. This study determines the direction of movement using a novel joint angle shift approach: by analyzing the change in joint angle between specific joints and reference points across the sequence of activity frames, variations in the direction of an activity can be detected. The joint angle shift method was combined with a Deep Convolutional Neural Network (DCNN) model to classify 3D datasets containing spatial-temporal information from RGB-D video data. Model performance was evaluated using the confusion matrix. The model successfully classified nine activities in the Florence 3D Actions dataset, including sitting and standing, with an accuracy of (96.72 ± 0.83)%. To evaluate its robustness, the model was also tested on the UTKinect Action3D dataset, where it achieved an accuracy of 97.44%, demonstrating state-of-the-art performance. (An illustrative sketch of the joint angle shift computation follows the record fields below.) |
doi_str_mv | 10.1109/TNSRE.2024.3371474 |
format | Article |
publisher | IEEE, United States |
pmid | 38421841 |
eissn | 1558-0210 |
coden | ITNSB3 |
pages | 12 |
orcid | 0000-0002-7438-7880; 0000-0003-1243-3025; 0000-0002-6221-7382 |
rights | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
oa | free_for_read |
fulltext | fulltext |
identifier | ISSN: 1534-4320 |
ispartof | IEEE transactions on neural systems and rehabilitation engineering, 2024, Vol.32, p.1078-1089 |
issn | 1534-4320; 1558-0210 |
language | eng |
recordid | cdi_ieee_primary_10453601 |
source | MEDLINE; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals |
subjects | Activity patterns; Artificial neural networks; Classification; Combination model; Convolutional neural networks; Data models; Datasets; deep convolutional neural network; Deep Learning; Hidden Markov models; Human Activities; Human activity recognition; Human motion; Humans; Joints; Motion; Movement; Neural networks; Neural Networks, Computer; Performance evaluation; Postal services; shifting joint angles; Spatiotemporal phenomena; Three dimensional models; Three-dimensional displays |
title | A Combination Model of Shifting Joint Angle Changes With 3D-Deep Convolutional Neural Network to Recognize Human Activity |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-14T00%3A43%3A49IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20Combination%20Model%20of%20Shifting%20Joint%20Angle%20Changes%20With%203D-Deep%20Convolutional%20Neural%20Network%20to%20Recognize%20Human%20Activity&rft.jtitle=IEEE%20transactions%20on%20neural%20systems%20and%20rehabilitation%20engineering&rft.au=Rahayu,%20Endang%20Sri&rft.date=2024&rft.volume=32&rft.spage=1078&rft.epage=1089&rft.pages=1078-1089&rft.issn=1534-4320&rft.eissn=1558-0210&rft.coden=ITNSB3&rft_id=info:doi/10.1109/TNSRE.2024.3371474&rft_dat=%3Cproquest_ieee_%3E2956384131%3C/proquest_ieee_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2956384131&rft_id=info:pmid/38421841&rft_ieee_id=10453601&rft_doaj_id=oai_doaj_org_article_0cb5af27a9fa4464b05bec9c2f099848&rfr_iscdi=true |
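The abstract above describes detecting the direction of movement from frame-to-frame changes in joint angles, which are then combined with a 3D DCNN classifier. The following minimal sketch illustrates how such a joint angle shift could be computed from a 3D skeleton sequence. It is not the authors' published implementation; the joint indices, array shapes, and function names are assumptions chosen for illustration only.

```python
# Minimal sketch of a joint angle shift computation (illustrative only; joint
# indices and array shapes are assumptions, not the datasets' actual layout).
import numpy as np

def joint_angle(frame, ref, a, b):
    """Angle (radians) at joint `ref` formed by the segments to joints `a` and `b`.

    `frame` is a (num_joints, 3) array of 3D joint coordinates for one frame.
    """
    v1 = frame[a] - frame[ref]
    v2 = frame[b] - frame[ref]
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def joint_angle_shift(sequence, ref, a, b):
    """Signed frame-to-frame change of the angle at `ref` across a sequence.

    `sequence` is a (num_frames, num_joints, 3) array; the result contains
    num_frames - 1 values whose sign indicates the direction of movement.
    """
    angles = np.array([joint_angle(frame, ref, a, b) for frame in sequence])
    return np.diff(angles)

# Toy usage: a random 30-frame sequence with 15 joints in 3D.
sequence = np.random.rand(30, 15, 3)
HIP, KNEE, ANKLE = 0, 1, 2        # hypothetical joint indices
shift = joint_angle_shift(sequence, ref=KNEE, a=HIP, b=ANKLE)
print(shift.shape)                # (29,): one signed angle change per frame pair
```

The sign of each element of `shift` distinguishes opposite movement directions at a joint (an angle that opens over time versus one that closes), which is the cue the abstract describes for separating activities with similar joint-distance changes, such as sitting down and standing up.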