Deep ensemble learning approach for lower extremity activities recognition using wearable sensors

Bibliographic Details
Published in: Expert Systems, 2022-07, Vol. 39 (6), p. n/a
Main authors: Jain, Rahul; Semwal, Vijay Bhaskar; Kaushik, Praveen
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page n/a
container_issue 6
container_start_page
container_title Expert systems
container_volume 39
creator Jain, Rahul
Semwal, Vijay Bhaskar
Kaushik, Praveen
description Human walking is a challenging task that requires rigorous practice. It is a learning process involving complex coordination between the brain and the lower limbs. Bipedal robots that mimic the human morphological structure to reproduce human-like walking are still not capable of producing an efficient walk; owing to these walking challenges and structural differences, a robot cannot walk like a human being. In this research, working toward human-like robot walking, human lower-extremity activities are studied to understand walking behaviour. The experiment involves different walking styles on different terrains. To capture the learning process of bipedal robot locomotion, a deep learning-based ensemble classifier is introduced for human lower-extremity activity recognition, and seven different walking activities are considered for analysis. An inertial measurement unit (IMU) is used as the wearable device, owing to its small form factor and unobtrusive nature, to capture the movement of different lower-limb joints during walking. Three public datasets, namely the mHealth, OU-ISIR similar-action and HAPT inertial sensor datasets, are considered for this study. To classify the activities, two different deep learning models, a convolutional neural network (CNN) and a long short-term memory (LSTM) network, are used, and an ensemble of these classifiers is implemented to generalize the results. The ensemble classifier reported accuracies of 99.25%, 88.48% and 97.44%, respectively, on the aforementioned datasets. This work can be utilized for assessing postural stability in elderly subjects, rehabilitation of patients after stroke and trauma, generation of robot walking trajectories in cluttered environments, and reconstruction of impaired walking.
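The following is a minimal, illustrative sketch of the kind of CNN + LSTM soft-voting ensemble the abstract describes for IMU-based activity recognition. The window length, channel count, class count, layer sizes and the probability-averaging voting scheme are assumptions made for demonstration, not details taken from the paper; the random stand-in data would be replaced by windowed IMU segments from the mHealth, OU-ISIR and HAPT datasets.

```python
# Sketch of a CNN + LSTM soft-voting ensemble for IMU activity recognition.
# All hyperparameters below are illustrative assumptions, not values from the paper.
import numpy as np
from tensorflow.keras import layers, models

WINDOW = 128      # assumed samples per sliding window
CHANNELS = 6      # e.g., 3-axis accelerometer + 3-axis gyroscope
N_CLASSES = 7     # seven walking-related activities

def build_cnn():
    """1-D CNN over a window of IMU samples."""
    return models.Sequential([
        layers.Input(shape=(WINDOW, CHANNELS)),
        layers.Conv1D(64, 5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, 5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

def build_lstm():
    """LSTM over the same windowed IMU sequence."""
    return models.Sequential([
        layers.Input(shape=(WINDOW, CHANNELS)),
        layers.LSTM(64),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

def ensemble_predict(members, x):
    """Soft voting: average the class probabilities of all ensemble members."""
    probs = np.mean([m.predict(x, verbose=0) for m in members], axis=0)
    return probs.argmax(axis=1)

if __name__ == "__main__":
    # Random stand-in data; replace with preprocessed, windowed IMU segments.
    x_train = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
    y_train = np.random.randint(0, N_CLASSES, size=256)

    members = [build_cnn(), build_lstm()]
    for m in members:
        m.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
        m.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)

    print(ensemble_predict(members, x_train[:5]))
```

Averaging per-class probabilities (soft voting) is one common way to combine heterogeneous members such as a CNN and an LSTM; the combination rule used in the paper itself may differ.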
doi_str_mv 10.1111/exsy.12743
format Article
fulltext fulltext
identifier ISSN: 0266-4720
ispartof Expert systems, 2022-07, Vol.39 (6), p.n/a
issn 0266-4720
1468-0394
language eng
recordid cdi_proquest_journals_2679755742
source Business Source Complete; Wiley Online Library All Journals
subjects Artificial neural networks
bipedal robots
Classifiers
CNN
Datasets
Deep learning
edge computing
Ensemble learning
Form factors
human activity recognition (HAR)
IMU
Inertial platforms
Inertial sensing devices
internet of things (IoT)
Locomotion
LSTM
Recognition
Rehabilitation
Robot dynamics
robot walk generation
Robots
tinyML
Walking
wearable sensors
Wearable technology
title Deep ensemble learning approach for lower extremity activities recognition using wearable sensors
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-09T21%3A28%3A12IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Deep%20ensemble%20learning%20approach%20for%20lower%20extremity%20activities%20recognition%20using%20wearable%20sensors&rft.jtitle=Expert%20systems&rft.au=Jain,%20Rahul&rft.date=2022-07&rft.volume=39&rft.issue=6&rft.epage=n/a&rft.issn=0266-4720&rft.eissn=1468-0394&rft_id=info:doi/10.1111/exsy.12743&rft_dat=%3Cproquest_cross%3E2679755742%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2679755742&rft_id=info:pmid/&rfr_iscdi=true