Deep Human Activity Recognition with Localisation of Wearable Sensors

Automatic recognition of human activities using wearable sensors remains a challenging problem due to high variability in inter-person gait and movements. Moreover, finding the best on-body location for a wearable sensor is also critical, as it provides valuable context information that can be used for accurate recognition. This paper addresses the problem of classifying motion signals generated by multiple wearable sensors for the recognition of human activity and the localisation of the wearable sensors. Unlike existing methods that use the raw accelerometer and gyroscope signals to extract time- and frequency-based features for activity inference, we propose to create frequency images from the raw signals and show this representation to be more robust. The frequency image sequences are generated from the accelerometer and gyroscope signals from seven different body parts. These frequency images serve as the input to our proposed two-stream Convolutional Neural Network (CNN) for predicting the human activity and the location of the sensor generating the activity signal. We show that the complementary information collected by both accelerometer and gyroscope sensors can be leveraged to develop an effective classifier that accurately predicts the performed human activity. We evaluate the performance of the proposed method using a cross-subject approach and show that it achieves an F1-score of 0.90 on a publicly available real-world human activity dataset, superior to that reported by another state-of-the-art method on the same dataset. Moreover, we also experimented with the datasets from different body locations to predict the best position for the underlying task. We show that the shin and waist are the best body locations for placing sensors; this finding could help other researchers collect higher-quality activity data. We plan to publicly release the generated frequency images from all sensor positions and activities, along with our implementation code, with the publication.
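
The abstract does not spell out how the frequency images are computed from the raw signals. Below is a minimal sketch of one plausible approach, assuming a per-axis short-time Fourier transform (spectrogram) over fixed-length windows; the function name, the 50 Hz sampling rate, and the window/overlap parameters are illustrative assumptions, not the paper's actual settings.

```python
# Hypothetical sketch: turning a window of raw inertial signals into a
# "frequency image" (a spectrogram-like 2-D representation). The paper does
# not specify its exact transform; all parameters here are assumptions.
import numpy as np
from scipy.signal import spectrogram

def frequency_image(window: np.ndarray, fs: float = 50.0) -> np.ndarray:
    """Convert a (samples, 3) accelerometer or gyroscope window into a
    2-D frequency image by stacking per-axis spectrograms."""
    images = []
    for axis in range(window.shape[1]):
        # Short-time Fourier analysis of one signal axis.
        _, _, sxx = spectrogram(window[:, axis], fs=fs, nperseg=64, noverlap=32)
        images.append(np.log1p(sxx))        # log scale compresses dynamic range
    return np.concatenate(images, axis=0)   # stack axes vertically into one image

# Example: 5 s of tri-axial accelerometer data sampled at 50 Hz.
acc_window = np.random.randn(250, 3)
img = frequency_image(acc_window)
print(img.shape)  # e.g. (99, 6): frequency bins x time bins
```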

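Likewise, the two-stream CNN is only named in the abstract. The sketch below illustrates the general idea under stated assumptions: one convolutional stream per modality (accelerometer and gyroscope frequency images), concatenated features, and two output heads for the activity class and the on-body sensor location (seven locations, matching the seven body parts mentioned above). Layer sizes, class names, and the number of activity classes are hypothetical, not the architecture reported in the paper.

```python
# Hypothetical two-stream CNN sketch in PyTorch: one stream per sensor
# modality, fused features, and two classification heads (activity and
# sensor location). Not the authors' reported architecture.
import torch
import torch.nn as nn

class Stream(nn.Module):
    """Small convolutional feature extractor for one frequency-image modality."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

class TwoStreamCNN(nn.Module):
    def __init__(self, n_activities: int = 8, n_locations: int = 7):
        super().__init__()
        self.acc_stream = Stream()   # accelerometer frequency images
        self.gyr_stream = Stream()   # gyroscope frequency images
        self.activity_head = nn.Linear(64, n_activities)
        self.location_head = nn.Linear(64, n_locations)

    def forward(self, acc_img, gyr_img):
        # Concatenate the per-stream features, then predict both targets.
        feats = torch.cat([self.acc_stream(acc_img), self.gyr_stream(gyr_img)], dim=1)
        return self.activity_head(feats), self.location_head(feats)

model = TwoStreamCNN()
acc = torch.randn(4, 1, 99, 6)   # batch of accelerometer frequency images
gyr = torch.randn(4, 1, 99, 6)   # batch of gyroscope frequency images
activity_logits, location_logits = model(acc, gyr)
```
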
Bibliographic Details
Published in: IEEE Access, 2020-01, Vol. 8, p. 1-1
Main Authors: Lawal, Isah A.; Bano, Sophia
Format: Article
Language: English
Online Access: Full text
DOI: 10.1109/ACCESS.2020.3017681
ISSN: 2169-3536
EISSN: 2169-3536
Publisher: IEEE, Piscataway
Source: DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals; IEEE Xplore Open Access Journals

Subjects:
Accelerometers
Artificial neural networks
Body parts
Datasets
Deep learning
Feature extraction
Human activity recognition
Human performance
Localization
Moving object recognition
Performance evaluation
Sensor localisation
Sensors
Signal classification
Wearable sensors
Wearable technology