An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors


Detailed description

Bibliographic details
Published in: International journal of computers & applications 2023-04, Vol.45 (4), p.323-336
Main authors: Djemili, Rafik, Zamouche, Merouane
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 336
container_issue 4
container_start_page 323
container_title International journal of computers & applications
container_volume 45
creator Djemili, Rafik
Zamouche, Merouane
description Human activity recognition (HAR) has recently witnessed outstanding growth in health and entertainment applications. Owing to the availability of smartphones, many new methods and protocols that use the data from smartphones' embedded sensors are emerging. Nonetheless, the methods published in the literature leave wide room for improvement in terms of accuracy, resource economy, and robustness to real-world nuisances. This paper therefore proposes a more economical and efficient classification method that combines 1D convolutional neural network (1D-CNN) features and handcrafted temporal and frequency features with a multilayer perceptron (MLP) classifier. The proposed method requires only tri-axial accelerometer data, allowing it to be deployed even on lower-end devices; it was tested on two well-known benchmark datasets, UCI-HAR and UniMiB SHAR. Experimental results yield a classification accuracy exceeding 99%, outperforming many methods recently reported in the literature.
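The hybrid idea the abstract describes can be sketched as follows. This is NOT the authors' implementation: the kernel count, layer sizes, random weights, and the particular handcrafted statistics are assumptions chosen only to illustrate concatenating learned 1D-convolution features with temporal/frequency features before an MLP classifier.

```python
# Illustrative sketch: learned 1D-conv features + handcrafted time/frequency
# features from a tri-axial accelerometer window, scored by a small MLP.
import numpy as np

rng = np.random.default_rng(0)

def conv1d_features(window, kernels):
    """Valid-mode 1D convolution per axis, global-max-pooled per kernel."""
    feats = []
    for axis_signal in window:          # window: (3, T) tri-axial samples
        for k in kernels:
            resp = np.convolve(axis_signal, k, mode="valid")
            feats.append(resp.max())    # global max pooling
    return np.array(feats)

def handcrafted_features(window):
    """Simple temporal and spectral statistics per axis."""
    feats = []
    for axis_signal in window:
        spectrum = np.abs(np.fft.rfft(axis_signal))
        feats += [axis_signal.mean(),   # temporal: mean and standard deviation
                  axis_signal.std(),
                  spectrum.sum()]       # frequency: total spectral magnitude
    return np.array(feats)

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden ReLU layer, softmax output over activity classes."""
    h = np.maximum(0.0, x @ w1 + b1)
    logits = h @ w2 + b2
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Toy 128-sample window of tri-axial accelerometer data (random stand-in).
window = rng.standard_normal((3, 128))
kernels = [rng.standard_normal(9) for _ in range(4)]  # 4 "learned" filters (assumed)

# Concatenate both feature families: 3 axes * 4 kernels + 3 axes * 3 stats = 21.
x = np.concatenate([conv1d_features(window, kernels),
                    handcrafted_features(window)])

n_classes = 6                           # e.g. the six UCI-HAR activities
w1 = rng.standard_normal((x.size, 16)); b1 = np.zeros(16)
w2 = rng.standard_normal((16, n_classes)); b2 = np.zeros(n_classes)

probs = mlp_forward(x, w1, b1, w2, b2)
print(probs.shape, float(probs.sum()))
```

In the paper the convolution kernels and MLP weights would be trained jointly on labeled windows; here they are random placeholders so the pipeline shape is visible end to end.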
doi_str_mv 10.1080/1206212X.2023.2198785
format Article
fullrecord <record><control><sourceid>proquest_cross</sourceid><recordid>TN_cdi_proquest_journals_2817872534</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><sourcerecordid>2817872534</sourcerecordid><originalsourceid>FETCH-LOGICAL-c253t-96b19f3a5137387bcc5fb55105a2b8f196309592597a3decb0036f73027cdf223</originalsourceid><addsrcrecordid>eNp9kMtKAzEUhoMoWKuPIARcT82lmWR2luINCm4U3IVMJmlTpsmYZJS-vSmtW1fnLP7LOR8AtxjNMBLoHhNUE0w-ZwQROiO4EVywMzDBDWEVR3x-XvaiqQ6iS3CV0hahOSe1mIDtwkNjrdPO-Aw7YwbYGxW98-uqVcl0UA1DDEpvoA0Rbsad8lDp7L5d3sNodFh7l13wcEzFA9NOxTxsgjfQeROzUz1MxqcQ0zW4sKpP5uY0p-Dj6fF9-VKt3p5fl4tVpQmjuWrqFjeWKoYpp4K3WjPbMoYRU6QVFjc1RQ0rrzVc0c7oFiFaW04R4bqzhNApuDvmlru_RpOy3IYx-lIpicBc8FIzLyp2VOkYUorGyiG6cvxeYiQPWOUfVnnAKk9Yi-_h6HO-ANmpnxD7Tma170O0UXntkqT_R_wCsJ6AJw</addsrcrecordid><sourcetype>Aggregation Database</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>2817872534</pqid></control><display><type>article</type><title>An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors</title><source>Taylor &amp; Francis Journals Complete</source><creator>Djemili, Rafik ; Zamouche, Merouane</creator><creatorcontrib>Djemili, Rafik ; Zamouche, Merouane</creatorcontrib><description>Human activity recognition (HAR) has recently witnessed outstanding growth in health and entertainment applications. Owing to the availability of smartphones, many new methods and protocols for using the data from smartphones' embedded sensors are emerging. Nonetheless, the methods carried out and published in the literature leave a wide area for improvement, in terms of accuracy, resource economy, and adaptation to real-world nuisances. 
On top of that, a novel classification method that is more economical and efficient is proposed in this paper using both 1D convolutional neural network (1D-CNN) parameters and handcrafted temporal and frequency features with the proficiency of a multilayer perceptron neural network (MLP) classifier. The method proposed requires only tri-axial accelerometer data, allowing it to be deployed even into lower equipment devices; it was tested within the two well-known benchmark datasets: UCI-HAR and Uni-MIB SHAR. Experimental results yield a classification accuracy exceeding 99%, outperforming many of the methods recently shown in the literature.</description><identifier>ISSN: 1206-212X</identifier><identifier>EISSN: 1925-7074</identifier><identifier>DOI: 10.1080/1206212X.2023.2198785</identifier><language>eng</language><publisher>Calgary: Taylor &amp; Francis</publisher><subject>Accelerometers ; Artificial neural networks ; Classification ; convolutional neural network (CNN) ; deep learnin ; Deep learning ; eatures ; Embedded sensors ; handcrafted features ; Human activity recognition ; Human activity recognition (HAR) ; Inertial sensing devices ; inertial signals ; Machine learning ; Multilayer perceptrons ; Neural networks ; smartphone accelerometers ; Smartphones</subject><ispartof>International journal of computers &amp; applications, 2023-04, Vol.45 (4), p.323-336</ispartof><rights>2023 Informa UK Limited, trading as Taylor &amp; Francis Group 2023</rights><rights>2023 Informa UK Limited, trading as Taylor &amp; Francis 
Group</rights><lds50>peer_reviewed</lds50><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c253t-96b19f3a5137387bcc5fb55105a2b8f196309592597a3decb0036f73027cdf223</citedby><cites>FETCH-LOGICAL-c253t-96b19f3a5137387bcc5fb55105a2b8f196309592597a3decb0036f73027cdf223</cites><orcidid>0000-0003-3476-567X</orcidid></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><linktopdf>$$Uhttps://www.tandfonline.com/doi/pdf/10.1080/1206212X.2023.2198785$$EPDF$$P50$$Ginformaworld$$H</linktopdf><linktohtml>$$Uhttps://www.tandfonline.com/doi/full/10.1080/1206212X.2023.2198785$$EHTML$$P50$$Ginformaworld$$H</linktohtml><link.rule.ids>314,780,784,27922,27923,59645,60434</link.rule.ids></links><search><creatorcontrib>Djemili, Rafik</creatorcontrib><creatorcontrib>Zamouche, Merouane</creatorcontrib><title>An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors</title><title>International journal of computers &amp; applications</title><description>Human activity recognition (HAR) has recently witnessed outstanding growth in health and entertainment applications. Owing to the availability of smartphones, many new methods and protocols for using the data from smartphones' embedded sensors are emerging. Nonetheless, the methods carried out and published in the literature leave a wide area for improvement, in terms of accuracy, resource economy, and adaptation to real-world nuisances. On top of that, a novel classification method that is more economical and efficient is proposed in this paper using both 1D convolutional neural network (1D-CNN) parameters and handcrafted temporal and frequency features with the proficiency of a multilayer perceptron neural network (MLP) classifier. 
The method proposed requires only tri-axial accelerometer data, allowing it to be deployed even into lower equipment devices; it was tested within the two well-known benchmark datasets: UCI-HAR and Uni-MIB SHAR. Experimental results yield a classification accuracy exceeding 99%, outperforming many of the methods recently shown in the literature.</description><subject>Accelerometers</subject><subject>Artificial neural networks</subject><subject>Classification</subject><subject>convolutional neural network (CNN)</subject><subject>deep learnin</subject><subject>Deep learning</subject><subject>eatures</subject><subject>Embedded sensors</subject><subject>handcrafted features</subject><subject>Human activity recognition</subject><subject>Human activity recognition (HAR)</subject><subject>Inertial sensing devices</subject><subject>inertial signals</subject><subject>Machine learning</subject><subject>Multilayer perceptrons</subject><subject>Neural networks</subject><subject>smartphone accelerometers</subject><subject>Smartphones</subject><issn>1206-212X</issn><issn>1925-7074</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2023</creationdate><recordtype>article</recordtype><recordid>eNp9kMtKAzEUhoMoWKuPIARcT82lmWR2luINCm4U3IVMJmlTpsmYZJS-vSmtW1fnLP7LOR8AtxjNMBLoHhNUE0w-ZwQROiO4EVywMzDBDWEVR3x-XvaiqQ6iS3CV0hahOSe1mIDtwkNjrdPO-Aw7YwbYGxW98-uqVcl0UA1DDEpvoA0Rbsad8lDp7L5d3sNodFh7l13wcEzFA9NOxTxsgjfQeROzUz1MxqcQ0zW4sKpP5uY0p-Dj6fF9-VKt3p5fl4tVpQmjuWrqFjeWKoYpp4K3WjPbMoYRU6QVFjc1RQ0rrzVc0c7oFiFaW04R4bqzhNApuDvmlru_RpOy3IYx-lIpicBc8FIzLyp2VOkYUorGyiG6cvxeYiQPWOUfVnnAKk9Yi-_h6HO-ANmpnxD7Tma170O0UXntkqT_R_wCsJ6AJw</recordid><startdate>20230403</startdate><enddate>20230403</enddate><creator>Djemili, Rafik</creator><creator>Zamouche, Merouane</creator><general>Taylor &amp; Francis</general><general>Taylor &amp; Francis 
Ltd</general><scope>AAYXX</scope><scope>CITATION</scope><scope>7SC</scope><scope>8FD</scope><scope>JQ2</scope><scope>L7M</scope><scope>L~C</scope><scope>L~D</scope><orcidid>https://orcid.org/0000-0003-3476-567X</orcidid></search><sort><creationdate>20230403</creationdate><title>An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors</title><author>Djemili, Rafik ; Zamouche, Merouane</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c253t-96b19f3a5137387bcc5fb55105a2b8f196309592597a3decb0036f73027cdf223</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2023</creationdate><topic>Accelerometers</topic><topic>Artificial neural networks</topic><topic>Classification</topic><topic>convolutional neural network (CNN)</topic><topic>deep learnin</topic><topic>Deep learning</topic><topic>eatures</topic><topic>Embedded sensors</topic><topic>handcrafted features</topic><topic>Human activity recognition</topic><topic>Human activity recognition (HAR)</topic><topic>Inertial sensing devices</topic><topic>inertial signals</topic><topic>Machine learning</topic><topic>Multilayer perceptrons</topic><topic>Neural networks</topic><topic>smartphone accelerometers</topic><topic>Smartphones</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Djemili, Rafik</creatorcontrib><creatorcontrib>Zamouche, Merouane</creatorcontrib><collection>CrossRef</collection><collection>Computer and Information Systems Abstracts</collection><collection>Technology Research Database</collection><collection>ProQuest Computer Science Collection</collection><collection>Advanced Technologies Database with Aerospace</collection><collection>Computer and Information Systems Abstracts – Academic</collection><collection>Computer and Information Systems Abstracts Professional</collection><jtitle>International journal of computers &amp; 
applications</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Djemili, Rafik</au><au>Zamouche, Merouane</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><atitle>An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors</atitle><jtitle>International journal of computers &amp; applications</jtitle><date>2023-04-03</date><risdate>2023</risdate><volume>45</volume><issue>4</issue><spage>323</spage><epage>336</epage><pages>323-336</pages><issn>1206-212X</issn><eissn>1925-7074</eissn><abstract>Human activity recognition (HAR) has recently witnessed outstanding growth in health and entertainment applications. Owing to the availability of smartphones, many new methods and protocols for using the data from smartphones' embedded sensors are emerging. Nonetheless, the methods carried out and published in the literature leave a wide area for improvement, in terms of accuracy, resource economy, and adaptation to real-world nuisances. On top of that, a novel classification method that is more economical and efficient is proposed in this paper using both 1D convolutional neural network (1D-CNN) parameters and handcrafted temporal and frequency features with the proficiency of a multilayer perceptron neural network (MLP) classifier. The method proposed requires only tri-axial accelerometer data, allowing it to be deployed even into lower equipment devices; it was tested within the two well-known benchmark datasets: UCI-HAR and Uni-MIB SHAR. Experimental results yield a classification accuracy exceeding 99%, outperforming many of the methods recently shown in the literature.</abstract><cop>Calgary</cop><pub>Taylor &amp; Francis</pub><doi>10.1080/1206212X.2023.2198785</doi><tpages>14</tpages><orcidid>https://orcid.org/0000-0003-3476-567X</orcidid></addata></record>
fulltext fulltext
identifier ISSN: 1206-212X
ispartof International journal of computers & applications, 2023-04, Vol.45 (4), p.323-336
issn 1206-212X
1925-7074
language eng
recordid cdi_proquest_journals_2817872534
source Taylor & Francis Journals Complete
subjects Accelerometers
Artificial neural networks
Classification
convolutional neural network (CNN)
deep learning
Deep learning
features
Embedded sensors
handcrafted features
Human activity recognition
Human activity recognition (HAR)
Inertial sensing devices
inertial signals
Machine learning
Multilayer perceptrons
Neural networks
smartphone accelerometers
Smartphones
title An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-14T02%3A55%3A20IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=An%20efficient%20deep%20learning-based%20approach%20for%20human%20activity%20recognition%20using%20smartphone%20inertial%20sensors&rft.jtitle=International%20journal%20of%20computers%20&%20applications&rft.au=Djemili,%20Rafik&rft.date=2023-04-03&rft.volume=45&rft.issue=4&rft.spage=323&rft.epage=336&rft.pages=323-336&rft.issn=1206-212X&rft.eissn=1925-7074&rft_id=info:doi/10.1080/1206212X.2023.2198785&rft_dat=%3Cproquest_cross%3E2817872534%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2817872534&rft_id=info:pmid/&rfr_iscdi=true