A Human Activity Recognition Method Based on Lightweight Feature Extraction Combined With Pruned and Quantized CNN for Wearable Device
Human Activity Recognition (HAR) is becoming an essential part of human life care. Existing HAR methods are usually developed using a two-level approach, wherein a first-level Machine Learning (ML) classifier distinguishes static from dynamic activities, followed by a second-level classifier that identifies the specific activity. These approaches are not suitable for wearable devices due to their high computational and memory consumption. Our rigorous analysis of various HAR datasets opens up a new possibility: static and dynamic activities can be distinguished by a simple statistical technique. We therefore propose a statistical feature extraction technique to replace the first-level ML classifier, achieving more lightweight computation. Next, we employ Random Forest (RF) and Convolutional Neural Network (CNN) classifiers for the specific activities, achieving higher accuracy than state-of-the-art results. We further reduce the computation and memory consumption of this combined approach by applying pruning and quantization to the CNN (PQ-CNN). Experimental results show that the proposed lightweight HAR method achieves F1 scores of 0.9417 and 0.9438 on unbalanced and balanced datasets, respectively. Besides being lightweight and accurate, the proposed HAR method is practical for wearable devices because it uses a single accelerometer.
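The static/dynamic discrimination described in the abstract lends itself to a short illustration. A minimal sketch follows, assuming the statistic is the standard deviation of the accelerometer magnitude over a fixed window; the window length and the 0.05 g threshold are illustrative guesses, since the record does not disclose the authors' exact statistic or parameters.

```python
import numpy as np

def is_dynamic(window: np.ndarray, threshold: float = 0.05) -> bool:
    """Classify a window of 3-axis accelerometer samples as dynamic (moving)
    or static (still).

    `window` is an (N, 3) array of readings in g. The statistic used here
    (std. dev. of the acceleration magnitude) and the 0.05 g threshold are
    assumptions for illustration, not the paper's parameters.
    """
    magnitude = np.linalg.norm(window, axis=1)  # per-sample magnitude
    return float(magnitude.std()) > threshold   # high variance => movement

# Toy usage: a still window vs. a window with motion-like noise.
rng = np.random.default_rng(0)
gravity = np.tile([0.0, 0.0, 1.0], (128, 1))
still = gravity + rng.normal(0.0, 0.005, (128, 3))
moving = gravity + rng.normal(0.0, 0.2, (128, 3))
print(is_dynamic(still), is_dynamic(moving))    # False True
```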
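The pruning and quantization step (PQ-CNN) can be sketched in a similarly hedged way. The sketch below uses the TensorFlow Model Optimization toolkit with a hypothetical 1-D CNN over accelerometer windows; the 50% sparsity target and dynamic-range quantization are stand-ins, as the record does not specify the authors' exact recipe.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

def build_cnn(window_len=128, channels=3, num_classes=6):
    # Hypothetical 1-D CNN over accelerometer windows; the authors'
    # architecture may differ.
    return tf.keras.Sequential([
        tf.keras.layers.Conv1D(32, 5, activation="relu",
                               input_shape=(window_len, channels)),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

# 1) Magnitude pruning: wrap the model so that the smallest 50% of
#    weights are zeroed out while fine-tuning.
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    build_cnn(),
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0),
)
pruned.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... fine-tune `pruned` on the HAR training data here, passing the
# tfmot.sparsity.keras.UpdatePruningStep() callback to fit() ...

# 2) Strip the pruning wrappers, then quantize during TFLite conversion
#    (dynamic-range quantization: weights stored as 8-bit integers).
final = tfmot.sparsity.keras.strip_pruning(pruned)
converter = tf.lite.TFLiteConverter.from_keras_model(final)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("pq_cnn.tflite", "wb") as f:
    f.write(converter.convert())
```

Magnitude pruning zeroes the smallest weights during fine-tuning, and the TFLite conversion with `Optimize.DEFAULT` stores the remaining weights in 8 bits, which is where most of the memory saving would come from.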
Saved in:
Published in: | IEEE Transactions on Consumer Electronics, 2023-08, Vol. 69 (3), p. 657-670 |
---|---|
Main authors: | Yi, Myung-Kyu; Lee, Wai-Kong; Hwang, Seong Oun |
Format: | Article |
Language: | English |
Subjects: | Accelerometers; Accuracy; Artificial neural networks; Biomedical monitoring; Classifiers; Computation; Consumption; Convolutional neural networks; Datasets; Deep learning; Feature extraction; Human activity recognition; Lightweight; Machine learning; Support vector machines; Wearable computers; Wearable sensors; Wearable technology |
Online access: | Order full text |
DOI: | 10.1109/TCE.2023.3266506 |
ISSN: | 0098-3063 |
EISSN: | 1558-4127 |
Source: | IEEE Electronic Library (IEL) |