DFTerNet: Towards 2-bit Dynamic Fusion Networks for Accurate Human Activity Recognition
Deep convolutional neural networks (DCNNs) are currently popular in human activity recognition (HAR) applications. However, for modern sensor-based artificial intelligence games, many research achievements cannot be practically applied on portable devices (e.g., smartphones, VR/AR devices). DCNNs...
Saved in:
Published in: | IEEE Access, 2018, Vol. 6, p. 56750-56764 |
---|---|
Main authors: | Yang, Zhan; Raymond, Osolo Ian; Zhang, Chengyuan; Wan, Ying; Long, Jun |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 56764 |
---|---|
container_issue | |
container_start_page | 56750 |
container_title | IEEE access |
container_volume | 6 |
creator | Yang, Zhan; Raymond, Osolo Ian; Zhang, Chengyuan; Wan, Ying; Long, Jun |
description | Deep convolutional neural networks (DCNNs) are currently popular in human activity recognition (HAR) applications. However, for modern sensor-based artificial intelligence games, many research achievements cannot be practically applied on portable devices (e.g., smartphones, VR/AR devices). DCNNs are typically resource-intensive and too large to be deployed on portable devices, which limits the practical application of complex activity detection. In addition, since portable devices do not possess high-performance graphics processing units, there is hardly any improvement in the Action Game (ACT) experience. Moreover, to handle multi-sensor collaboration, all previous HAR models typically treated the representations from different sensor signal sources equally, yet distinct types of activities should adopt different fusion strategies. In this paper, a novel scheme is proposed for training 2-bit CNNs with weights and activations constrained to {−0.5, 0, 0.5}, taking into account the correlation between different sensor signal sources and the activity types. This model, which we refer to as DFTerNet, aims at producing more reliable inference and better trade-offs for practical applications. Quantization of weights and activations can substantially reduce memory size and replace floating-point or matrix operations with more efficient bitwise operations, yielding much faster computation and lower power consumption. Our basic idea is to apply quantization of weights and activations directly to pre-trained filter banks and to adopt dynamic fusion strategies for different activity types. Experiments demonstrate that by using a dynamic fusion strategy, it is possible to exceed the baseline model performance by up to ~5% on activity recognition data sets such as OPPORTUNITY and PAMAP2. Using the proposed quantization method, we were able to achieve performance close to that of the full-precision counterpart; these results were also verified on the UniMiB-SHAR data set. In addition, the proposed method can achieve ~9× acceleration on CPUs and ~11× memory savings. (A minimal illustrative sketch of the ternary quantization and weighted fusion appears after the record fields below.) |
doi_str_mv | 10.1109/ACCESS.2018.2873315 |
format | Article |
fullrecord | Publisher: Piscataway: IEEE; Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018; CODEN: IAECCG; EISSN: 2169-3536; IEEE document ID: 8478282; DOAJ record ID: oai_doaj_org_article_9ef3e86584fa452782bb411e431cfd08; ORCID: 0000-0002-6336-0228 |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2018, Vol.6, p.56750-56764 |
issn | 2169-3536 2169-3536 |
language | eng |
recordid | cdi_proquest_journals_2455924295 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals |
subjects | 2-bit neural networks; Activity recognition; Artificial intelligence; Artificial neural networks; Computational modeling; Convolutional neural networks; Datasets; dynamic fusion strategy; Filter banks; Games; Human activity recognition; Measurement; Memory management; Moving object recognition; Portable equipment; Power consumption; Quantization (signal); Sensors |
title | DFTerNet: Towards 2-bit Dynamic Fusion Networks for Accurate Human Activity Recognition |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-13T09%3A53%3A13IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=DFTerNet:%20Towards%202-bit%20Dynamic%20Fusion%20Networks%20for%20Accurate%20Human%20Activity%20Recognition&rft.jtitle=IEEE%20access&rft.au=Yang,%20Zhan&rft.date=2018&rft.volume=6&rft.spage=56750&rft.epage=56764&rft.pages=56750-56764&rft.issn=2169-3536&rft.eissn=2169-3536&rft.coden=IAECCG&rft_id=info:doi/10.1109/ACCESS.2018.2873315&rft_dat=%3Cproquest_ieee_%3E2455924295%3C/proquest_ieee_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2455924295&rft_id=info:pmid/&rft_ieee_id=8478282&rft_doaj_id=oai_doaj_org_article_9ef3e86584fa452782bb411e431cfd08&rfr_iscdi=true |
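The two key ideas in the description above, constraining weights and activations to the set {−0.5, 0, 0.5} and fusing per-sensor representations with activity-dependent weights, can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration: the threshold rule, tensor shapes, sensor names, and fusion coefficients below are assumptions for demonstration and are not taken from the paper.

```python
import numpy as np

def ternary_quantize(w, delta_factor=0.7):
    """Quantize a full-precision tensor to the set {-0.5, 0.0, 0.5}.

    The zero band (delta_factor * mean absolute weight) is a common
    heuristic used here only for illustration, not the paper's rule.
    """
    delta = delta_factor * np.mean(np.abs(w))
    q = np.zeros_like(w)
    q[w > delta] = 0.5      # strong positive weights map to +0.5
    q[w < -delta] = -0.5    # strong negative weights map to -0.5
    return q                # everything near zero stays 0.0

def dynamic_fusion(features, weights):
    """Fuse per-sensor feature vectors with normalized fusion weights.

    In a dynamic scheme the weights would depend on the activity type;
    here they are fixed constants purely for demonstration.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()          # normalize to sum to 1
    stacked = np.stack(features, axis=0)       # (num_sensors, feat_dim)
    return np.tensordot(weights, stacked, axes=1)

# Toy usage with made-up shapes and coefficients.
rng = np.random.default_rng(0)
w_fp = rng.normal(scale=0.1, size=(64, 3, 5))  # hypothetical pre-trained filters
print(np.unique(ternary_quantize(w_fp)))       # -> [-0.5  0.   0.5]

acc_feat = rng.normal(size=16)                 # hypothetical accelerometer features
gyro_feat = rng.normal(size=16)                # hypothetical gyroscope features
fused = dynamic_fusion([acc_feat, gyro_feat], weights=[0.7, 0.3])
print(fused.shape)                             # -> (16,)
```

In such a ternary setting each weight needs only 2 bits of storage, which is consistent with the memory savings the abstract reports; the fusion coefficients stand in for the activity-dependent weights that a dynamic fusion strategy would select per activity type.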