Pedestrian Navigation Method Based on Machine Learning and Gait Feature Assistance
In recent years, as the mechanical structure of humanoid robots increasingly resembles the human form, research on pedestrian navigation technology has become of great significance for the development of humanoid robot navigation systems. To solve the problem that the wearable inertial navigation sy...
Saved in:
Published in: | Sensors (Basel, Switzerland), 2020-03, Vol.20 (5), p.1530 |
---|---|
Main authors: | Zhou, Zijun; Yang, Shuqin; Ni, Zhisen; Qian, Weixing; Gu, Cuihong; Cao, Zekun |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 5 |
container_start_page | 1530 |
container_title | Sensors (Basel, Switzerland) |
container_volume | 20 |
creator | Zhou, Zijun; Yang, Shuqin; Ni, Zhisen; Qian, Weixing; Gu, Cuihong; Cao, Zekun |
description | In recent years, as the mechanical structure of humanoid robots increasingly resembles the human form, research on pedestrian navigation technology has become of great significance for the development of humanoid robot navigation systems. To solve the problem that the wearable inertial navigation system based on micro-inertial measurement units (MIMUs) installed on feet cannot effectively realize its positioning function when the body movement is too drastic to be measured correctly by commercial grade inertial sensors, a pedestrian navigation method based on construction of a virtual inertial measurement unit (VIMU) and gait feature assistance is proposed. The inertial data from different positions of pedestrians' lower limbs are collected synchronously via actual IMUs as training samples. The nonlinear mapping relationship between inertial information from the human foot and leg is established by a visual geometry group-long short term memory (VGG-LSTM) neural network model, based on which the foot VIMU and virtual inertial navigation system (VINS) are constructed. The VINS experimental results show that, combined with zero-velocity update (ZUPT), the integrated method of error modification proposed in this paper can effectively reduce the accumulation of positioning errors in situations where the gait type exceeds the measurement range of the inertial sensors. The positioning performance of the proposed method is more accurate and stable in complex gait types than that merely using ZUPT. |
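The zero-velocity update (ZUPT) named in the abstract is a standard correction in foot-mounted inertial navigation: when the foot is detected to be in the stance phase, the strapdown-integrated velocity is known to be zero, so the accumulated drift can be reset. A minimal sketch of that idea follows; the function names and threshold values are illustrative assumptions, not taken from the paper.

```python
import math

def detect_zero_velocity(gyro, accel, gyro_thresh=0.3, accel_thresh=0.5, g=9.81):
    """Flag samples where the foot is likely in the stance (stationary) phase.

    gyro  : list of (x, y, z) angular rates in rad/s
    accel : list of (x, y, z) specific forces in m/s^2
    A sample counts as stationary when the angular-rate magnitude is small
    and the acceleration magnitude is close to gravity. The thresholds are
    illustrative placeholders, not values from the paper.
    """
    flags = []
    for w, a in zip(gyro, accel):
        w_mag = math.sqrt(sum(c * c for c in w))
        a_mag = math.sqrt(sum(c * c for c in a))
        flags.append(w_mag < gyro_thresh and abs(a_mag - g) < accel_thresh)
    return flags

def zupt_correct(velocity, stationary):
    """Clamp the integrated velocity to zero during detected stance phases,
    resetting the drift that accumulates while double-integrating IMU data."""
    return [(0.0, 0.0, 0.0) if s else v for v, s in zip(velocity, stationary)]
```

The paper's contribution is to keep this correction usable when real foot-mounted sensors saturate, by substituting a learned virtual IMU (the VGG-LSTM mapping from leg to foot inertial data) as the input to the ZUPT-corrected navigation solution.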
doi_str_mv | 10.3390/s20051530 |
format | Article |
fullrecord | Zhou, Zijun; Yang, Shuqin; Ni, Zhisen; Qian, Weixing; Gu, Cuihong; Cao, Zekun. "Pedestrian Navigation Method Based on Machine Learning and Gait Feature Assistance." Sensors (Basel, Switzerland), 2020-03-10, Vol. 20 (5), p. 1530. Publisher: MDPI AG, Switzerland. ISSN/EISSN: 1424-8220. DOI: 10.3390/s20051530. PMID: 32164287. Peer reviewed; open access (free for read). Rights: 2020 by the authors; licensed under http://creativecommons.org/licenses/by/3.0/. |
fulltext | fulltext |
identifier | ISSN: 1424-8220 |
ispartof | Sensors (Basel, Switzerland), 2020-03, Vol.20 (5), p.1530 |
issn | 1424-8220 (EISSN: 1424-8220) |
language | eng |
recordid | cdi_crossref_primary_10_3390_s20051530 |
source | MEDLINE; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; MDPI - Multidisciplinary Digital Publishing Institute; PubMed Central; Free Full-Text Journals in Chemistry |
subjects | Acceleration; Algorithms; Biomechanical Phenomena; Feet; Foot - physiology; Gait; gait feature assistance; gait phase recognition; Global positioning systems; GPS; Humanoid; Humans; Inertial coordinates; Inertial navigation; Inertial platforms; Kinematics; Machine Learning; Mapping; Monitoring, Ambulatory - instrumentation; Monitoring, Ambulatory - methods; Motion; Navigation systems; Neural networks; Neural Networks, Computer; pedestrian navigation; Pedestrians; Reproducibility of Results; Robotics; Sensors; Velocity; virtual inertial navigation system; Walking; Wearable Electronic Devices |
title | Pedestrian Navigation Method Based on Machine Learning and Gait Feature Assistance |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-06T04%3A35%3A26IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_doaj_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Pedestrian%20Navigation%20Method%20Based%20on%20Machine%20Learning%20and%20Gait%20Feature%20Assistance&rft.jtitle=Sensors%20(Basel,%20Switzerland)&rft.au=Zhou,%20Zijun&rft.date=2020-03-10&rft.volume=20&rft.issue=5&rft.spage=1530&rft.pages=1530-&rft.issn=1424-8220&rft.eissn=1424-8220&rft_id=info:doi/10.3390/s20051530&rft_dat=%3Cproquest_doaj_%3E2377330456%3C/proquest_doaj_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2377015677&rft_id=info:pmid/32164287&rft_doaj_id=oai_doaj_org_article_da5d63a726fe4f0abd06247d60e33b34&rfr_iscdi=true |