Multiposture leg tracking for temporarily vision restricted environments based on fusion of laser and radar sensor data

Leg tracking is an established field in mobile robotics and machine vision in general. These algorithms, however, only distinguish the scene between leg and nonleg detections. In application fields like firefighting, where people tend to choose squatting or crouching over standing postures, those methods will inevitably fail. Further, tracking based on a single sensor system may reduce the overall reliability if brought to outdoor or complex environments with limited vision on the target objectives. Therefore, we extend our recent work to a multiposture detection system based on laser and radar sensors that are fused to allow for maximal reliability and accuracy in scenarios as complex as indoor firefighting with vastly limited vision. The proposed tracking pipeline is trained and extensively validated on a new data set. We show that the radar tracker reaches state-of-the-art performance, and that the laser and fusion trackers outperform recent methods.
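The subject terms of this record list "adaptive Kalman filter" and "sensor fusion". As an illustration of the general fusion idea only (not the paper's actual pipeline), a scalar Kalman measurement update that blends a precise laser reading with a coarser radar reading can be sketched as follows; all noise variances and measurement values are invented for the sketch:

```python
# Scalar sketch: fuse one laser and one radar range reading with a
# standard Kalman measurement update. All numbers are illustrative
# assumptions, not values taken from the paper.

def kalman_update(x, p, z, r):
    """One Kalman measurement update for a scalar state.
    x: state estimate, p: state variance,
    z: measurement, r: measurement noise variance."""
    k = p / (p + r)          # Kalman gain: trust measurement vs. prior
    x_new = x + k * (z - x)  # corrected estimate
    p_new = (1 - k) * p      # reduced uncertainty after the update
    return x_new, p_new

x, p = 0.0, 1.0              # vague prior on the person's range (m)

# Laser is precise but fails in smoke; radar is coarse but robust,
# so each sensor gets its own noise variance.
x, p = kalman_update(x, p, z=2.00, r=0.02)   # laser reading
x, p = kalman_update(x, p, z=2.15, r=0.20)   # radar reading

print(round(x, 3), round(p, 4))  # → 1.978 0.0179
```

After both updates the fused estimate sits near the precise laser reading but is nudged by the radar, and the posterior variance is smaller than after either single update, which is the reliability argument for fusing the two sensors.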

Detailed Description

Saved in:
Bibliographic Details
Published in: Journal of field robotics 2023-09, Vol.40 (6), p.1620-1638
Main authors: Mandischer, Nils; Hou, Ruikun; Corves, Burkhard
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Leg tracking is an established field in mobile robotics and machine vision in general. These algorithms, however, only distinguish the scene between leg and nonleg detections. In application fields like firefighting, where people tend to choose squatting or crouching over standing postures, those methods will inevitably fail. Further, tracking based on a single sensor system may reduce the overall reliability if brought to outdoor or complex environments with limited vision on the target objectives. Therefore, we extend our recent work to a multiposture detection system based on laser and radar sensors that are fused to allow for maximal reliability and accuracy in scenarios as complex as indoor firefighting with vastly limited vision. The proposed tracking pipeline is trained and extensively validated on a new data set. We show that the radar tracker reaches state-of-the-art performance, and that the laser and fusion trackers outperform recent methods.
Publisher: Wiley Subscription Services, Inc (Hoboken)
Rights: 2023 The Authors. Published by Wiley Periodicals LLC under a CC BY-NC 4.0 license (http://creativecommons.org/licenses/by-nc/4.0/).
DOI: 10.1002/rob.22195
ISSN: 1556-4959
eISSN: 1556-4967
Source: Wiley Online Library Journals Frontfile Complete
Subjects:
adaptive Kalman filter
Algorithms
Fire fighting
laser
leg tracking
Machine vision
multiposture
radar
Radar tracking
Reliability
rescue robotics
Robotics
sensor fusion