Fusion of Time-of-Flight Based Sensors with Monocular Cameras for a Robotic Person Follower

Bibliographic Details

Published in: Journal of Intelligent & Robotic Systems, 2024-03, Vol. 110 (1), p. 30, Article 30
Main authors: Sarmento, José; Neves dos Santos, Filipe; Silva Aguiar, André; Filipe, Vítor; Valente, António
Format: Article
Language: English
Online access: Full text
Description:

Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industry and agriculture. This type of collaboration can contribute to increased productivity by reducing physical strain on humans, which can lead to fewer injuries and improved morale. One crucial aspect of HRC is the ability of the robot to follow a specific human operator safely. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the relative position of a human target with respect to the robot. UWB transceivers can track humans but exhibit a significant angular error. To reduce this error, monocular cameras with Deep Learning object detection are used to detect humans. The reduction in angular error is achieved through sensor fusion, combining the outputs of both sensors with a histogram-based filter that projects and intersects the measurements from both sources onto a 2D grid. By combining UWB and monocular vision, a 66.67% reduction in angular error compared to UWB localization alone is achieved. The approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.
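
The fusion step described above, projecting each sensor's measurement onto a shared 2D grid and intersecting the two likelihoods, can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the grid extent and resolution, the Gaussian sensor models, and the noise parameters (sigma_r, sigma_b) are all assumptions made for the example.

# Minimal sketch of a histogram-based fusion filter, assuming (hypothetically)
# that the UWB gives a range + bearing with large angular noise and the camera
# gives a bearing-only measurement with small angular noise.
import numpy as np

# 2D grid over the area in front of the robot (robot at the origin), in meters.
xs = np.linspace(-5.0, 5.0, 200)
ys = np.linspace(0.0, 10.0, 200)
X, Y = np.meshgrid(xs, ys)
R = np.hypot(X, Y)      # range of each grid cell from the robot
B = np.arctan2(X, Y)    # bearing of each cell (0 rad = straight ahead)

def uwb_likelihood(r_meas, b_meas, sigma_r=0.10, sigma_b=np.radians(20)):
    # UWB range/bearing fix: tight in range, wide in bearing.
    return (np.exp(-0.5 * ((R - r_meas) / sigma_r) ** 2)
            * np.exp(-0.5 * ((B - b_meas) / sigma_b) ** 2))

def camera_likelihood(b_meas, sigma_b=np.radians(3)):
    # Camera detection: bearing only, but angularly precise.
    return np.exp(-0.5 * ((B - b_meas) / sigma_b) ** 2)

def fuse(r_uwb, b_uwb, b_cam):
    # Project both measurements onto the grid and intersect them.
    grid = uwb_likelihood(r_uwb, b_uwb) * camera_likelihood(b_cam)
    grid /= grid.sum()  # normalize to a discrete posterior
    i, j = np.unravel_index(np.argmax(grid), grid.shape)
    return X[i, j], Y[i, j]  # fused (x, y) estimate of the person

# Example: UWB reports 3 m at roughly 15 deg (noisy); camera reports ~5 deg.
x, y = fuse(3.0, np.radians(15), np.radians(5))
print(f"fused position: x={x:.2f} m, y={y:.2f} m")

Multiplying the two grids implements the intersection: cells retain high probability only where the UWB range annulus and the camera bearing ray overlap, which is why the fused bearing inherits the camera's lower angular error while keeping the UWB's range information.
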
DOI: 10.1007/s10846-023-02037-4
Publisher: Springer Netherlands, Dordrecht
ISSN: 0921-0296
EISSN: 1573-0409
Source: Springer Nature OA Free Journals; Alma/SFX Local Collection; SpringerLink Journals - AutoHoldings
Subjects:
Algorithms
Artificial Intelligence
Cameras
Collaboration
Control
Cooperation
Electrical Engineering
Engineering
Error reduction
Injury prevention
Localization
Measuring instruments
Mechanical Engineering
Mechatronics
Monocular vision
Morale
Multisensor fusion
Object recognition
Regular Paper
Robotics
Robots
Sensors
Tracking
Ultrawideband
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-09T01%3A32%3A27IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Fusion%20of%20Time-of-Flight%20Based%20Sensors%20with%20Monocular%20Cameras%20for%20a%20Robotic%20Person%20Follower&rft.jtitle=Journal%20of%20intelligent%20&%20robotic%20systems&rft.au=Sarmento,%20Jos%C3%A9&rft.date=2024-03-01&rft.volume=110&rft.issue=1&rft.spage=30&rft.pages=30-&rft.artnum=30&rft.issn=0921-0296&rft.eissn=1573-0409&rft_id=info:doi/10.1007/s10846-023-02037-4&rft_dat=%3Cproquest_cross%3E2921593620%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2921593620&rft_id=info:pmid/&rfr_iscdi=true