Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments

RGB-D cameras provide both color images and per-pixel depth estimates. The richness of this data and the recent development of low-cost sensors have combined to present an attractive opportunity for mobile robotics research. In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. By leveraging results from recent state-of-the-art algorithms and hardware, our system enables 3D flight in cluttered environments using only onboard sensor data. All computation and sensing required for local position control are performed onboard the vehicle, reducing the dependence on an unreliable wireless link to a ground station. However, even with accurate 3D sensing and position estimation, some parts of the environment have more perceptual structure than others, leading to state estimates that vary in accuracy across the environment. If the vehicle plans a path without regard to how well it can localize itself along that path, it runs the risk of becoming lost or worse. We show how the belief roadmap algorithm (Prentice and Roy, 2009), a belief space extension of the probabilistic roadmap algorithm, can be used to plan vehicle trajectories that incorporate the sensing model of the RGB-D camera. We evaluate the effectiveness of our system for controlling a quadrotor micro air vehicle, demonstrate its use for constructing detailed 3D maps of an indoor environment, and discuss its limitations.
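
The planning idea described in the abstract, propagating localization uncertainty through a roadmap so the vehicle avoids feature-poor regions, can be illustrated with a small sketch. The Python example below is not the authors' implementation: the graph, noise magnitudes, per-node sensing-information values, and all function names are illustrative assumptions, and a scalar information term stands in for the paper's full RGB-D sensing model.

```python
import heapq
import itertools
import numpy as np

# Hypothetical sketch in the spirit of the belief roadmap (BRM): rather than
# minimizing path length alone, propagate the vehicle's pose covariance along
# each candidate path and prefer routes that keep it well localized. The
# graph, noise magnitudes, and per-node "sensing information" values below
# are illustrative assumptions, not parameters from the paper.

def propagate(cov, motion_noise, sensing_info):
    """One EKF-style step: covariance grows with motion noise, then shrinks
    according to how much information the sensor provides at the next node."""
    predicted = cov + motion_noise * np.eye(2)                   # prediction (identity dynamics)
    info = np.linalg.inv(predicted) + sensing_info * np.eye(2)   # measurement update, information form
    return np.linalg.inv(info)

def best_path(graph, sensing_info, start, goal, motion_noise=0.05):
    """Dijkstra-like search whose cost is the trace of the propagated
    covariance, i.e. the expected localization uncertainty at each node."""
    tie = itertools.count()                  # tiebreaker so the heap never compares arrays
    init_cov = 0.01 * np.eye(2)
    frontier = [(np.trace(init_cov), next(tie), start, init_cov, [start])]
    settled = {}
    while frontier:
        cost, _, node, cov, path = heapq.heappop(frontier)
        if node == goal:
            return path, cov
        if settled.get(node, float("inf")) <= cost:
            continue
        settled[node] = cost
        for nxt in graph[node]:
            new_cov = propagate(cov, motion_noise, sensing_info[nxt])
            heapq.heappush(frontier, (np.trace(new_cov), next(tie), nxt, new_cov, path + [nxt]))
    return None, None

# Toy roadmap: node "b" is feature-poor (little perceptual structure), so the
# planner detours through "c" even though both routes have the same hop count.
graph = {"a": ["b", "c"], "b": ["goal"], "c": ["goal"], "goal": []}
sensing_info = {"a": 5.0, "b": 0.1, "c": 5.0, "goal": 5.0}
path, final_cov = best_path(graph, sensing_info, "a", "goal")
print(path, np.trace(final_cov))   # expected: ['a', 'c', 'goal'] with a smaller trace
```

The actual system plans in a richer belief space and uses the full RGB-D measurement model, but the qualitative behavior, avoiding routes along which the state estimate would degrade, is what the abstract describes.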

Bibliographic details
Published in: The International Journal of Robotics Research, 2012-09, Vol. 31 (11), pp. 1320-1343
Authors: Bachrach, Abraham; Prentice, Samuel; He, Ruijie; Henry, Peter; Huang, Albert S; Krainin, Michael; Maturana, Daniel; Fox, Dieter; Roy, Nicholas
Format: Article
Language: English
Subjects: Algorithms; Autonomous; Cameras; Detection; Estimates; Global positioning systems; GPS; Mapping; Onboard; Robotics; Sensors; Three dimensional; Three dimensional imaging; Vehicles; Vision systems
Online access: Full text
DOI: 10.1177/0278364912455256
Publisher: SAGE Publications, London, England
ISSN: 0278-3649
EISSN: 1741-3176