Vision and Learning for Deliberative Monocular Cluttered Flight
Cameras provide a rich source of information while being passive, cheap and lightweight for small and medium Unmanned Aerial Vehicles (UAVs). In this work we present the first implementation of receding horizon control, which is widely used in ground vehicles, with monocular vision as the only sensing mode for autonomous UAV flight in dense clutter. We make it feasible on UAVs via a number of contributions: novel coupling of perception and control via relevant and diverse, multiple interpretations of the scene around the robot, leveraging recent advances in machine learning to showcase anytime budgeted cost-sensitive feature selection, and fast non-linear regression for monocular depth prediction. We empirically demonstrate the efficacy of our novel pipeline via real world experiments of more than 2 kms through dense trees with a quadrotor built from off-the-shelf parts. Moreover our pipeline is designed to combine information from other modalities like stereo and lidar as well if available.
Saved in:
Published in: | arXiv.org 2014-11 |
---|---|
Main authors: | Dey, Debadeepta; Kumar Shaurya Shankar; Zeng, Sam; Mehta, Rupesh; M Talha Agcayazi; Eriksen, Christopher; Daftry, Shreyansh; Hebert, Martial; Bagnell, J Andrew |
Format: | Article |
Language: | eng |
Keywords: | Automotive parts; Clutter; Lidar; Machine learning; Monocular vision; Pipeline design; Unmanned aerial vehicles |
Online access: | Full text |
container_title | arXiv.org |
---|---|
creator | Dey, Debadeepta; Kumar Shaurya Shankar; Zeng, Sam; Mehta, Rupesh; M Talha Agcayazi; Eriksen, Christopher; Daftry, Shreyansh; Hebert, Martial; Bagnell, J Andrew |
description | Cameras provide a rich source of information while being passive, cheap and lightweight for small and medium Unmanned Aerial Vehicles (UAVs). In this work we present the first implementation of receding horizon control, which is widely used in ground vehicles, with monocular vision as the only sensing mode for autonomous UAV flight in dense clutter. We make it feasible on UAVs via a number of contributions: novel coupling of perception and control via relevant and diverse, multiple interpretations of the scene around the robot, leveraging recent advances in machine learning to showcase anytime budgeted cost-sensitive feature selection, and fast non-linear regression for monocular depth prediction. We empirically demonstrate the efficacy of our novel pipeline via real world experiments of more than 2 kms through dense trees with a quadrotor built from off-the-shelf parts. Moreover our pipeline is designed to combine information from other modalities like stereo and lidar as well if available. |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2014-11 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2084616735 |
source | Free E-Journals |
subjects | Automotive parts; Clutter; Lidar; Machine learning; Monocular vision; Pipeline design; Unmanned aerial vehicles |
title | Vision and Learning for Deliberative Monocular Cluttered Flight |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-11T02%3A58%3A52IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Vision%20and%20Learning%20for%20Deliberative%20Monocular%20Cluttered%20Flight&rft.jtitle=arXiv.org&rft.au=Dey,%20Debadeepta&rft.date=2014-11-24&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2084616735%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2084616735&rft_id=info:pmid/&rfr_iscdi=true |