Deep Bayesian Future Fusion for Self-Supervised, High-Resolution, Off-Road Mapping
High-speed off-road navigation requires long-range, high-resolution maps to enable robots to safely navigate over different surfaces while avoiding dangerous obstacles. However, due to limited computational power and sensing noise, most approaches to off-road mapping focus on producing coarse (20-40 cm) maps of the environment. In this paper, we propose Future Fusion, a framework capable of generating dense, high-resolution maps (30 m forward at 2 cm) from sparse sensing data. This is accomplished by (1) efficiently realizing the well-known Bayes filter within standard deep learning models, explicitly accounting for the sparsity pattern in stereo and LiDAR depth data, and (2) leveraging perceptual losses common in generative image completion. The proposed methodology outperforms the conventional baselines. Moreover, the learned features and the completed dense maps lead to improvements in the downstream navigation task.
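To make the first idea concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation) of a per-cell recursive Bayes update for a bird's-eye-view elevation grid, where a validity mask marks which cells received sparse stereo/LiDAR hits; the function name `fuse_step`, the grid shapes, and the Gaussian noise model are all illustrative assumptions.

```python
import torch

def fuse_step(mean, var, obs, obs_var, valid):
    """One recursive Bayes (Kalman-style) update per map cell.

    mean, var : (H, W) running per-cell estimate and its variance
    obs       : (H, W) newly observed heights (undefined where invalid)
    obs_var   : observation noise variance (scalar or (H, W))
    valid     : (H, W) bool mask of cells actually hit by stereo/LiDAR
    """
    gain = var / (var + obs_var)             # per-cell Kalman gain
    fused_mean = mean + gain * (obs - mean)  # posterior mean
    fused_var = (1.0 - gain) * var           # posterior variance shrinks
    # Cells without a measurement keep their prior untouched -- this is
    # where the sparsity pattern enters the update.
    mean = torch.where(valid, fused_mean, mean)
    var = torch.where(valid, fused_var, var)
    return mean, var

# Usage: start from a weak prior and fold in successive sparse scans.
H = W = 256
mean = torch.zeros(H, W)
var = torch.full((H, W), 100.0)              # high variance = weak prior
for _ in range(5):
    obs = torch.randn(H, W)                  # stand-in for a rasterized scan
    valid = torch.rand(H, W) < 0.05          # ~5% of cells observed (sparse)
    mean, var = fuse_step(mean, var, obs, 0.1, valid)
```

Because every operation here is elementwise and differentiable, such an update can sit inside a standard network and be trained end to end, which is the kind of property a deep realization of Bayes filtering can exploit.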
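For the second idea, here is a minimal sketch of a perceptual loss in the style of generative image completion, comparing predicted and ground-truth map rasters in pretrained VGG-16 feature space; the layer choice and the single-channel-to-RGB expansion are assumptions for illustration, not the paper's configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

class PerceptualLoss(nn.Module):
    """L1 distance between VGG-16 feature maps of prediction and target."""

    def __init__(self, layers=(3, 8, 15)):   # relu1_2, relu2_2, relu3_3
        super().__init__()
        self.layers = set(layers)
        # Keep only the frozen backbone up to the deepest layer we need.
        features = vgg16(weights=VGG16_Weights.DEFAULT).features
        self.vgg = features[: max(layers) + 1].eval()
        for p in self.vgg.parameters():
            p.requires_grad_(False)

    def forward(self, pred, target):
        # Map rasters are single-channel; tile to the 3 channels VGG expects.
        if pred.shape[1] == 1:
            pred, target = pred.repeat(1, 3, 1, 1), target.repeat(1, 3, 1, 1)
        loss = pred.new_zeros(())
        x, y = pred, target
        for i, layer in enumerate(self.vgg):
            x, y = layer(x), layer(y)
            if i in self.layers:
                loss = loss + F.l1_loss(x, y)
        return loss

# Usage on a batch of 1-channel map patches (values assumed in [0, 1]).
criterion = PerceptualLoss()
pred = torch.rand(2, 1, 128, 128, requires_grad=True)
target = torch.rand(2, 1, 128, 128)
criterion(pred, target).backward()
```

Matching features rather than raw pixels is what lets such losses produce plausible fine structure when completing regions the sensors never observed.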
Saved in:
Published in: | arXiv.org 2024-09 |
---|---|
Main authors: | Aich, Shubhra; Wang, Wenshan; Maheshwari, Parv; Sivaprakasam, Matthew; Triest, Samuel; Ho, Cherie; Gregory, Jason M; Rogers, John G; Scherer, Sebastian |
Format: | Article |
Language: | English |
Subjects: | Bayesian analysis; Datasets; High resolution; Off road vehicles; Roads |
Online access: | Full text |
EISSN: | 2331-8422 |
Source: | Free E-Journals |