Dense Omnidirectional RGB-D Mapping of Large-scale Outdoor Environments for Real-time Localization and Autonomous Navigation
This paper presents a novel method and innovative apparatus for building three‐dimensional (3D) dense visual maps of large‐scale unstructured environments for autonomous navigation and real‐time localization. The main contribution of the paper is focused on proposing an efficient and accurate 3D world representation that allows us to extend the boundaries of state‐of‐the‐art dense visual mapping to large scales. This is achieved via an omnidirectional key‐frame representation of the environment, which is able to synthesize photorealistic views of captured environments at arbitrary locations. Locally, the representation is image‐based (egocentric) and is composed of accurate augmented spherical panoramas combining photometric information (RGB), depth information (D), and saliency for all viewing directions at a particular point in space (i.e., a point in the light field). The spheres are related by a graph of six degree of freedom (DOF) poses (3 DOF translation and 3 DOF rotation) that are estimated through multiview spherical registration. It is shown that this world representation can be used to perform robust real‐time localization (in 6 DOF) of any configuration of visual sensors within their environment, whether they be monocular, stereo, or multiview. Contrary to feature‐based approaches, an efficient direct image registration technique is formulated. This approach directly exploits the advantages of the spherical representation by minimizing a photometric error between a current image and a reference sphere. Two novel multicamera acquisition systems have been developed and calibrated to acquire this information, and this paper reports for the first time the second system. Given the robustness and efficiency of this representation, field experiments demonstrating autonomous navigation and large‐scale mapping will be reported in detail for challenging unstructured environments, containing vegetation, pedestrians, varying illumination conditions, trams, and dense traffic.
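The localization step described in the abstract minimizes a photometric error between the current image and a reference sphere. A standard way to write such a direct-registration cost (the paper's exact spherical formulation is not reproduced here) is:

```latex
% Direct photometric cost over a 6-DOF pose parameterization x:
% P_i are 3D points of the reference sphere (from its depth map D),
% I^* their reference intensities, T(x) the candidate rigid transform,
% and w(.) the warp projecting a transformed point into the current image I.
E(\mathbf{x}) = \sum_i \Big( I\big( w(T(\mathbf{x})\,\mathbf{P}_i) \big) - I^{*}(\mathbf{P}_i) \Big)^2
```

A minimal runnable sketch of this idea follows. It is not the authors' implementation: the pinhole projection of the current camera, nearest-neighbour image sampling, and SciPy's generic least-squares solver are simplifying assumptions for illustration (the paper uses the spherical model and an efficient direct solver), and all function and parameter names here are hypothetical.

```python
# Sketch of direct photometric localization against a reference keyframe
# stored as 3D points with per-point intensities (a simplified stand-in
# for the paper's augmented sphere).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def transform(pose, points):
    """Apply a 6-DOF pose (3 rotation-vector + 3 translation params) to Nx3 points."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    return points @ R.T + pose[3:]


def residuals(pose, ref_points, ref_intensity, image, K):
    """Photometric error: current image sampled at the reprojected reference
    points, minus the reference intensities. `image` is a float grayscale
    HxW array; K is a 3x3 pinhole intrinsic matrix."""
    pts = transform(pose, ref_points)                 # keyframe -> camera frame
    z = np.maximum(pts[:, 2], 1e-6)                   # guard points behind the camera
    uv = (pts[:, :2] / z[:, None]) @ K[:2, :2].T + K[:2, 2]
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, image.shape[1] - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, image.shape[0] - 1)
    return image[v, u] - ref_intensity


def localize(ref_points, ref_intensity, image, K, pose0=None):
    """Estimate the 6-DOF camera pose w.r.t. the reference keyframe by
    nonlinear least squares on the photometric error."""
    pose0 = np.zeros(6) if pose0 is None else pose0
    return least_squares(residuals, pose0,
                         args=(ref_points, ref_intensity, image, K)).x
```

Note that the nearest-neighbour rounding above makes the cost piecewise constant in the pose; a practical implementation would use bilinear interpolation and analytic Jacobians, and, per the abstract, would weight pixels by the precomputed saliency stored on the sphere.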
Saved in:
Published in: | Journal of Field Robotics 2015-06, Vol. 32 (4), p. 474-503 |
---|---|
Main authors: | Meilland, Maxime; Comport, Andrew I.; Rives, Patrick |
Format: | Article |
Language: | eng |
Subjects: | Autonomous navigation; Computer Science; Localization; Mapping; Position (location); Real time; Representations; Robotics; Three dimensional; Visual |
Online access: | Full text |
container_end_page | 503 |
---|---|
container_issue | 4 |
container_start_page | 474 |
container_title | Journal of field robotics |
container_volume | 32 |
creator | Meilland, Maxime; Comport, Andrew I.; Rives, Patrick |
description | This paper presents a novel method and innovative apparatus for building three‐dimensional (3D) dense visual maps of large‐scale unstructured environments for autonomous navigation and real‐time localization. The main contribution of the paper is focused on proposing an efficient and accurate 3D world representation that allows us to extend the boundaries of state‐of‐the‐art dense visual mapping to large scales. This is achieved via an omnidirectional key‐frame representation of the environment, which is able to synthesize photorealistic views of captured environments at arbitrary locations. Locally, the representation is image‐based (egocentric) and is composed of accurate augmented spherical panoramas combining photometric information (RGB), depth information (D), and saliency for all viewing directions at a particular point in space (i.e., a point in the light field). The spheres are related by a graph of six degree of freedom (DOF) poses (3 DOF translation and 3 DOF rotation) that are estimated through multiview spherical registration. It is shown that this world representation can be used to perform robust real‐time localization (in 6 DOF) of any configuration of visual sensors within their environment, whether they be monocular, stereo, or multiview. Contrary to feature‐based approaches, an efficient direct image registration technique is formulated. This approach directly exploits the advantages of the spherical representation by minimizing a photometric error between a current image and a reference sphere. Two novel multicamera acquisition systems have been developed and calibrated to acquire this information, and this paper reports for the first time the second system. Given the robustness and efficiency of this representation, field experiments demonstrating autonomous navigation and large‐scale mapping will be reported in detail for challenging unstructured environments, containing vegetation, pedestrians, varying illumination conditions, trams, and dense traffic. |
doi_str_mv | 10.1002/rob.21531 |
format | Article |
fullrecord | <record><control><sourceid>proquest_hal_p</sourceid><recordid>TN_cdi_hal_primary_oai_HAL_hal_01010429v1</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><sourcerecordid>3680065371</sourcerecordid><originalsourceid>FETCH-LOGICAL-c5071-7c790c674430cefa66428ec783948f181fa4b422ab0592f2078b6bd8a841d9133</originalsourceid><addsrcrecordid>eNp1kV9v0zAUxSMEEmPwwDewxAs8ZPPf2H7s1q0DhRZVICReLMdxikdiFzspDPHhcVeoEBLyg63r3zn36p6ieI7gGYIQn8fQnGHECHpQnCDGqpLKij88vpl8XDxJ6RZCSoRkJ8XPufXJgtXgXeuiNaMLXvdgvbgo5-Ct3m6d34DQgVrHjS2T0X2Gp7ENIYIrv3Mx-MH6MYEuF9ZW9-XoBgvqkEn3Q-_tgPYtmE1j8GEIUwJLvXOb-5-nxaNO98k--32fFh-ur95f3pT1avH6claXhkGOSm64hKbilBJobKerimJhDRdEUtEhgTpNG4qxbiCTuMOQi6ZqWqEFRa1EhJwWrw6-n3WvttENOt6poJ26mdVqX4MoH4rlDmX25YHdxvB1smlUg0vG9r32Nk-vUCUxkZARmtEX_6C3YYp5fXuKS1FB-HdzE0NK0XbHCRBU-8xUzkzdZ5bZ8wP7zfX27v-gWq8u_ijKg8Kl0X4_KnT8oipOOFMflwt1Td7g5Sc4V-_IL7X4peY</addsrcrecordid><sourcetype>Open Access Repository</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>1679860013</pqid></control><display><type>article</type><title>Dense Omnidirectional RGB-D Mapping of Large-scale Outdoor Environments for Real-time Localization and Autonomous Navigation</title><source>Wiley Online Library Journals Frontfile Complete</source><creator>Meilland, Maxime ; Comport, Andrew I. ; Rives, Patrick</creator><creatorcontrib>Meilland, Maxime ; Comport, Andrew I. ; Rives, Patrick</creatorcontrib><description>This paper presents a novel method and innovative apparatus for building three‐dimensional (3D) dense visual maps of large‐scale unstructured environments for autonomous navigation and real‐time localization. The main contribution of the paper is focused on proposing an efficient and accurate 3D world representation that allows us to extend the boundaries of state‐of‐the‐art dense visual mapping to large scales. This is achieved via an omnidirectional key‐frame representation of the environment, which is able to synthesize photorealistic views of captured environments at arbitrary locations. Locally, the representation is image‐based (egocentric) and is composed of accurate augmented spherical panoramas combining photometric information (RGB), depth information (D), and saliency for all viewing directions at a particular point in space (i.e., a point in the light field). The spheres are related by a graph of six degree of freedom (DOF) poses (3 DOF translation and 3 DOF rotation) that are estimated through multiview spherical registration. It is shown that this world representation can be used to perform robust real‐time localization (in 6 DOF) of any configuration of visual sensors within their environment, whether they be monocular, stereo, or multiview. Contrary to feature‐based approaches, an efficient direct image registration technique is formulated. This approach directly exploits the advantages of the spherical representation by minimizing a photometric error between a current image and a reference sphere. Two novel multicamera acquisition systems have been developed and calibrated to acquire this information, and this paper reports for the first time the second system. 
Given the robustness and efficiency of this representation, field experiments demonstrating autonomous navigation and large‐scale mapping will be reported in detail for challenging unstructured environments, containing vegetation, pedestrians, varying illumination conditions, trams, and dense traffic.</description><identifier>ISSN: 1556-4959</identifier><identifier>EISSN: 1556-4967</identifier><identifier>DOI: 10.1002/rob.21531</identifier><language>eng</language><publisher>Hoboken: Blackwell Publishing Ltd</publisher><subject>Autonomous navigation ; Computer Science ; Localization ; Mapping ; Position (location) ; Real time ; Representations ; Robotics ; Three dimensional ; Visual</subject><ispartof>Journal of field robotics, 2015-06, Vol.32 (4), p.474-503</ispartof><rights>2014 Wiley Periodicals, Inc.</rights><rights>Copyright © 2015 Wiley Periodicals, Inc.</rights><rights>Distributed under a Creative Commons Attribution 4.0 International License</rights><lds50>peer_reviewed</lds50><oa>free_for_read</oa><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c5071-7c790c674430cefa66428ec783948f181fa4b422ab0592f2078b6bd8a841d9133</citedby><cites>FETCH-LOGICAL-c5071-7c790c674430cefa66428ec783948f181fa4b422ab0592f2078b6bd8a841d9133</cites></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><linktopdf>$$Uhttps://onlinelibrary.wiley.com/doi/pdf/10.1002%2Frob.21531$$EPDF$$P50$$Gwiley$$H</linktopdf><linktohtml>$$Uhttps://onlinelibrary.wiley.com/doi/full/10.1002%2Frob.21531$$EHTML$$P50$$Gwiley$$H</linktohtml><link.rule.ids>230,314,780,784,885,1417,27923,27924,45573,45574</link.rule.ids><backlink>$$Uhttps://inria.hal.science/hal-01010429$$DView record in HAL$$Hfree_for_read</backlink></links><search><creatorcontrib>Meilland, Maxime</creatorcontrib><creatorcontrib>Comport, Andrew I.</creatorcontrib><creatorcontrib>Rives, Patrick</creatorcontrib><title>Dense Omnidirectional RGB-D Mapping of Large-scale Outdoor Environments for Real-time Localization and Autonomous Navigation</title><title>Journal of field robotics</title><addtitle>J. Field Robotics</addtitle><description>This paper presents a novel method and innovative apparatus for building three‐dimensional (3D) dense visual maps of large‐scale unstructured environments for autonomous navigation and real‐time localization. The main contribution of the paper is focused on proposing an efficient and accurate 3D world representation that allows us to extend the boundaries of state‐of‐the‐art dense visual mapping to large scales. This is achieved via an omnidirectional key‐frame representation of the environment, which is able to synthesize photorealistic views of captured environments at arbitrary locations. Locally, the representation is image‐based (egocentric) and is composed of accurate augmented spherical panoramas combining photometric information (RGB), depth information (D), and saliency for all viewing directions at a particular point in space (i.e., a point in the light field). The spheres are related by a graph of six degree of freedom (DOF) poses (3 DOF translation and 3 DOF rotation) that are estimated through multiview spherical registration. It is shown that this world representation can be used to perform robust real‐time localization (in 6 DOF) of any configuration of visual sensors within their environment, whether they be monocular, stereo, or multiview. 
Contrary to feature‐based approaches, an efficient direct image registration technique is formulated. This approach directly exploits the advantages of the spherical representation by minimizing a photometric error between a current image and a reference sphere. Two novel multicamera acquisition systems have been developed and calibrated to acquire this information, and this paper reports for the first time the second system. Given the robustness and efficiency of this representation, field experiments demonstrating autonomous navigation and large‐scale mapping will be reported in detail for challenging unstructured environments, containing vegetation, pedestrians, varying illumination conditions, trams, and dense traffic.</description><subject>Autonomous navigation</subject><subject>Computer Science</subject><subject>Localization</subject><subject>Mapping</subject><subject>Position (location)</subject><subject>Real time</subject><subject>Representations</subject><subject>Robotics</subject><subject>Three dimensional</subject><subject>Visual</subject><issn>1556-4959</issn><issn>1556-4967</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2015</creationdate><recordtype>article</recordtype><recordid>eNp1kV9v0zAUxSMEEmPwwDewxAs8ZPPf2H7s1q0DhRZVICReLMdxikdiFzspDPHhcVeoEBLyg63r3zn36p6ieI7gGYIQn8fQnGHECHpQnCDGqpLKij88vpl8XDxJ6RZCSoRkJ8XPufXJgtXgXeuiNaMLXvdgvbgo5-Ct3m6d34DQgVrHjS2T0X2Gp7ENIYIrv3Mx-MH6MYEuF9ZW9-XoBgvqkEn3Q-_tgPYtmE1j8GEIUwJLvXOb-5-nxaNO98k--32fFh-ur95f3pT1avH6claXhkGOSm64hKbilBJobKerimJhDRdEUtEhgTpNG4qxbiCTuMOQi6ZqWqEFRa1EhJwWrw6-n3WvttENOt6poJ26mdVqX4MoH4rlDmX25YHdxvB1smlUg0vG9r32Nk-vUCUxkZARmtEX_6C3YYp5fXuKS1FB-HdzE0NK0XbHCRBU-8xUzkzdZ5bZ8wP7zfX27v-gWq8u_ijKg8Kl0X4_KnT8oipOOFMflwt1Td7g5Sc4V-_IL7X4peY</recordid><startdate>201506</startdate><enddate>201506</enddate><creator>Meilland, Maxime</creator><creator>Comport, Andrew I.</creator><creator>Rives, Patrick</creator><general>Blackwell Publishing Ltd</general><general>Wiley Subscription Services, Inc</general><general>Wiley</general><scope>BSCLL</scope><scope>AAYXX</scope><scope>CITATION</scope><scope>7SC</scope><scope>7SP</scope><scope>7TB</scope><scope>8FD</scope><scope>F28</scope><scope>FR3</scope><scope>JQ2</scope><scope>L7M</scope><scope>L~C</scope><scope>L~D</scope><scope>1XC</scope><scope>VOOES</scope></search><sort><creationdate>201506</creationdate><title>Dense Omnidirectional RGB-D Mapping of Large-scale Outdoor Environments for Real-time Localization and Autonomous Navigation</title><author>Meilland, Maxime ; Comport, Andrew I. 
; Rives, Patrick</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c5071-7c790c674430cefa66428ec783948f181fa4b422ab0592f2078b6bd8a841d9133</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2015</creationdate><topic>Autonomous navigation</topic><topic>Computer Science</topic><topic>Localization</topic><topic>Mapping</topic><topic>Position (location)</topic><topic>Real time</topic><topic>Representations</topic><topic>Robotics</topic><topic>Three dimensional</topic><topic>Visual</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Meilland, Maxime</creatorcontrib><creatorcontrib>Comport, Andrew I.</creatorcontrib><creatorcontrib>Rives, Patrick</creatorcontrib><collection>Istex</collection><collection>CrossRef</collection><collection>Computer and Information Systems Abstracts</collection><collection>Electronics & Communications Abstracts</collection><collection>Mechanical & Transportation Engineering Abstracts</collection><collection>Technology Research Database</collection><collection>ANTE: Abstracts in New Technology & Engineering</collection><collection>Engineering Research Database</collection><collection>ProQuest Computer Science Collection</collection><collection>Advanced Technologies Database with Aerospace</collection><collection>Computer and Information Systems Abstracts Academic</collection><collection>Computer and Information Systems Abstracts Professional</collection><collection>Hyper Article en Ligne (HAL)</collection><collection>Hyper Article en Ligne (HAL) (Open Access)</collection><jtitle>Journal of field robotics</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Meilland, Maxime</au><au>Comport, Andrew I.</au><au>Rives, Patrick</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><atitle>Dense Omnidirectional RGB-D Mapping of Large-scale Outdoor Environments for Real-time Localization and Autonomous Navigation</atitle><jtitle>Journal of field robotics</jtitle><addtitle>J. Field Robotics</addtitle><date>2015-06</date><risdate>2015</risdate><volume>32</volume><issue>4</issue><spage>474</spage><epage>503</epage><pages>474-503</pages><issn>1556-4959</issn><eissn>1556-4967</eissn><abstract>This paper presents a novel method and innovative apparatus for building three‐dimensional (3D) dense visual maps of large‐scale unstructured environments for autonomous navigation and real‐time localization. The main contribution of the paper is focused on proposing an efficient and accurate 3D world representation that allows us to extend the boundaries of state‐of‐the‐art dense visual mapping to large scales. This is achieved via an omnidirectional key‐frame representation of the environment, which is able to synthesize photorealistic views of captured environments at arbitrary locations. Locally, the representation is image‐based (egocentric) and is composed of accurate augmented spherical panoramas combining photometric information (RGB), depth information (D), and saliency for all viewing directions at a particular point in space (i.e., a point in the light field). The spheres are related by a graph of six degree of freedom (DOF) poses (3 DOF translation and 3 DOF rotation) that are estimated through multiview spherical registration. 
It is shown that this world representation can be used to perform robust real‐time localization (in 6 DOF) of any configuration of visual sensors within their environment, whether they be monocular, stereo, or multiview. Contrary to feature‐based approaches, an efficient direct image registration technique is formulated. This approach directly exploits the advantages of the spherical representation by minimizing a photometric error between a current image and a reference sphere. Two novel multicamera acquisition systems have been developed and calibrated to acquire this information, and this paper reports for the first time the second system. Given the robustness and efficiency of this representation, field experiments demonstrating autonomous navigation and large‐scale mapping will be reported in detail for challenging unstructured environments, containing vegetation, pedestrians, varying illumination conditions, trams, and dense traffic.</abstract><cop>Hoboken</cop><pub>Blackwell Publishing Ltd</pub><doi>10.1002/rob.21531</doi><tpages>30</tpages><oa>free_for_read</oa></addata></record> |
fulltext | fulltext |
identifier | ISSN: 1556-4959 |
ispartof | Journal of field robotics, 2015-06, Vol.32 (4), p.474-503 |
issn | 1556-4959 (print); 1556-4967 (electronic) |
language | eng |
recordid | cdi_hal_primary_oai_HAL_hal_01010429v1 |
source | Wiley Online Library Journals Frontfile Complete |
subjects | Autonomous navigation; Computer Science; Localization; Mapping; Position (location); Real time; Representations; Robotics; Three dimensional; Visual |
title | Dense Omnidirectional RGB-D Mapping of Large-scale Outdoor Environments for Real-time Localization and Autonomous Navigation |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-09T07%3A11%3A06IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_hal_p&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Dense%20Omnidirectional%20RGB-D%20Mapping%20of%20Large-scale%20Outdoor%20Environments%20for%20Real-time%20Localization%20and%20Autonomous%20Navigation&rft.jtitle=Journal%20of%20field%20robotics&rft.au=Meilland,%20Maxime&rft.date=2015-06&rft.volume=32&rft.issue=4&rft.spage=474&rft.epage=503&rft.pages=474-503&rft.issn=1556-4959&rft.eissn=1556-4967&rft_id=info:doi/10.1002/rob.21531&rft_dat=%3Cproquest_hal_p%3E3680065371%3C/proquest_hal_p%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1679860013&rft_id=info:pmid/&rfr_iscdi=true |