Visual map matching and localization using a global feature map

This paper presents a novel method to support environmental perception of mobile robots by the use of a global feature map. While typical approaches to simultaneous localization and mapping (SLAM) mainly rely on an on-board camera for mapping, our approach uses geographically referenced aerial or satellite images to build a map in advance. The current position on the map is determined by matching features from the on-board camera to the global feature map. The problem of feature matching is posed as a standard point pattern matching problem and a solution using the iterative closest point method is given. The proposed algorithm is designed for use in a street vehicle and uses lane markings as features, but can be adapted to almost any other type of feature that is visible in aerial images. Our approach allows for estimating the robot position at a higher precision than by a purely GPS-based localization, while at the same time providing information about the environment far beyond the current field of view.


Saved in:
Bibliographic Details
Main Author: Pink, O.
Format: Conference Proceeding
Language: eng ; jpn
Subjects:
Online Access: Order full text
container_end_page 7
container_start_page 1
creator Pink, O.
description This paper presents a novel method to support environmental perception of mobile robots by the use of a global feature map. While typical approaches to simultaneous localization and mapping (SLAM) mainly rely on an on-board camera for mapping, our approach uses geographically referenced aerial or satellite images to build a map in advance. The current position on the map is determined by matching features from the on-board camera to the global feature map. The problem of feature matching is posed as a standard point pattern matching problem and a solution using the iterative closest point method is given. The proposed algorithm is designed for use in a street vehicle and uses lane markings as features, but can be adapted to almost any other type of feature that is visible in aerial images. Our approach allows for estimating the robot position at a higher precision than by a purely GPS-based localization, while at the same time providing information about the environment far beyond the current field of view.
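The abstract poses feature matching as a standard point pattern matching problem solved with the iterative closest point (ICP) method. The sketch below is a minimal, self-contained 2-D ICP alignment, not the paper's implementation: the point sets, the brute-force nearest-neighbour association, and the Kabsch-based rigid fit are illustrative assumptions about how such a matcher could look.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ src @ R.T + t (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(observed, global_map, iters=50, tol=1e-10):
    """Align observed 2-D feature points (e.g. lane markings seen by the
    on-board camera) to a global map point set.  Returns (R, t) such that
    observed @ R.T + t lies on the map."""
    src = observed.copy()
    R_acc, t_acc = np.eye(2), np.zeros(2)
    prev_err = np.inf
    for _ in range(iters):
        # Associate each observed point with its nearest map point
        # (brute force here; a k-d tree would be used at scale).
        dists = np.linalg.norm(src[:, None, :] - global_map[None, :, :], axis=2)
        matched = global_map[dists.argmin(axis=1)]
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t
        R_acc, t_acc = R @ R_acc, R @ t_acc + t   # compose incremental transforms
        err = np.mean(np.linalg.norm(src - matched, axis=1))
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_acc, t_acc
```

As in most ICP variants, this only converges from a reasonable initial guess (here, a GPS-quality prior would bound the offset between camera features and the map), which is consistent with the paper's use case of refining a coarse GPS position.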
doi_str_mv 10.1109/CVPRW.2008.4563135
format Conference Proceeding
fulltext fulltext_linktorsrc
identifier ISSN: 2160-7508; EISSN: 2160-7516; ISBN: 9781424423392; EISBN: 9781424423408; DOI: 10.1109/CVPRW.2008.4563135; LCCN: 2008902852
ispartof 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2008, p.1-7
issn 2160-7508
2160-7516
language eng ; jpn
recordid cdi_ieee_primary_4563135
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Algorithm design and analysis
Cameras
Iterative algorithms
Iterative methods
Mobile robots
Pattern matching
Robot vision systems
Satellites
Simultaneous localization and mapping
Vehicles
title Visual map matching and localization using a global feature map