Factor graph localization for mobile robots using Google Indoor Street View and CNN-based place recognition

Bibliographic Details
Published in: Drone Systems and Applications, 2023-01, Vol. 11, p. 1-19
Authors: Tennakoon, Kusal B., De Silva, Oscar, Jayasiri, Awantha, Mann, George K.I., Gosine, Raymond G.
Format: Article
Language: English
Abstract: This article proposes a mobile robot localization system developed using Google Indoor Street View (GISV) and Convolutional Neural Network (CNN)-based visual place recognition. The proposed localization system consists of two main modules. The first is a place recognition module based on GISV and a Net Vector of Locally Aggregated Descriptors (NetVLAD)-based CNN. The second is a factor graph-based optimization module. In this work, we show that a CNN-based approach can be used to overcome the lack of visually distinct features in indoor environments and the changes in image appearance that occur when different cameras are used at different points in time for localization. The proposed CNN-based localization system is implemented using reference and query images obtained from two different sources (GISV and a camera attached to a mobile robot). It has been experimentally validated using a custom indoor dataset captured in the basement of the Memorial University of Newfoundland engineering building. The main results show that GISV-based place recognition reduces the percentage drift by 4% on the dataset and achieves a Root Mean Square Error (RMSE) of 2 m in position and 2.5° in orientation.
ISSN: 2564-4939
DOI: 10.1139/dsa-2022-0045
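
To make the two-module pipeline described in the abstract more concrete, the following is a minimal sketch, not the authors' implementation. It assumes the place recognition module (a NetVLAD-style CNN matching a query image to a geotagged GISV reference image) has already produced a matched reference pose, and it fuses that match with wheel odometry in a 2D pose graph using the GTSAM library. All poses, noise values, and variable names are illustrative assumptions.

```python
# Minimal pose-graph localization sketch (illustrative, not the paper's code).
# Odometry between-factors chain consecutive robot poses; a place-recognition
# match against a geotagged GISV reference image is added as a weak absolute
# (prior) factor on the matched pose.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X

# Hypothetical output of the place recognition module: the GISV reference
# image matched to the query taken at step 2, with its geotagged pose.
gisv_match_pose = gtsam.Pose2(2.0, 0.1, 0.05)

graph = gtsam.NonlinearFactorGraph()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.10]))
pr_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.5, 0.5, 0.20]))

# Anchor the first pose at the map origin.
graph.add(gtsam.PriorFactorPose2(X(0), gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))

# Odometry factors between consecutive poses (1 m forward per step here).
graph.add(gtsam.BetweenFactorPose2(X(0), X(1), gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))
graph.add(gtsam.BetweenFactorPose2(X(1), X(2), gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))

# Place-recognition factor: the GISV match constrains X(2) in the map frame,
# which is what limits the accumulated odometric drift.
graph.add(gtsam.PriorFactorPose2(X(2), gisv_match_pose, pr_noise))

# Initial guess, e.g. dead-reckoned odometry with some drift.
initial = gtsam.Values()
initial.insert(X(0), gtsam.Pose2(0.0, 0.0, 0.0))
initial.insert(X(1), gtsam.Pose2(0.9, 0.1, 0.10))
initial.insert(X(2), gtsam.Pose2(1.8, 0.3, 0.20))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
for i in range(3):
    print(result.atPose2(X(i)))
```

In a full system of this kind, the optimization would run incrementally as new query images are matched, so each successful GISV match pulls the drifting odometry estimate back toward the map frame; the drift and RMSE figures reported in the abstract describe how well the authors' system achieves this on their dataset.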