A Novel Approach to Image-Sequence-Based Mobile Robot Place Recognition
Saved in:
Published in: | IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2021-09, Vol. 51 (9), p. 5377-5391 |
---|---|
Main authors: | , , , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Visual place recognition is a challenging problem in simultaneous localization and mapping (SLAM) due to the large variability of scene appearance. In conventional place recognition algorithms, a place is usually described by a single-frame image. However, a single frame is unlikely to describe the appearance of a place completely, and it is also more sensitive to environmental changes. In this article, a novel image-sequence-based framework for place detection and recognition is proposed, in which a place is represented by an image sequence rather than a single-frame image. Position invariant robust feature (PIRF) descriptors are extracted from images and processed by an incremental bag-of-words (BoW) model for feature extraction. The robot automatically partitions the sequentially acquired images into image sequences according to changes in the environmental appearance. An echo state network (ESN) is then applied to model each image sequence, and the resultant states of the ESN are used as features of the corresponding sequence for place recognition. The proposed method is evaluated on two public datasets, with experimental comparisons against FAB-MAP 2.0 and SeqSLAM. Finally, a real-world place recognition experiment with a mobile robot further verifies the proposed method. |
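The core idea of the abstract — driving an echo state network with a sequence of per-frame features and using the resulting reservoir state as the sequence's descriptor — can be sketched as follows. This is a minimal illustration only; the reservoir size, weight scales, spectral radius, and the use of random vectors in place of real PIRF/BoW features are assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 64, 200  # assumed input (per-frame BoW feature) dim and reservoir size

# Fixed random input and recurrent weights, as is standard for ESNs.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale so the spectral radius is below 1 (echo state property).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def sequence_descriptor(features):
    """Drive the reservoir with one feature vector per frame and
    return the final reservoir state as the sequence descriptor."""
    x = np.zeros(n_res)
    for u in features:
        x = np.tanh(W_in @ u + W @ x)
    return x

# A hypothetical 10-frame image sequence, stand-in for real BoW histograms.
seq_a = rng.random((10, n_in))
desc_a = sequence_descriptor(seq_a)
print(desc_a.shape)  # (200,)
```

Descriptors obtained this way could then be compared (e.g. by cosine similarity) to match a query sequence against previously visited places.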
---|---|
ISSN: | 2168-2216; 2168-2232 |
DOI: | 10.1109/TSMC.2019.2956321 |