Glass Recognition and Map Optimization Method for Mobile Robot Based on Boundary Guidance
Published in: | Chinese journal of mechanical engineering 2023-06, Vol.36 (1), p.74-146, Article 74 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Current research on autonomous mobile robots focuses primarily on perceptual accuracy and autonomous performance. In commercial and domestic constructions, concrete, wood, and glass are typically used. Laser and visual mapping or planning algorithms map wood panels and concrete walls with high accuracy; however, because indoor and outdoor glass curtain walls are transparent, these sensors may fail to perceive them. In this study, a novel indoor glass recognition and map optimization method based on boundary guidance is proposed. First, the status of glass recognition techniques is analyzed comprehensively. Next, a glass image segmentation network based on boundary data guidance and an optimization of the planning map based on depth repair are proposed. Finally, map optimization and path-planning tests are conducted and compared across different algorithms. The results confirm the favorable adaptability of the proposed method to indoor transparent plates and glass curtain walls. Using the proposed method, the recognition accuracy on a public test set increases to 94.1%. After adding the planning map, incorrect coverage redundancies for two test scenes are reduced by 59.84% and 55.7%, respectively. In summary, the proposed glass recognition and map optimization method offers sufficient capacity for perceiving indoor glass materials and recognizing indoor no-entry regions. |
---|---|
ISSN: | 2192-8258 1000-9345 |
DOI: | 10.1186/s10033-023-00902-9 |