Social and Robust Navigation for Indoor Robots Based on Object Semantic Grid and Topological Map
Published in: Applied Sciences, 2020-12, Vol. 10 (24), p. 8991
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: For the indoor navigation of service robots, human–robot interaction and adaptation to the environment still need to be strengthened, including determining the navigation goal socially, improving the success rate of passing through doors, and optimizing path planning efficiency. This paper proposes an indoor navigation system based on an object semantic grid and a topological map to address these problems. First, natural language is used as the form of human–robot interaction, from which the target room, object, and spatial relationship are extracted using speech recognition and word segmentation. The robot then selects the goal point within the target space using object affordance theory. To improve navigation success rate and safety, auxiliary navigation points are generated on both sides of each door to correct the robot's trajectory. Furthermore, based on the topological map and the auxiliary navigation points, the global path is segmented by topological area, and path planning is carried out separately in each room, which significantly improves navigation efficiency. The system has been shown to support autonomous navigation based on language interaction and to significantly improve the safety, efficiency, and robustness of indoor robot navigation. It has been successfully tested in real domestic environments.
ISSN: 2076-3417
DOI: 10.3390/app10248991
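
The abstract describes generating auxiliary navigation points on both sides of each door so the robot crosses the opening centered and head-on before per-room path planning resumes. The paper's exact construction is not reproduced here; the following is a minimal sketch of the idea, in which the function name `door_waypoints`, the door pose parameters, and the 0.6 m offset are illustrative assumptions rather than the authors' implementation.

```python
import math

def door_waypoints(door_center, door_yaw, offset=0.6):
    """Place one auxiliary waypoint on each side of a door.

    door_center : (x, y) midpoint of the door opening in map coordinates.
    door_yaw    : orientation (rad) of the passage direction through the door,
                  i.e. the normal of the door plane.
    offset      : distance (m) from the door plane to each waypoint
                  (an arbitrary illustrative value).

    Returns two points, one before and one after the door, so that a planner
    visiting them in order crosses the opening roughly perpendicular and
    centered instead of clipping the door frame.
    """
    x, y = door_center
    dx, dy = math.cos(door_yaw), math.sin(door_yaw)
    before = (x - offset * dx, y - offset * dy)
    after = (x + offset * dx, y + offset * dy)
    return before, after

# Example: a door at (3.0, 1.5) whose passage direction points along +x.
p_in, p_out = door_waypoints((3.0, 1.5), door_yaw=0.0)
print(p_in, p_out)  # (2.4, 1.5) (3.6, 1.5)
```

Visiting such point pairs in sequence at each doorway reflects how the global path can be split at doors between topological areas, with planning then carried out independently inside each room.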