Multi-modal mobile sensor data fusion for autonomous robot mapping problem
Saved in:
Published in: | MATEC Web of Conferences, 2016, Vol. 42, p. 3008 |
---|---|
Main Authors: | , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full Text |
Summary: | Perception is the first step for a mobile robot to perform any task, and to gain perception, mobile robots use sensors to measure the states that represent the surrounding environment. Sensor measurements always carry some degree of uncertainty and noise, which can make the system unstable and unreliable. Better readings can be obtained by using better sensors, but this involves a trade-off between price and quality. Our proposed approach to this problem is therefore to use data fusion techniques to eliminate the noise and reduce the uncertainty in the readings. Data fusion has been under extensive research in the past decade; many approaches have been suggested, and research on the topic continues to grow because of its importance and its applications. This study discusses the use of probabilistic data fusion techniques to reduce the uncertainty and eliminate the noise in measurements from active range-finder sensors, in order to improve the mapping task for mobile robots. The data fusion methods used were the Kalman filter and the Bayes filter. |
---|---|
ISSN: | 2261-236X 2274-7214 |
DOI: | 10.1051/matecconf/20164203008 |
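
The abstract names the Kalman filter as one of the fusion methods used to reduce noise in range-finder readings. As a rough illustration only, and not code from the paper, the sketch below fuses a few hypothetical noisy range measurements with a one-dimensional Kalman measurement update; the readings and noise variances are invented for the example.

```python
# Minimal 1-D Kalman filter sketch: fusing noisy range-finder readings
# into a single distance estimate. All numbers below are illustrative
# assumptions, not values from the paper.

def kalman_update(estimate, variance, measurement, meas_variance):
    """Fuse one noisy measurement into the current estimate (1-D case)."""
    kalman_gain = variance / (variance + meas_variance)
    new_estimate = estimate + kalman_gain * (measurement - estimate)
    new_variance = (1.0 - kalman_gain) * variance
    return new_estimate, new_variance

# Prior belief about the distance to an obstacle (metres) and its variance.
estimate, variance = 2.0, 1.0

# Hypothetical readings from range finders with different noise levels:
# (measured distance in metres, measurement variance).
readings = [(2.3, 0.09), (1.9, 0.25), (2.1, 0.09)]

for z, r in readings:
    estimate, variance = kalman_update(estimate, variance, z, r)
    print(f"estimate = {estimate:.3f} m, variance = {variance:.4f}")
```

Each update shrinks the variance of the estimate, which is the sense in which fusion reduces the uncertainty in the readings before they are used for mapping.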