Projectile Flight Trajectory and Position Estimation System Based on Stereo Vision
Saved in:
Published in: Sensors and Materials, 2019-01, Vol. 31 (11), p. 3483
Main Authors: ,
Format: Article
Language: eng
Subjects:
Online Access: Full text
Summary: Projectile trajectory estimation is required for missiles, rockets, intelligent robots, and other parabolic-trajectory prediction systems used in military defense and in daily life. In this paper, we propose a projectile trajectory and position estimation system based on stereo vision. When a human catches a flying projectile, they first track it attentively to estimate its flight path and then catch it. To mimic how human eyes sense dynamic images, two parallel network cameras capture dynamic images of the projectile. The projectile target is extracted from a complex background by image processing. The target's three-dimensional positions are calculated using the sum of absolute differences (SAD) algorithm and stereo vision, and a Kalman filter (KF) or unscented Kalman filter (UKF) is used to estimate the trajectory and landing position of the projectile. Experiments were carried out under different illumination backgrounds and with projectiles of different weights. The results show that the estimated three-dimensional trajectories correspond closely to the actual trajectories and that the UKF outperforms the KF. The proposed estimation system can be widely applied in daily life and industry.
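The abstract names SAD block matching plus stereo triangulation as the 3-D measurement step. The following is a minimal sketch of that step, not the authors' code: the window size, disparity search range, and camera parameters (`focal_px`, `baseline_m`, `cx`, `cy`) are illustrative assumptions for a rectified parallel camera pair.

```python
import numpy as np

def sad_disparity(left, right, row, col, window=7, max_disp=64):
    """Disparity of the block centered at (row, col) in the left image,
    found by minimizing the SAD cost against the right image.
    Assumes rectified grayscale images and (row, col) away from borders."""
    h = window // 2
    patch = left[row - h:row + h + 1, col - h:col + h + 1].astype(np.float32)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        c = col - d                       # match shifts left in the right image
        if c - h < 0:
            break
        cand = right[row - h:row + h + 1, c - h:c + h + 1].astype(np.float32)
        cost = np.abs(patch - cand).sum() # sum of absolute differences
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def triangulate(col, row, disparity, focal_px, baseline_m, cx, cy):
    """3-D point in the left-camera frame from a pixel match:
    Z = f * B / d for a rectified parallel stereo pair."""
    Z = focal_px * baseline_m / max(disparity, 1e-6)
    X = (col - cx) * Z / focal_px
    Y = (row - cy) * Z / focal_px
    return np.array([X, Y, Z])
```

Running `sad_disparity` at the detected target pixel and passing the result to `triangulate` yields one 3-D position fix per frame, which is what the filtering step below consumes.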
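The abstract then filters these 3-D fixes with a KF (or UKF) to estimate the trajectory and landing position. Below is a minimal linear-KF sketch under an assumed constant-velocity-plus-gravity motion model; the frame rate, the noise covariances `Q` and `R`, and a world frame with z pointing up are all assumptions, since the abstract does not specify them.

```python
import numpy as np

dt = 1.0 / 30.0                     # camera frame interval (assumed 30 fps)
g = np.array([0.0, 0.0, -9.81])     # gravity in the world frame (m/s^2)

# State x = [px, py, pz, vx, vy, vz]; measurements are 3-D positions.
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)          # position integrates velocity
B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])  # gravity input
H = np.hstack([np.eye(3), np.zeros((3, 3))])              # observe position only
Q = 1e-3 * np.eye(6)                # process noise (assumed)
R = 1e-2 * np.eye(3)                # measurement noise (assumed)

def kf_step(x, P, z):
    """One predict/update cycle on a stereo position fix z (world frame)."""
    x = F @ x + B @ g               # predict under ballistic motion
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ (z - H @ x)         # correct with the measurement
    P = (np.eye(6) - K @ H) @ P
    return x, P

def landing_point(x, ground_z=0.0):
    """Extrapolate the filtered state ballistically to z = ground_z."""
    p, v = x[:3], x[3:]
    a, b, c = 0.5 * g[2], v[2], p[2] - ground_z
    t = (-b - np.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive time-to-ground
    return p[:2] + v[:2] * t
```

Note that for this purely linear model a UKF reduces to the KF; the paper's UKF presumably handles nonlinearities (such as drag or the camera measurement geometry) that this sketch does not model.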
ISSN: 0914-4935
DOI: 10.18494/SAM.2019.2485