CoMo: A Novel Comoving 3D Camera System
| Published in | IEEE Transactions on Instrumentation and Measurement, 2021, Vol. 70, pp. 1-16 |
|---|---|
| Main authors | , , , , , |
| Format | Article |
| Language | English |
| Subjects | |
| Online access | Full text |
| Summary | Motivated by the theoretical interest in reconstructing long 3D trajectories of individual birds in large flocks, we developed CoMo, a comoving camera system of two synchronized cameras coupled with rotational stages that allows us to dynamically follow the motion of a target flock. By rotating the cameras, we overcome the limitation of standard static systems, which restrict data collection to the short interval during which targets lie in the cameras' common field of view; at the same time, however, the external parameters of the system change over time and must be calibrated frame by frame. We address this calibration by measuring the positions of the cameras and their yaw, pitch, and roll angles in the system's home configuration (rotational stage at 0°) and combining this static information with the time-dependent rotation imposed by the stages. We evaluate the robustness and accuracy of the system by comparing reconstructed and measured 3D distances in what we call 3D tests, which show a relative error of the order of 1%. The novelty of this work lies not only in the system itself but also in the testing approach, which we show to be a powerful tool for detecting and correcting calibration inaccuracies and may therefore be relevant to a broad audience. (An illustrative sketch of the calibration composition follows this record.) |
| ISSN | 0018-9456, 1557-9662 |
| DOI | 10.1109/TIM.2021.3074388 |
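
The summary describes two quantitative ingredients: composing the static home-configuration extrinsics (camera positions plus yaw, pitch, and roll at stage angle 0°) with the time-dependent stage rotation to obtain frame-by-frame external parameters, and the 3D tests that compare reconstructed with measured distances. The Python sketch below illustrates one way these pieces could fit together; the vertical stage axis, the Z-Y-X yaw-pitch-roll convention, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): frame-by-frame extrinsics from a
# static "home" calibration plus the known stage angle, and the relative-error
# measure used in the 3D tests. All conventions below are assumptions.
import numpy as np

def rot_z(angle_rad: float) -> np.ndarray:
    """Rotation about the z-axis (assumed to be the stage axis)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def rot_ypr(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Home-configuration camera rotation from yaw, pitch, roll (assumed Z-Y-X order)."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return rot_z(yaw) @ Ry @ Rx

def extrinsics_at_time(R_home: np.ndarray, cam_center: np.ndarray,
                       stage_angle_rad: float):
    """Compose the static home extrinsics with the stage rotation for one frame.

    Assumes the stage rotates the camera about a vertical axis through the
    camera centre, so the centre itself stays fixed; the paper's geometry may
    differ. Returns (R, t) such that x_cam = R @ x_world + t.
    """
    R = rot_z(stage_angle_rad) @ R_home   # time-dependent rotation
    t = -R @ cam_center                   # translation from the fixed camera centre
    return R, t

def relative_error(d_reconstructed: float, d_measured: float) -> float:
    """Relative error of a 3D test: |reconstructed - measured| / measured."""
    return abs(d_reconstructed - d_measured) / d_measured

if __name__ == "__main__":
    R_home = rot_ypr(np.deg2rad(5.0), np.deg2rad(-2.0), np.deg2rad(0.5))
    R, t = extrinsics_at_time(R_home, np.array([0.0, 0.0, 1.5]),
                              stage_angle_rad=np.deg2rad(30.0))
    print(relative_error(10.10, 10.00))   # ~0.01, i.e., the ~1% scale reported
```

Read this way, the frame-by-frame recalibration reduces to reading the stage angle at each frame and composing it with a one-off static calibration, which is what makes long rotating acquisitions practical without repeating the full calibration procedure.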