Zoometric data extraction from drone imagery: the Arabian oryx (Oryx leucoryx)
Published in: Environmental Conservation, 2021-12, Vol. 48 (4), p. 295-300
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Data extraction from unmanned aerial vehicle (UAV) imagery has proved effective in animal surveys and monitoring, but to date it has scarcely been used for detailed population analysis and individual animal feature extraction. We assessed zoometric measurement and feature extraction for the Arabian oryx (Oryx leucoryx) using data acquired from a captive population, for comparison with reintroduced populations monitored by UAVs. Highly accurate, scaled and geo-rectified imagery derived from UAV surveys allowed precise morphometric measurements of the oryx. The scaled top-view imagery, combined with baseline data on the known sex, age, weight and pregnancy status of captive individuals, was used to develop predictive models. A bracketed index developed from the predictive models showed high accuracy for classifying the age group ≤16 months, animals weighing >80 kg and pregnancy. The pregnancy classification decision tree model performed with 91.7% accuracy, and the polynomial weight predictive model achieved relatively high accuracy when using the total top-view surface measurement. Photogrammetrically processed UAV-acquired imagery can therefore yield valuable zoometric data, feature extraction and modelling; it is a tool with practical application for field biologists that can assist in decision-making for species conservation management.
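The abstract names two model types (a polynomial weight model and a pregnancy classification decision tree, both driven by scaled top-view measurements) but gives no implementation details. The sketch below is only a minimal illustration of that modelling pattern: the surface-area and weight values are invented, the feature choice, polynomial degree and tree depth are assumptions, and none of it reproduces the authors' actual models or data.

```python
# Hypothetical sketch of the modelling pattern described in the abstract.
# All values and settings are invented for illustration, not taken from the paper.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the captive-population baseline: total top-view
# surface area (m^2) from scaled UAV orthoimagery, with known weight (kg)
# and pregnancy status for each individual.
surface_m2 = np.array([[0.45], [0.52], [0.60], [0.68], [0.75], [0.83]])
weight_kg = np.array([38, 52, 67, 79, 90, 102])
pregnant = np.array([0, 0, 0, 1, 1, 1])

# Polynomial regression of body weight on top-view surface area
# (degree 2 is an arbitrary choice for this sketch).
weight_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
weight_model.fit(surface_m2, weight_kg)

# Decision tree classifying pregnancy status from the same zoometric input.
preg_model = DecisionTreeClassifier(max_depth=2, random_state=0)
preg_model.fit(surface_m2, pregnant)

# Apply both fitted models to a new top-view measurement from a UAV survey.
new_measurement = np.array([[0.70]])
print("predicted weight (kg):", weight_model.predict(new_measurement)[0])
print("predicted pregnancy class:", preg_model.predict(new_measurement)[0])
```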
ISSN: 0376-8929, 1469-4387
DOI: 10.1017/S0376892921000242