Automatic cattle activity recognition on grazing systems

Bibliographic Details
Published in: Biotecnologia en el sector agropecuario y agroindustrial, 2022-07, Vol. 20 (2), p. 117-128
Main authors: Ramirez Agudelo, John Fredy; Bedoya Mazo, Sebastian; Posada Ochoa, Sandra Lucia; Rosero Noguera, Jaime Ricardo
Format: Article
Language: English
Abstract: The use of collars, pedometers, or activity tags to record cattle behavior over short periods (e.g., 24 h) is expensive, which makes the development of low-cost, easy-to-use technologies relevant. Similar to smartphone apps for human activity recognition, which analyze data from embedded triaxial accelerometer sensors, we developed an Android app to record activity in cattle. Four main steps were followed: a) data acquisition for model training, b) model training, c) app deployment, and d) app utilization. For data acquisition, we developed a system with three components: two smartphones and a Google Firebase account for data storage. For model training, the resulting database was used to train a recurrent neural network, and training performance was assessed with a confusion matrix. For all observed activities, the trained model achieved high prediction accuracy (>96 %). The trained model was then deployed as an Android app using the TensorFlow API. Finally, three cell phones (LG gm730) were used to test the app and record the activity of six Holstein cows (3 lactating and 3 non-lactating). Direct, non-systematic observations of the animals were made to contrast with the activities recorded by the device. Our results show consistency between the direct observations and the activity recorded by our Android app.
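
The pipeline the abstract describes (windowed triaxial accelerometer data, a recurrent neural network classifier, TensorFlow-based conversion for on-device Android inference) can be illustrated with a short sketch. This is not the authors' code: the window length, architecture, activity labels, and file name below are assumptions made for illustration only.

# Hypothetical sketch of the training/deployment pipeline from the abstract;
# window size, layer sizes, and labels are illustrative assumptions.
import numpy as np
import tensorflow as tf

WINDOW = 128      # assumed accelerometer samples per classification window
N_CHANNELS = 3    # triaxial accelerometer: x, y, z
N_CLASSES = 4     # assumed activity set, e.g. grazing/ruminating/resting/walking

# Recurrent network over accelerometer windows, as the abstract describes.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_CHANNELS)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random placeholder data standing in for the Firebase-collected recordings.
X = np.random.randn(256, WINDOW, N_CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=256)
model.fit(X, y, epochs=2, batch_size=32)

# Assess training with a confusion matrix, as in the paper.
preds = np.argmax(model.predict(X), axis=1)
print(tf.math.confusion_matrix(y, preds))

# Convert to TensorFlow Lite for on-device inference in an Android app.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("cattle_activity.tflite", "wb") as f:
    f.write(tflite_model)

In practice the windows would come from labeled field recordings rather than random arrays, and the .tflite file would be bundled with the Android app for on-device classification.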
ISSN: 1692-3561, 1909-9959
DOI: 10.18684/rbsaa.v20.n2.2022.1940