Recognition of Pantaneira cattle breed using computer vision and convolutional neural networks

Bibliographic Details
Published in: Computers and Electronics in Agriculture, 2020-08, Vol. 175, p. 105548, Article 105548
Main authors: Weber, Fabricio de Lima; Weber, Vanessa Aparecida de Moraes; Menezes, Geazy Vilharva; Oliveira Junior, Adair da Silva; Alves, Daniela Arestides; de Oliveira, Marcus Vinicius Morais; Matsubara, Edson Takashi; Pistori, Hemerson; Abreu, Urbano Gomes Pinto de
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract:
• Individual recognition of Pantanal cattle through the application of Convolutional Neural Networks (CNN).
• Annotated dataset with 27,849 images of Pantanal cattle, extracted from 212 videos.
• Experimental results show that the architectural models used in the research achieved 99.86% accuracy.
The objective of this paper is to provide individual recognition of the Pantaneira cattle breed using Convolutional Neural Networks (CNN). Fifty-one animals from the Aquidauana Pantaneira Cattle Center (NUBOPAN), located in the Midwest region of Brazil, were studied. Four monitoring cameras distributed along the fences captured 27,849 images of the Pantaneira cattle breed from different angles and positions. Three CNN architectures were used in the experiment: DenseNet-201, ResNet50 and Inception-ResNet-V. All networks were evaluated with 10-fold stratified cross-validation over 50 epochs. The results showed an accuracy of 99% for all networks, which is encouraging for future research.
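For readers who want a concrete picture of the evaluation protocol summarized above, the following is a minimal sketch (not the authors' published code) of fine-tuning an ImageNet-pretrained DenseNet-201 for 51-class individual-animal recognition with 10-fold stratified cross-validation over 50 epochs, as stated in the abstract. The image size, batch size, optimizer, and in-memory data arrays are illustrative assumptions.

```python
# Sketch only: reproduces the abstract's protocol (DenseNet-201 backbone,
# 10-fold stratified cross-validation, 50 epochs) under assumed settings.
import numpy as np
import tensorflow as tf
from sklearn.model_selection import StratifiedKFold

NUM_ANIMALS = 51        # 51 individuals reported in the study
IMG_SIZE = (224, 224)   # assumed input resolution
EPOCHS = 50             # 50 epochs, as stated in the abstract

def build_model():
    """ImageNet-pretrained DenseNet-201 backbone with a new softmax head."""
    base = tf.keras.applications.DenseNet201(
        include_top=False, weights="imagenet", pooling="avg",
        input_shape=IMG_SIZE + (3,))
    outputs = tf.keras.layers.Dense(NUM_ANIMALS, activation="softmax")(base.output)
    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def cross_validate(images: np.ndarray, labels: np.ndarray):
    """10-fold stratified CV: each fold preserves the per-animal class balance.

    `images` is assumed to be already resized and passed through
    tf.keras.applications.densenet.preprocess_input; `labels` are integer IDs.
    """
    skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
    accuracies = []
    for fold, (train_idx, test_idx) in enumerate(skf.split(images, labels), start=1):
        model = build_model()                      # fresh weights for every fold
        model.fit(images[train_idx], labels[train_idx],
                  epochs=EPOCHS, batch_size=32, verbose=0)
        _, acc = model.evaluate(images[test_idx], labels[test_idx], verbose=0)
        accuracies.append(acc)
        print(f"fold {fold}: accuracy = {acc:.4f}")
    print(f"mean accuracy over 10 folds: {np.mean(accuracies):.4f}")
```

The same loop applies to the other two architectures mentioned in the abstract by swapping the backbone constructor (e.g. tf.keras.applications.ResNet50) in build_model().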
ISSN: 0168-1699
1872-7107
DOI: 10.1016/j.compag.2020.105548