Automatic monitoring of lactation frequency of sows and movement quantification of newborn piglets in farrowing houses using convolutional neural networks


Detailed Description

Bibliographic Details
Published in: Computers and Electronics in Agriculture, 2021-10, Vol. 189, p. 106376, Article 106376
Authors: Ho, Kuan-Ying; Tsai, Yu-Jung; Kuo, Yan-Fu
Format: Article
Language: English
Online Access: Full text
Description

Abstract:

Highlights:
- Embedded systems were built to collect videos of sows and piglets in pig farms.
- EfficientNet and LSTM were trained to recognize lactating behavior of the sows.
- A refined rotation RetinaNet (R3Det) was trained to localize the piglets in the videos.
- The SORT algorithm was used to track the piglets and to quantify their movements.
- The approach demonstrated automatic monitoring in typical pig farms.

Pork is an essential source of protein in Taiwan and many other countries globally, and to meet increasing demand, maintaining the weaning rate of piglets is essential. Newborn piglets are relatively fragile and require more attention; however, manual observation is time-consuming and labor-intensive. This study aimed to develop an automated approach to recognize the lactating frequencies of sows, localize piglets, track individual piglets, and quantify their movements in videos. Embedded systems integrated with cameras were developed to capture bird's-eye-view videos of sows and piglets in a farrowing house, which were then transmitted to a cloud server and converted to images. A combination of EfficientNet and long short-term memory (LSTM) was trained to recognize the lactation behavior from the videos, and a refined rotation RetinaNet (R3Det) model was trained to localize the piglets. Subsequently, the simple online and real-time tracking (SORT) algorithm was applied to track individual piglets and quantify their movements. The combination of EfficientNet and LSTM achieved an overall accuracy of 97.67% in lactation behavior recognition, with an inference time of 7.7 ms per 1-min video on a GPU. The trained R3Det model achieved an overall mean average precision of 87.90%, precision of 93.52%, recall of 88.52%, and processing speed of 10.2 fps on a GPU. The piglet tracking using SORT achieved an overall multiple object tracking accuracy of 97.35%, multiple object tracking precision of 96.97%, IDF1 of 98.30%, and processing speed of 171.6 fps on a CPU.
Thus, the proposed approach was shown to be feasible for use in typical pig farms.
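The tracking-and-quantification step described in the abstract can be sketched as follows. This is a simplified illustration only: it uses greedy IoU association on axis-aligned boxes, whereas the full SORT algorithm adds a Kalman motion model and Hungarian assignment, and the paper's R3Det detector produces rotated boxes. All function names and thresholds below are illustrative assumptions, not the authors' implementation.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def center(box):
    return ((box[0] + box[2]) / 2.0, (box[1] + box[3]) / 2.0)


def track(frames, iou_threshold=0.3):
    """Greedily associate detections across frames by IoU.

    `frames` is a list of per-frame detection lists. Returns a dict
    mapping track id -> list of box centers over time. (Full SORT
    replaces this greedy matching with a Kalman filter prediction
    plus Hungarian assignment.)
    """
    tracks = {}    # track id -> last matched box
    paths = {}     # track id -> list of centers
    next_id = 0
    for boxes in frames:
        unmatched = list(tracks.items())
        new_tracks = {}
        for box in boxes:
            best_id, best_iou = None, iou_threshold
            for tid, prev in unmatched:
                score = iou(box, prev)
                if score > best_iou:
                    best_id, best_iou = tid, score
            if best_id is None:
                best_id = next_id        # start a new track
                next_id += 1
                paths[best_id] = []
            else:                        # consume the matched track
                unmatched = [(t, b) for t, b in unmatched if t != best_id]
            new_tracks[best_id] = box
            paths[best_id].append(center(box))
        tracks = new_tracks
    return paths


def movement(paths):
    """Total Euclidean displacement travelled by each track."""
    return {
        tid: sum(
            ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            for (x1, y1), (x2, y2) in zip(pts, pts[1:])
        )
        for tid, pts in paths.items()
    }
```

For example, a box drifting right by 2 px per frame keeps a single track id and accumulates the summed per-frame displacement of its center, while a stationary box accumulates zero movement.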
ISSN: 0168-1699, 1872-7107
DOI: 10.1016/j.compag.2021.106376