Mackerel Fat Content Estimation Using RGB and Depth Images

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 164060-164069
Main authors: Sano, Shuya; Miyazaki, Tomo; Sugaya, Yoshihiro; Sekiguchi, Naohiro; Omachi, Shinichiro
Format: Article
Language: English
Description
Abstract: We propose a method for estimating the fat content of mackerels from their images. The market value of fish varies greatly depending on fat content; for example, mackerels with high fat content are given high priority in business transactions in Japanese fisheries. Fat content is commonly measured manually with specialized near-infrared spectroscopy equipment, which increases costs and reduces productivity. Ideally, the fat content would be estimated automatically with inexpensive equipment such as ordinary cameras. However, fat content estimation from fish images is a challenging task because differences in fat content appear only as slight differences in appearance. To tackle this problem, we propose using not only RGB images but also depth images, so that shape information can be exploited in addition to texture. To detect subtle differences in texture and shape, we propose a convolutional neural network that extracts and concatenates features from part images, such as the head, body, and tail of a mackerel image. Color-texture features extracted from the RGB images and three-dimensional shape features extracted from the depth images are combined to estimate the fat content. Experimental results show that the proposed method estimates fat content with a mean absolute error of 2.25 points.
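
The architecture outlined in the abstract (per-part feature extraction from RGB and depth crops, feature concatenation, and regression of a single fat-content value) could be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' published code: the framework (PyTorch), the module and function names, the channel sizes, the three-part head/body/tail split, and the L1 training loss are all assumptions made for the example.

# Minimal sketch (assumptions noted above), not the authors' implementation.
import torch
import torch.nn as nn


def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    """Small conv encoder stage: conv -> batch norm -> ReLU -> downsample."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )


class PartEncoder(nn.Module):
    """Extracts a fixed-length feature vector from one part crop (e.g. head)."""

    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            conv_block(in_channels, 32),
            conv_block(32, 64),
            conv_block(64, 128),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (B, 128, 1, 1)
        )
        self.fc = nn.Linear(128, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.backbone(x).flatten(1)
        return self.fc(h)


class FatContentRegressor(nn.Module):
    """Two-modality, part-based regressor: RGB (texture) + depth (shape)."""

    PARTS = ("head", "body", "tail")  # assumed three-way split of the fish image

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        # One encoder per (modality, part) pair.
        self.rgb_encoders = nn.ModuleList(PartEncoder(3, feat_dim) for _ in self.PARTS)
        self.depth_encoders = nn.ModuleList(PartEncoder(1, feat_dim) for _ in self.PARTS)
        self.head = nn.Sequential(
            nn.Linear(2 * len(self.PARTS) * feat_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, 1),  # scalar fat content (percentage points)
        )

    def forward(self, rgb_parts, depth_parts):
        # rgb_parts / depth_parts: lists of (B, C, H, W) crops for head, body, tail.
        feats = [enc(x) for enc, x in zip(self.rgb_encoders, rgb_parts)]
        feats += [enc(x) for enc, x in zip(self.depth_encoders, depth_parts)]
        return self.head(torch.cat(feats, dim=1)).squeeze(1)


if __name__ == "__main__":
    model = FatContentRegressor()
    rgb = [torch.randn(4, 3, 64, 64) for _ in range(3)]    # head, body, tail (RGB)
    depth = [torch.randn(4, 1, 64, 64) for _ in range(3)]  # head, body, tail (depth)
    pred = model(rgb, depth)
    loss = nn.L1Loss()(pred, torch.rand(4) * 30)  # L1 matches the reported MAE metric
    print(pred.shape, loss.item())

The concatenation of all six part features (three parts times two modalities) before the regression head mirrors the abstract's description of combining color-texture and three-dimensional shape features into a single estimate.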
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3134260