Non-destructive Assessment of White Striping in Broiler Breast Meat Using Structured-Illumination Reflectance Imaging with Deep Learning


Bibliographic Details
Published in: Journal of the ASABE 2023, Vol. 66 (6), p. 1437-1447
Main authors: Olaniyi, Ebenezer; Lu, Yuzhen; Sukumaran, Anuraj Theradiyil; Jarvis, Tessa; Rowe, Clinton
Format: Article
Language: English
Online access: Full text
Description
Summary:

Highlights
- Broiler breast meat with white striping (WS) was imaged under sinusoidally modulated structured illumination.
- Amplitude component (AC) images resolved WS characteristics better than direct component (DC) images.
- Models built on deep features classified meat samples into 2 and 3 classes according to WS severity.
- The AC images consistently outperformed the DC images and achieved the best accuracy of 96.6% in 2-class modeling.

Abstract. Visual inspection is the prevailing practice in the industry for assessing white striping (WS) in broiler breast meat. However, this approach is subjective, laborious, and prone to error. Several studies have utilized imaging technology under uniform illumination; however, detecting defects such as WS remains challenging. This study investigated the efficacy of the emerging structured illumination reflectance imaging (SIRI) technique combined with deep learning (DL) for WS assessment of broiler meat. Broiler fillets with varying degrees of WS were imaged using a custom-assembled monochromatic SIRI system (0.05-0.40 cycles/mm). The acquired SIRI pattern images were demodulated into direct component (DC) and amplitude component (AC) images at each spatial frequency. Pre-trained DL models, including two Very Deep Convolutional Networks (VGG16 and VGG19) and two Densely Connected Convolutional Neural Networks (DenseNet121 and DenseNet201), were evaluated separately as fine-tuned end-to-end classifiers and as feature extractors to differentiate the meat samples. Fine-tuned VGG16 achieved the best 2-class and 3-class classification accuracies of 94.5% and 74.6%, respectively, based on the AC images at 0.30 cycles/mm, improving over the accuracies of the corresponding DC images by 10.4% and 8.9%, respectively. Fine-tuned VGG19 and DenseNet201 also showed substantial improvements of AC over DC images, by 12% or more. Linear discriminant analysis, in conjunction with principal component analysis for dimension reduction, yielded better accuracies of 96.6% in 2-class and 83.4% in 3-class classification, based on the deep features extracted by DenseNet121 from the AC images at 0.40 cycles/mm and 0.30 cycles/mm, respectively, representing improvements of 5.0% and 2.9% over the features of the corresponding DC images. The SIRI technique combined with DL is effective for differentiating between normal and WS-affected broiler meat.

Keywords: Deep learning, Imaging, Meat quality, Poultry, Structured illumination, White striping.
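The demodulation step summarized above is commonly implemented with three phase-shifted sinusoidal pattern images. The sketch below shows that standard three-phase computation, assuming a 2π/3 phase shift between acquisitions; the function name and NumPy-based implementation are illustrative and not the authors' code.

```python
import numpy as np

def demodulate_siri(i1, i2, i3):
    """Three-phase SIRI demodulation (assumed 2*pi/3 phase shift).

    Returns the direct component (DC), equivalent to a uniform-illumination
    image, and the amplitude component (AC), which accentuates surface
    features such as white striping.
    """
    i1, i2, i3 = (np.asarray(x, dtype=np.float64) for x in (i1, i2, i3))
    dc = (i1 + i2 + i3) / 3.0
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2
    )
    return dc, ac
```

The feature-extraction route described in the abstract (deep features from a pre-trained network, reduced by PCA and classified with linear discriminant analysis) could look roughly like the following sketch. The input image size, the number of PCA components, and the use of ImageNet weights are assumptions for illustration; the paper's exact pipeline and hyperparameters are not reproduced here.

```python
import numpy as np
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.applications.densenet import preprocess_input
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# Pre-trained DenseNet121 without its classification head; global average
# pooling yields one 1024-dimensional feature vector per image.
backbone = DenseNet121(weights="imagenet", include_top=False, pooling="avg")

def extract_features(images):
    """images: float array of shape (n, 224, 224, 3), pixel values 0-255."""
    return backbone.predict(preprocess_input(images.astype("float32")))

# Hypothetical usage with AC images and WS severity labels (not the paper's data):
# X_train = extract_features(ac_images_train)
# clf = make_pipeline(PCA(n_components=50), LinearDiscriminantAnalysis())
# clf.fit(X_train, ws_labels_train)
# predictions = clf.predict(extract_features(ac_images_test))
```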
ISSN: 2769-3287
DOI: 10.13031/ja.15667