Identifying Capsule Defect Based on an Improved Convolutional Neural Network
Published in: Shock and Vibration, Vol. 2020 (2020), p. 1-9
Main authors: , , ,
Format: Article
Language: eng
Keywords:
Online access: Full text
Abstract: Capsules are commonly used as containers for most pharmaceuticals, and capsule quality is closely related to human health. Motivated by the practical demands of capsule production, this study proposes a capsule defect detection and recognition method based on an improved convolutional neural network (CNN) algorithm. The algorithm is used for defect detection and classification in capsule production. Defective and qualified capsule images from actual production are collected as samples. A deep learning model based on the improved CNN is then designed to train and test on the capsule image dataset and identify defective capsules. The improved CNN, based on regularization and the Adam optimizer (RACNN), adds a dropout layer and L2 regularization between the fully connected layer and the output layer to mitigate overfitting. The Adam optimizer is introduced to accelerate model training and improve convergence, and cross-entropy is used as the loss function to measure the prediction performance of the model. By comparing RACNN results under different parameter settings, a detection method based on the optimal parameters of the RACNN model is finally selected. Results show a 97.56% recognition accuracy for the proposed method. Hence, this method could be used for the automatic identification and classification of defective capsules.
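The record does not publish the exact network configuration, but the abstract names the key ingredients: a dropout layer and L2 regularization between the fully connected and output layers, the Adam optimizer, and a cross-entropy loss. The following is a minimal Keras sketch of that idea; the layer sizes, input resolution, and two-class labeling (qualified vs. defective) are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a RACNN-style classifier. Assumes 128x128 grayscale
# capsule images and two classes (qualified / defective); all layer sizes
# are placeholders, since the record does not give the paper's configuration.
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

def build_racnn_sketch(input_shape=(128, 128, 1), num_classes=2):
    model = models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(2),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Flatten(),
        # Fully connected layer with L2 regularization on its weights.
        layers.Dense(128, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-3)),
        # Dropout between the fully connected and output layers to curb overfitting.
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    # Adam optimizer and cross-entropy loss, as described in the abstract.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Example usage with a hypothetical training set of labeled capsule images:
# model = build_racnn_sketch()
# model.fit(train_images, train_labels, epochs=20, validation_split=0.1)
```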
ISSN: 1070-9622, 1875-9203
DOI: 10.1155/2020/8887723