A Food Package Recognition and Sorting System Based on Structured Light and Deep Learning
Saved in:
Main Authors: , , , ,
Format: Article
Language: eng
Subjects:
Online Access: Order full text
Summary: A vision-based robotic arm grasping system can be applied to a wide
range of scenarios. It uses algorithms to automatically identify the location
of a target and guide the robotic arm to grasp it, which makes it more flexible
than a teach-programmed grasping system. However, some food packages are
transparent or made of reflective materials, which challenges vision
algorithms, and traditional vision algorithms cannot recognize these packages
with high accuracy. In addition, during robotic arm grasping, the z-axis
(height) position still has to be set manually, which can introduce errors.
To address these two problems, we designed a food package sorting system based
on deep learning and structured-light 3D reconstruction. A pre-trained
Mask R-CNN model recognizes the class of each object in the image and provides
its 2D coordinates; structured-light 3D reconstruction then computes its 3D
coordinates; finally, after a coordinate-system conversion, the robotic arm is
guided to grasp the object. Tests show that the method can fully automate the
recognition and grasping of different kinds of food packages with high
accuracy, helping food manufacturers reduce production costs and improve
production efficiency.
DOI: 10.48550/arxiv.2309.03704
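The pipeline described in the abstract (a pre-trained Mask R-CNN for object class and 2D pixel coordinates, structured-light reconstruction for depth, and a coordinate-system conversion to the robot frame) could be sketched roughly as below. This is a minimal illustration, not the authors' code: the COCO-pretrained torchvision Mask R-CNN, the `get_depth` lookup standing in for the structured-light depth map, the example camera intrinsics, and the hand-eye matrix `T_base_cam` are all assumptions.

```python
# Sketch of the recognition-to-grasp pipeline from the abstract.
# Assumptions: torchvision's COCO-pretrained Mask R-CNN, a caller-supplied
# get_depth(u, v) standing in for the structured-light depth map, made-up
# camera intrinsics, and a placeholder hand-eye calibration matrix.
import numpy as np
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# 1) 2D recognition: pre-trained Mask R-CNN returns labels, scores and masks.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_packages(image_rgb, score_thresh=0.7):
    """Return (label, (u, v) pixel centroid) for each confident detection."""
    with torch.no_grad():
        pred = model([to_tensor(image_rgb)])[0]
    results = []
    for label, score, mask in zip(pred["labels"], pred["scores"], pred["masks"]):
        if score < score_thresh:
            continue
        m = mask[0].numpy() > 0.5            # binary instance mask
        ys, xs = np.nonzero(m)
        if len(xs) == 0:
            continue
        results.append((int(label), (xs.mean(), ys.mean())))
    return results

# 2) 3D coordinates: back-project the pixel with a depth value obtained from
# the structured-light reconstruction (get_depth is a stand-in for that step).
fx, fy, cx, cy = 915.0, 915.0, 640.0, 360.0  # example pinhole intrinsics

def pixel_to_camera_xyz(u, v, get_depth):
    z = get_depth(u, v)                      # depth in metres at pixel (u, v)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z, 1.0])          # homogeneous camera-frame point

# 3) Coordinate-system conversion: a hand-eye calibration matrix maps the
# camera frame to the robot base frame; the result is the grasp target.
T_base_cam = np.eye(4)                       # placeholder 4x4 extrinsic matrix

def camera_to_robot(p_cam_h):
    return (T_base_cam @ p_cam_h)[:3]        # (x, y, z) in the robot base frame
```

In the actual system, `get_depth` would come from the structured-light 3D reconstruction and `T_base_cam` from calibrating the camera against the robot base; the z value obtained this way replaces the manually set gripper height mentioned in the abstract.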