Vision-Based Mobile Manipulator for Handling and Transportation of Supermarket Products


Bibliographic Details
Published in: Mathematical Problems in Engineering, 2022-06, Vol. 2022, pp. 1-10
Main Authors: Zia Ur Rahman, Muhammad; Usman, Muhammad; Farea, Adhban; Ahmad, Nasir; Mahmood, Imran; Imran, Muhammad
Format: Article
Language: English
Online Access: Full text
Description
Summary: Robot manipulators are increasingly employed in the retail market, mostly for warehousing, but automating in-store logistics processes remains a difficult task. Supermarkets and large retail stores face many challenges: shortages, handling, and placement of individual products on shelves. Various issues need to be considered to develop a robot that can manipulate products of different sizes, shapes, and weights in the limited space on shelves. The aim of this article is to design and develop a system that addresses the issues of shortage, identification, moving, and placement of products in supermarkets by properly integrating a database, camera vision, and a line-following mobile manipulator. A four-wheeled differential-drive mobile robot carrying a 5-DOF robotic manipulator was designed and developed. A line-following technique is used to move it around the warehouse. Barcode recognition is employed for the localization of product sections, and object detection is performed using SIFT. The usefulness of the method was demonstrated by carrying out experiments in a relevant environment that imitates a real supermarket.
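The abstract names SIFT feature matching as the product detection technique. The paper's own implementation is not reproduced here, but a minimal OpenCV sketch of SIFT-based matching between a stored product template and a camera frame illustrates the general approach (file names and the ratio-test threshold are assumptions, not taken from the article):

import cv2

def detect_product(template_path, frame_path, min_good_matches=10):
    # Load a reference image of the product and the current shelf frame
    template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)

    # Extract SIFT keypoints and descriptors from both images
    sift = cv2.SIFT_create()
    _, des_template = sift.detectAndCompute(template, None)
    _, des_frame = sift.detectAndCompute(frame, None)

    # Brute-force matching with Lowe's ratio test to keep distinctive matches
    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des_template, des_frame, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    # Declare a detection if enough good matches survive the ratio test
    return len(good) >= min_good_matches, len(good)

if __name__ == "__main__":
    # Hypothetical input images for illustration only
    detected, n_good = detect_product("product_template.jpg", "shelf_frame.jpg")
    print(f"Product detected: {detected} ({n_good} good matches)")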
ISSN: 1024-123X, 1563-5147
DOI: 10.1155/2022/3883845