Improved Camshift Algorithm in AGV Vision-based Tracking with Edge Computing
Saved in:
Published in: | The Journal of Supercomputing 2022-02, Vol. 78 (2), p. 2709-2723 |
---|---|
Main authors: | , , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Summary: | Automated guided vehicles (AGVs) are Internet of Things robots that navigate automatically as guided by a central control platform with distributed intelligence. Different methodologies have been proposed for AGV visual tracking applications. However, vision-based tracking in AGVs usually confronts the problem of time delay caused by the complexity of image processing algorithms. To balance the trade-off among algorithm complexity, hardware cost and performance, precision and robustness are usually compromised in practical deployment. This paper proposes a prototype design of a visual tracking system. Edge computing is implemented, migrating computation-intensive image processing to a local computer. The Raspberry Pi-based AGV captures real-time images through its camera, sends the images to the computer and receives the processing results over a WiFi link. An improved Camshift algorithm is developed and implemented. Based on this algorithm, the AGV can make a convergent prediction of the pixels in the target area after the first detection of the object, so the relative coordinates of the target can be located more accurately in less time. As tested in the experiments, the system architecture and new algorithm lead to reduced hardware cost, less time delay, improved robustness and higher accuracy in tracking. |
---|---|
ISSN: | 0920-8542 1573-0484 |
DOI: | 10.1007/s11227-021-03974-3 |
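The record does not reproduce the authors' improved Camshift algorithm, but the mean-shift search that Camshift builds on can be sketched for orientation. The sketch below is a minimal, numpy-only illustration under stated assumptions: `prob` stands in for a histogram back-projection (per-pixel target probability), the window geometry `(x, y, w, h)` is a hypothetical convention, and Camshift's additional window-size/orientation adaptation and the paper's convergence improvements are omitted.

```python
import numpy as np

def mean_shift(prob, window, max_iter=20):
    """Shift a search window to the centroid of the back-projection
    probabilities it covers, repeating until the window stops moving.

    prob   -- 2D array of per-pixel target probabilities (back-projection)
    window -- (x, y, w, h) initial search window in pixel coordinates
    Returns the converged (x, y, w, h) window.
    """
    x, y, w, h = window
    for _ in range(max_iter):
        roi = prob[y:y + h, x:x + w]
        m00 = roi.sum()                      # zeroth moment (total mass)
        if m00 == 0:                          # no target mass in window
            break
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        cx = (xs * roi).sum() / m00           # centroid inside the window
        cy = (ys * roi).sum() / m00
        # Re-center the window on the centroid, clamped to the image.
        nx = int(round(x + cx - w / 2))
        ny = int(round(y + cy - h / 2))
        nx = min(max(nx, 0), prob.shape[1] - w)
        ny = min(max(ny, 0), prob.shape[0] - h)
        if nx == x and ny == y:               # converged: window stopped
            break
        x, y = nx, ny
    return (x, y, w, h)
```

In a full Camshift tracker this inner loop runs per frame, with the converged window's zeroth moment also used to rescale the window before the next frame; the paper's contribution, per the abstract, is making this per-frame prediction converge faster after the first detection.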