An Embedded Convolutional Neural Network for Maze Classification and Navigation


Bibliographic Details
Published in: Jurnal nasional teknik elektro 2023-07
Main Authors: Dewantoro, Gunawan; Hadiyanto, Dinar Rahmat; Febrianto, Andreas Ardian
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Traditionally, maze-solving robots employ ultrasonic sensors to detect the maze walls around the robot, allowing it to traverse the maze using omnidirectionally measured depth. However, this approach only perceives the presence of objects without recognizing their type. Computer vision has therefore become more popular for classification purposes in robotic applications. In this study, a maze-solving robot is equipped with a camera to recognize the types of obstacles in a maze, classified as: intersection, dead end, T-junction, finish zone, start zone, straight path, left turn, and right turn. A convolutional neural network, consisting of four convolution layers, three pooling layers, and three fully connected layers, is trained on a total of 24,000 images to recognize these obstacles. The trained model is deployed on a Jetson Nano development kit to navigate the robot. The results show an average training accuracy of 82% with a training time of 30 minutes 15 seconds. In testing, the lowest class accuracy is 90%, for the T-junction, with a computational time of 500 milliseconds per frame. The convolutional neural network is therefore adequate to serve as a classifier and to navigate a maze-solving robot.
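The architecture summarized above (four convolution layers, three pooling layers, three fully connected layers, one output per obstacle class) can be sketched in PyTorch. Only the layer counts and the eight distinct obstacle classes come from the abstract; the channel widths, kernel sizes, and 64×64 input resolution below are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class MazeCNN(nn.Module):
    """Sketch of the abstract's CNN: 4 conv, 3 pool, 3 fully connected layers."""

    def __init__(self, num_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),   # conv 1
            nn.MaxPool2d(2),                                          # pool 1
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),  # conv 2
            nn.MaxPool2d(2),                                          # pool 2
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),  # conv 3
            nn.MaxPool2d(2),                                          # pool 3
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),  # conv 4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),  # fc 1 (assumes 64x64 input)
            nn.Linear(128, 64), nn.ReLU(),          # fc 2
            nn.Linear(64, num_classes),             # fc 3 -> class logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = MazeCNN()
logits = model(torch.zeros(1, 3, 64, 64))  # one dummy 64x64 RGB frame
```

At inference time, `logits.argmax(dim=1)` would pick the predicted obstacle class for each frame; the abstract reports this step taking about 500 ms per frame on the Jetson Nano.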
ISSN:2302-2949
2407-7267
DOI:10.25077/jnte.v12n2.1091.2023