Recalling of multiple grasping methods from an object image with a convolutional neural network

Bibliographic details
Published in: ROBOMECH Journal, 2021-07, Vol. 8, No. 1, pp. 1-13, Article 19
Authors: Sanada, Makoto; Matsuo, Tadashi; Shimada, Nobutaka; Shirai, Yoshiaki
Format: Article
Language: English
Abstract: In this study, a method for a robot to recall multiple grasping methods for a given object is proposed. The aim is for robots to learn grasping methods for new objects by observing human grasping activities in daily life, without special instruction. In this setting, only one grasping motion is observed for an object at a time, and it is never known whether other grasping methods are possible for that object, although supervised learning generally requires all possible answers for each training input. The proposed method addresses this learning situation by employing a convolutional neural network with automatic clustering of the observed grasping methods: the grasping methods are clustered while the grasping positions are being learned. The method first recalls grasping positions, with the network estimating a multi-channel heatmap in which each channel indicates one grasping position, and then checks the graspability of each estimated position. Finally, it recalls the hand shapes based on the estimated grasping position and the object's shape. This paper describes the results of recalling multiple grasping methods and demonstrates the effectiveness of the proposed method.
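
As a rough illustration of the pipeline sketched in the abstract, the snippet below shows a small encoder-decoder CNN that maps an object image to a multi-channel heatmap, with each channel read as one candidate grasping position, followed by a simple per-channel graspability check that thresholds the channel's peak. This is a minimal sketch under assumptions, not the authors' implementation: the layer sizes, the number of grasp channels, and the threshold are invented for the example, and the hand-shape recall stage is omitted.

```python
# Illustrative sketch only: architecture, channel count, and threshold
# are assumptions, not taken from the paper.
import torch
import torch.nn as nn

N_GRASP_CHANNELS = 4  # assumed number of grasping-method clusters


class GraspHeatmapNet(nn.Module):
    """Encoder-decoder CNN: object image -> multi-channel grasp heatmap."""

    def __init__(self, n_channels=N_GRASP_CHANNELS):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, n_channels, 4, stride=2, padding=1),
            nn.Sigmoid(),  # heatmap values in [0, 1]
        )

    def forward(self, image):
        return self.decoder(self.encoder(image))


def graspable_positions(heatmaps, threshold=0.5):
    """Return (channel, y, x, score) for channels whose heatmap peak
    exceeds the assumed graspability threshold."""
    results = []
    for c in range(heatmaps.shape[0]):
        score = heatmaps[c].max()
        if score.item() >= threshold:
            flat_idx = heatmaps[c].argmax().item()
            y, x = divmod(flat_idx, heatmaps.shape[2])
            results.append((c, y, x, score.item()))
    return results


if __name__ == "__main__":
    net = GraspHeatmapNet()
    image = torch.rand(1, 3, 64, 64)   # dummy object image
    heatmaps = net(image)[0]           # (channels, H, W)
    for c, y, x, s in graspable_positions(heatmaps):
        print(f"channel {c}: grasp position ({y}, {x}), score {s:.2f}")
```

In this reading, each output channel corresponds to one automatically clustered grasping method, so an object for which several channels pass the check would yield several recalled grasping positions; how the clustering and the graspability check are actually realized is described in the paper itself.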
ISSN: 2197-4225
DOI: 10.1186/s40648-021-00206-4