Training of deep neural networks for the generation of dynamic movement primitives


Bibliographic Details
Published in: Neural Networks, July 2020, Vol. 127, pp. 121-131
Authors: Pahič, Rok; Ridge, Barry; Gams, Andrej; Morimoto, Jun; Ude, Aleš
Format: Article
Language: English
Online access: Full text
Description
Abstract: Dynamic movement primitives (DMPs) have proven to be an effective movement representation for motor skill learning. In this paper, we propose a new approach for training deep neural networks to synthesize dynamic movement primitives. The distinguishing property of our approach is that it can utilize a novel loss function that measures the physical distance between movement trajectories, as opposed to measuring the distance between the parameters of DMPs, which have no physical meaning. This was made possible by deriving differential equations that can be applied to compute the gradients of the proposed loss function, thus enabling an effective application of backpropagation to optimize the parameters of the underlying deep neural network. While the developed approach is applicable to any neural network architecture, it was evaluated on two different architectures based on encoder-decoder networks and convolutional neural networks. Our results show that the minimization of the proposed loss function leads to better results than when more conventional loss functions are used.
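The distinction the abstract draws (trajectory-space loss versus parameter-space loss) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it assumes a single degree of freedom, plain Euler integration, standard DMP gains (alpha_z = 25, beta_z = alpha_z/4), and an ad-hoc choice of Gaussian basis widths; the function names are illustrative only.

```python
import numpy as np

def integrate_dmp(w, y0=0.0, g=1.0, tau=1.0, dt=0.005,
                  alpha_z=25.0, beta_z=6.25, alpha_x=2.0):
    """Euler-integrate a single-DoF DMP and return its position trajectory.

    w : weights of the Gaussian basis functions forming the forcing term.
    """
    n_basis = len(w)
    # Basis centers spaced evenly in time, mapped through the canonical system
    c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
    h = n_basis / c  # simple width heuristic (an assumption of this sketch)
    n_steps = int(round(tau / dt))
    x, y, z = 1.0, y0, 0.0  # canonical phase, position, scaled velocity
    traj = np.empty(n_steps)
    for t in range(n_steps):
        psi = np.exp(-h * (x - c) ** 2)
        f = (psi @ w) * x / (psi.sum() + 1e-10)       # forcing term
        dz = (alpha_z * (beta_z * (g - y) - z) + f) / tau
        dy = z / tau
        dx = -alpha_x * x / tau                        # canonical system
        z += dz * dt; y += dy * dt; x += dx * dt
        traj[t] = y
    return traj

def trajectory_loss(w_pred, w_true):
    """Physical-space loss: MSE between the two integrated trajectories."""
    return np.mean((integrate_dmp(w_pred) - integrate_dmp(w_true)) ** 2)

def parameter_loss(w_pred, w_true):
    """Conventional loss: MSE directly over the DMP weight vectors."""
    return np.mean((w_pred - w_true) ** 2)
```

The paper's contribution is to differentiate the trajectory-space loss analytically through the DMP integration, so that backpropagation can train the network that outputs the DMP parameters; the sketch above only shows the forward computation that such a loss would compare.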
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2020.04.010