Real‐time Locomotion Controller using an Inverted‐Pendulum‐based Abstract Model

Bibliographic Details
Published in: Computer Graphics Forum, May 2018, Vol. 37 (2), pp. 287-296
Authors: Hwang, Jaepyung; Kim, Jongmin; Suh, Il Hong; Kwon, Taesoo
Format: Article
Language: English
Description
Abstract: In this paper, we propose a novel motion controller for the online generation of natural character locomotion that adapts to new situations such as changing user control or applying external forces. The controller continuously estimates the next footstep while walking and running, and automatically switches its stepping strategy in response to situational changes. To develop the controller, we devise a new physical model called the inverted-pendulum-based abstract model (IPAM). The proposed model represents high-dimensional character motions and inherits the naturalness of captured motions by estimating the appropriate footstep location, speed, and switching time at every frame. The estimation is performed by a deep-learning-based regressor that extracts important features from captured motions. To validate the proposed controller, we train the model on captured motions of a human stopping, walking, and running in a limited space. The motion controller then generates human-like locomotion in real time, with continuously varying speeds, transitions between walking and running, and collision-response strategies in a cluttered space.
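
For orientation only, the following is a minimal Python sketch of the kind of control loop the abstract describes: a learned regressor predicts the next footstep location, a desired speed, and a switching time at each query, and a simplified (here, 1D linear) inverted pendulum is integrated between switches. The names pendulum_step, DummyRegressor, and control_loop, as well as all numeric values, are illustrative assumptions and not the paper's IPAM implementation or its deep-learning regressor.

GRAVITY = 9.81  # m/s^2

def pendulum_step(com_pos, com_vel, foot_pos, height=1.0, dt=1.0 / 30.0):
    """Advance a 1D linear inverted pendulum by one frame.

    The centre of mass accelerates away from the stance foot (the
    pendulum pivot) with acceleration (g / h) * (com - foot).
    """
    com_acc = (GRAVITY / height) * (com_pos - foot_pos)
    com_vel += com_acc * dt
    com_pos += com_vel * dt
    return com_pos, com_vel

class DummyRegressor:
    """Hypothetical stand-in for the learned regressor: always proposes
    a step 0.3 m ahead of the CoM, a 1.2 m/s target speed, and a 0.4 s
    switching time (illustrative values only)."""
    def predict(self, com_pos, com_vel, foot_pos):
        return com_pos + 0.3, 1.2, 0.4

def control_loop(regressor, com_pos=0.0, com_vel=1.0, foot_pos=-0.1,
                 dt=1.0 / 30.0, n_steps=4):
    """Per-step loop: query the regressor, integrate the pendulum until
    the predicted switching time, then replant the stance foot at the
    predicted location and repeat."""
    trajectory = []
    for _ in range(n_steps):
        next_foot, _speed, t_switch = regressor.predict(com_pos, com_vel, foot_pos)
        t = 0.0
        while t < t_switch:
            com_pos, com_vel = pendulum_step(com_pos, com_vel, foot_pos, dt=dt)
            trajectory.append((com_pos, com_vel, foot_pos))
            t += dt
        foot_pos = next_foot  # switch the stance foot at the predicted time
    return trajectory

if __name__ == "__main__":
    samples = control_loop(DummyRegressor())
    print(f"simulated {len(samples)} frames, final CoM x = {samples[-1][0]:.2f} m")

In the paper, the per-frame predictions additionally drive full-body motion synthesis so that the generated locomotion inherits the style of the captured data; the sketch above only illustrates the reduced pendulum abstraction.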
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.13361