Open Access
Data-driven biped control
Author(s) - Yoonsang Lee, Sungeun Kim, Jehee Lee
Publication year - 2010
Publication title - ACM Transactions on Graphics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.153
H-Index - 218
eISSN - 1557-7368
pISSN - 0730-0301
DOI - 10.1145/1833351.1781155
Subject(s) - computer science , trajectory , controller , control theory , motion capture , animation , control engineering , motion (physics) , artificial intelligence , computer graphics (images) , physics , engineering
We present a dynamic controller to physically simulate underactuated three-dimensional full-body biped locomotion. Our data-driven controller takes motion capture reference data to reproduce realistic human locomotion through real-time physically based simulation. The key idea is modulating the reference trajectory continuously and seamlessly such that even a simple dynamic tracking controller can follow the reference trajectory while maintaining balance. In our framework, biped control can be facilitated by a large array of existing data-driven animation techniques because our controller can take a stream of reference data generated on the fly at runtime. We demonstrate the effectiveness of our approach through examples that allow bipeds to turn, spin, and walk while steering their direction interactively.

CR Categories: I.3.7 [Three-Dimensional Graphics and Realism]: Animation
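The abstract outlines a two-part control loop: a simple joint-space tracking controller (such as a PD servo) follows a streamed motion capture reference, while the reference itself is continuously modulated so that tracking it keeps the biped balanced. The sketch below illustrates only that structure; the gains, the center-of-mass feedback rule, the joint indices, and every function name are assumptions made for illustration, not the formulation published in the paper.

```python
# Minimal, hypothetical sketch of reference-modulated PD tracking.
# The feedback rule and all gains are illustrative assumptions,
# not the controller formulation used in the paper.
import numpy as np

KP, KD = 300.0, 30.0   # PD gains for joint tracking (assumed values)
C_D, C_V = 0.2, 0.2    # balance feedback gains on COM offset/velocity (assumed)

def modulate_reference(q_ref, com_offset, com_velocity, balance_joints):
    """Continuously adjust the incoming reference pose so that tracking it
    keeps the character balanced (simplified COM-based feedback)."""
    q_mod = q_ref.copy()
    correction = C_D * com_offset + C_V * com_velocity
    q_mod[balance_joints] -= correction   # shift selected joints against COM drift
    return q_mod

def tracking_torques(q, dq, q_ref, dq_ref):
    """Simple joint-space PD servo that tracks the (modulated) reference."""
    return KP * (q_ref - q) + KD * (dq_ref - dq)

def control_step(q, dq, q_ref, dq_ref, com_offset, com_velocity, balance_joints):
    """One control step: modulate the streamed reference frame, then track it."""
    q_target = modulate_reference(q_ref, com_offset, com_velocity, balance_joints)
    return tracking_torques(q, dq, q_target, dq_ref)

if __name__ == "__main__":
    n_joints = 18
    q = np.zeros(n_joints)
    dq = np.zeros(n_joints)
    q_ref = 0.1 * np.ones(n_joints)   # one frame of a streamed mocap reference
    dq_ref = np.zeros(n_joints)
    tau = control_step(q, dq, q_ref, dq_ref,
                       com_offset=0.05, com_velocity=0.1,
                       balance_joints=[2, 3])   # e.g. hip joint indices (assumed)
    print(tau[:4])
```

Because the reference arrives as a stream, the modulation is applied to each incoming frame before torques are computed, which is what would let such a loop accept frames generated on the fly by other data-driven animation techniques.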
