On‐line motion blending for real‐time locomotion generation
Author(s) - Park Sang Il, Shin Hyun Joon, Kim Tae Hoon, Shin Sung Yong
Publication year - 2004
Publication title - Computer Animation and Virtual Worlds
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.225
H-Index - 49
eISSN - 1546-427X
pISSN - 1546-4261
DOI - 10.1002/cav.15
Subject(s) - retargeting, computer science, computer graphics (images), computer vision, artificial intelligence, motion (physics), graph, node, transition, frame
In this paper, we present an integrated framework of on‐line motion blending for locomotion generation. We first provide a novel scheme for incremental timewarping, which guarantees that warped time always moves forward. Combining the idea of motion blending with that of posture rearrangement, we introduce a motion transition graph to address on‐line motion blending and transition simultaneously. Guided by a stream of motion specifications, our motion synthesis scheme moves from node to node in an on‐line manner, blending a motion at each node and generating a transition motion at each edge. For smooth on‐line motion transition, we also attach a set of example transition motions to each edge. To represent similar postures consistently, we exploit the inter‐frame coherency embedded in the input motion specification. Finally, we provide a comprehensive solution to on‐line motion retargeting by integrating existing techniques. Copyright © 2004 John Wiley & Sons, Ltd.
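To make the abstract's two central ideas concrete, the following is a minimal, self-contained Python sketch, not the paper's implementation: an incremental timewarp whose clamped increment guarantees that warped time never runs backward, and a toy motion transition graph traversed on-line while blending example clips at each node. All class names, function names, and the scalar inverse-distance blend weights are hypothetical simplifications; the actual system blends full-body postures and attaches example transition motions to the graph's edges.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ExampleClip:
    """An example locomotion clip, reduced to a label and the speed it was captured at."""
    name: str
    speed: float

@dataclass
class Node:
    """A motion transition graph node holding example motions to blend on-line."""
    name: str
    examples: List[ExampleClip]
    edges: Dict[str, "Node"] = field(default_factory=dict)

def incremental_timewarp(t_prev: float, dt: float, rate: float, eps: float = 1e-4) -> float:
    """Advance the warped clock incrementally.

    Clamping the increment to a small positive epsilon is what guarantees
    that warped time only moves forward, even if the control stream
    momentarily requests a zero or negative playback rate.
    """
    return t_prev + max(rate * dt, eps)

def blend_weights(params: List[float], target: float) -> List[float]:
    """Toy 1-D inverse-distance blend weights over the examples' speed parameters."""
    d = [abs(p - target) for p in params]
    if min(d) < 1e-8:  # target coincides with one example
        return [1.0 if di < 1e-8 else 0.0 for di in d]
    w = [1.0 / di for di in d]
    s = sum(w)
    return [wi / s for wi in w]

# A tiny two-node graph: walk <-> run.
walk = Node("walk", [ExampleClip("walk_slow", 1.0), ExampleClip("walk_fast", 2.0)])
run = Node("run", [ExampleClip("run_slow", 3.0), ExampleClip("run_fast", 5.0)])
walk.edges["run"] = run
run.edges["walk"] = walk

# Drive the graph with a stream of (node, desired speed) specifications,
# one per frame: the on-line analogue of a motion specification stream.
stream = [("walk", 1.2), ("walk", 1.8), ("run", 3.5), ("run", 4.5)]
node, t = walk, 0.0
for target_name, speed in stream:
    if target_name != node.name:           # follow an edge: play a transition motion
        node = node.edges[target_name]
        print(f"transition -> {node.name}")
    t = incremental_timewarp(t, dt=1.0 / 30.0, rate=speed)
    w = blend_weights([c.speed for c in node.examples], speed)
    print(f"{node.name}: warped t = {t:.3f}, blend weights = {[round(x, 2) for x in w]}")

In a full system each node would blend joint trajectories (for example, with per-frame quaternion blending) rather than print weights, and each edge would carry its attached example transition motions; this sketch only illustrates the control flow and the forward-time guarantee.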