State of the Art in Example‐Based Motion Synthesis for Virtual Characters in Interactive Applications
Author(s) - Pejsa T., Pandzic I.S.
Publication year - 2010
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/j.1467-8659.2009.01591.x
Subject(s) - animation, computer science, motion (physics), character animation, computer animation, human–computer interaction, computer facial animation, feature (linguistics), state (computer science), skeletal animation, artificial intelligence, programming language, linguistics, philosophy
Animated virtual human characters are a common feature of interactive graphical applications such as computer and video games, online virtual worlds and simulations. Due to the dynamic nature of such applications, character animation must be responsive and controllable in addition to looking as realistic and natural as possible. Although procedural and physics‐based animation offer a great amount of control over motion, the results still look too unnatural to be usable in all but a few specific scenarios, which is why interactive applications today still rely mainly on recorded and hand‐crafted motion clips. The challenge faced by animation system designers is to dynamically synthesize new, controllable motion, either by concatenating short motion segments into sequences of different actions or by parametrically blending clips that correspond to different variants of the same logical action. In this article, we provide an overview of research in the field of example‐based motion synthesis for interactive applications. We present methods for the automated construction of supporting data structures for motion synthesis and describe how these structures can be employed at run‐time to generate motion that accurately accomplishes tasks specified by the AI or a human user.
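
The parametric blending mentioned in the abstract can be illustrated with a minimal sketch: interpolating the per‐joint rotations of two time‐aligned variants of the same logical action with a single blend weight. This is not the survey's own implementation; the clip layout (frames x joints x unit quaternions) and names such as blend_clips are assumptions made purely for illustration.

# Minimal sketch (not from the paper) of blending two variants of the same
# logical action. Assumes each clip stores per-frame joint rotations as unit
# quaternions (w, x, y, z) and both clips share the same frame count.
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                  # Take the shorter arc.
        q1, dot = -q1, -dot
    if dot > 0.9995:               # Nearly parallel: fall back to lerp + renormalize.
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * q0 + s1 * q1

def blend_clips(clip_a, clip_b, weight):
    """Blend two time-aligned clips, each shaped (frames, joints, 4)."""
    frames, joints, _ = clip_a.shape
    out = np.empty_like(clip_a)
    for f in range(frames):
        for j in range(joints):
            out[f, j] = slerp(clip_a[f, j], clip_b[f, j], weight)
    return out

if __name__ == "__main__":
    # Two synthetic 30-frame, 20-joint clips standing in for recorded variants.
    rng = np.random.default_rng(0)
    a = rng.normal(size=(30, 20, 4)); a /= np.linalg.norm(a, axis=-1, keepdims=True)
    b = rng.normal(size=(30, 20, 4)); b /= np.linalg.norm(b, axis=-1, keepdims=True)
    blended = blend_clips(a, b, weight=0.5)
    print(blended.shape)  # (30, 20, 4)

Spherical interpolation rather than straight linear averaging keeps the blended rotations on the unit‐quaternion sphere; in a real parametric blending system the weight would be derived from task parameters (for example a reach distance or turning angle) rather than fixed by hand.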
