Open Access
Presentation of Human Action Information via Avatar: From the Viewpoint of Avatar-Based Communication
Author(s) - Daisaku Arita, Rin-ichiro Taniguchi
Publication year - 2005
Publication title - Lecture Notes in Computer Science
Language(s) - English
Resource type - Book series
SCImago Journal Rank - 0.249
H-Index - 400
eISSN - 1611-3349
pISSN - 0302-9743
ISBN - 3-540-28896-1
DOI - 10.1007/11553939_125
Subject(s) - avatar, computer science, human–computer interaction, human motion, virtual actor, artificial intelligence, virtual reality
This paper describes techniques for presenting human action information in an avatar-based interaction system, using real-time motion sensing and human action symbolization. Avatar-based interaction systems with computer-generated virtual environments have difficulty acquiring enough information about the user to represent him or her as if he or she were present in the environment. This difficulty stems mainly from the high degrees of freedom of the human body, and it results in a lack of reality. Since it is almost impossible to acquire all the detailed information about human actions or activities, we instead recognize, or estimate, what kind of action has occurred from sensed human motion information and other available information, and then re-generate detailed, natural actions from the estimated results. In this paper, we describe our approach, Real-time Human Proxy, focusing on the representation of human actions. We also present experimental results.
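The recognize-then-regenerate pipeline described in the abstract can be sketched in code. This is a minimal, hypothetical illustration, not the authors' actual system: all names (`Pose`, `recognize_action`, `regenerate_motion`) and the toy rule on head height are assumptions; a real system would classify richer sensed features and play back captured motion clips.

```python
# Hypothetical sketch of the Real-time Human Proxy idea from the abstract:
# coarse motion sensing -> action symbolization -> re-generation of
# detailed, natural avatar motion. All names and thresholds are invented
# for illustration only.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Pose:
    """A coarse sensed pose: a few joint positions (x, y, z) per frame."""
    joints: List[Tuple[float, float, float]]


def recognize_action(sensed: List[Pose]) -> str:
    """Estimate an action symbol from coarse sensed motion.

    Toy rule: large vertical head movement means sitting down or
    standing up; a trained classifier would replace this in practice.
    """
    head_heights = [p.joints[0][2] for p in sensed]
    if max(head_heights) - min(head_heights) > 0.3:
        return "sit_down" if head_heights[-1] < head_heights[0] else "stand_up"
    return "idle"


def regenerate_motion(symbol: str, n_frames: int = 30) -> List[Pose]:
    """Re-generate detailed avatar motion from the estimated symbol,
    e.g. by synthesizing or playing back a stored natural motion clip
    (here: a simple vertical interpolation stands in for the clip)."""
    target_offset = {"idle": 0.0, "stand_up": 0.5, "sit_down": -0.5}[symbol]
    return [Pose(joints=[(0.0, 0.0, 1.0 + target_offset * t / n_frames)])
            for t in range(n_frames)]


# Pipeline: the avatar acts out the estimated symbol rather than
# replaying the incomplete sensed data directly.
sensed = [Pose(joints=[(0.0, 0.0, 1.0 - 0.02 * t)]) for t in range(30)]
symbol = recognize_action(sensed)
avatar_motion = regenerate_motion(symbol)
```

The key design point the sketch mirrors is that the avatar's displayed motion comes from the estimated action symbol, not from the raw (and necessarily incomplete) sensor data, which is what lets the system present detailed, natural actions.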
