A gesture control system for intuitive 3D interaction with virtual objects
Author(s) -
Corey Manders,
Farzam Farbiz,
Ka Yin Tang,
Miaolong Yuan,
Gim Guan Chua
Publication year - 2009
Publication title -
Computer Animation and Virtual Worlds
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.225
H-Index - 49
eISSN - 1546-427X
pISSN - 1546-4261
DOI - 10.1002/cav.324
Subject(s) - computer science , gesture , computer vision , artificial intelligence , virtual reality , computer graphics (images) , human–computer interaction
We present a system for interacting with 3D objects in a 3D virtual environment. Exploiting the fact that a typical head‐mounted display (HMD) does not cover the user's entire face, we use a fiducial marker placed on the HMD to locate the user's exposed facial skin. From this region, a skin model is built and combined with the depth information obtained from a stereo camera. Used in tandem, these two cues allow the positions of the user's hands to be detected and tracked in real time. Once both hands are located, our system allows the user to manipulate the object with five degrees of freedom (translation along the x‐, y‐, and z‐axes plus roll and yaw rotations) in virtual three‐dimensional space using a series of intuitive hand gestures. Copyright © 2009 John Wiley & Sons, Ltd.
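The pipeline in the abstract — sample skin colour from the exposed face region, fit a skin model, then intersect the resulting skin mask with a depth range from the stereo camera to isolate the hands — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian colour model, the Mahalanobis threshold, and the `near`/`far` depth bounds are all assumptions, and the fiducial-marker face localisation is replaced here by a pre-cropped array of face pixels.

```python
import numpy as np

def build_skin_model(face_pixels):
    """Fit a single Gaussian to sampled face-skin colours.

    face_pixels: (N, 3) float array of colour samples taken from the
    exposed face region (located via the HMD fiducial in the paper).
    Returns the mean colour and the inverse covariance matrix.
    """
    mean = face_pixels.mean(axis=0)
    cov = np.cov(face_pixels, rowvar=False) + 1e-6 * np.eye(3)
    return mean, np.linalg.inv(cov)

def skin_mask(image, mean, inv_cov, thresh=9.0):
    """Per-pixel squared Mahalanobis distance to the skin colour model.

    Pixels whose distance falls below `thresh` (an assumed cutoff)
    are labelled skin-like.
    """
    diff = image.reshape(-1, 3) - mean
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return (d2 < thresh).reshape(image.shape[:2])

def detect_hands(image, depth, mean, inv_cov, near=0.3, far=1.0):
    """Combine the skin mask with a stereo depth gate.

    Only skin-coloured pixels lying within an assumed working
    volume (`near`..`far` metres) survive, which suppresses
    skin-toned background clutter.
    """
    skin = skin_mask(image, mean, inv_cov)
    in_range = (depth > near) & (depth < far)
    return skin & in_range
```

From the resulting binary mask, the two largest connected components would give the hand positions to track; their relative displacement frame-to-frame then drives the five-DOF object manipulation described in the abstract.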