Primitive Object Grasping for Finger Motion Synthesis
Author(s) - Hwang JaePyung, Park Gangrae, Suh Il Hong, Kwon Taesoo
Publication year - 2021
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.14187
Subject(s) - grasp, computer science, computer vision, motion (physics), object (grammar), artificial intelligence, orientation (vector space), position (finance), geometry, mathematics, finance, economics, programming language
We developed a new framework to generate hand and finger grasping motions. The proposed framework adapts online to the position and orientation of objects and can generate grasping motions even when the object shape differs from the one used during motion capture. This is achieved with a mesh model, which we call primitive object grasping (POG), that represents the object grasping motion. The POG model uses a mesh deformation algorithm that preserves the original shape of the mesh while adapting to varying constraints. These characteristics are beneficial for finger grasping motion synthesis that satisfies both the constraints for mimicking the motion capture sequence and the grasping points reflecting the shape of the object. We verify the adaptability of the proposed motion synthesizer to variations in the position, orientation, and shape of different objects by using motion capture sequences for grasping primitive objects, namely a sphere, a cylinder, and a box. In addition, a different grasp strategy, a three-finger grasp, is synthesized to validate the generality of the POG-based synthesis framework.
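The abstract describes a deformation scheme that preserves the mesh's original shape while meeting positional constraints at grasping points, but it does not specify the solver. The sketch below is only an illustration of that general idea, assuming a standard Laplacian least-squares deformation with soft positional constraints; the function and data names are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: a generic Laplacian least-squares mesh deformation
# with soft positional constraints, used here as a stand-in for the paper's
# unspecified solver. All names and data are hypothetical.
import numpy as np

def laplacian_deform(verts, edges, constrained_ids, targets, weight=10.0):
    """Move constrained vertices toward target positions while keeping the
    uniform Laplacian (local shape) of the rest of the mesh close to the
    rest pose, in a least-squares sense."""
    n = len(verts)
    L = np.zeros((n, n))
    for i, j in edges:                 # uniform (graph) Laplacian
        L[i, i] += 1.0
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    delta = L @ verts                  # differential coordinates of rest shape

    # Soft positional constraints appended as extra least-squares rows.
    C = np.zeros((len(constrained_ids), n))
    for row, vid in enumerate(constrained_ids):
        C[row, vid] = weight
    A = np.vstack([L, C])
    b = np.vstack([delta, weight * np.asarray(targets, dtype=float)])

    new_verts, *_ = np.linalg.lstsq(A, b, rcond=None)
    return new_verts

# Tiny usage example: a 4-vertex strip; pin the first vertex, lift the last.
verts = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]], dtype=float)
edges = [(0, 1), (1, 2), (2, 3)]
deformed = laplacian_deform(verts, edges, [0, 3], [[0, 0, 0], [3, 1, 0]])
print(deformed.round(2))
```

In such a formulation, the Laplacian rows encourage the deformed mesh to retain its local geometry while the weighted constraint rows pull selected vertices (e.g., contact points on the grasped object) toward their targets, which mirrors the trade-off the abstract attributes to the POG model.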