Open Access
Multi-LeapMotion sensor based demonstration for robotic refine tabletop object manipulation task
Author(s) -
Haiyang Jin,
Qing Chen,
Zhixian Chen,
Ying Hu,
Jianwei Zhang
Publication year - 2016
Publication title -
CAAI Transactions on Intelligence Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.613
H-Index - 15
eISSN - 2468-6557
pISSN - 2468-2322
DOI - 10.1016/j.trit.2016.03.010
Subject(s) - computer science , task (project management) , artificial intelligence , computer vision , object (grammar) , tracking (education) , gesture recognition , gesture , tracking system , video tracking , human–computer interaction , engineering , kalman filter , systems engineering , psychology , pedagogy
In complicated tabletop object manipulation tasks for robotic systems, demonstration-based control is an efficient way to enhance the stability of execution. In this paper, we use a new optical hand tracking sensor, LeapMotion, to perform non-contact demonstration for robotic systems. A Multi-LeapMotion hand tracking system is developed. The setup of the two sensors is analyzed to find an optimal configuration for efficiently using the information from both sensors. Meanwhile, the coordinate systems of the Multi-LeapMotion hand tracking device and the robotic demonstration system are developed. With the recognition of element actions and the delay calibration, fusion principles are developed to obtain improved and corrected gesture recognition. Gesture recognition and scenario experiments are carried out, and they indicate the improvement offered by the proposed Multi-LeapMotion hand tracking system in tabletop object manipulation tasks for robotic demonstration.
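The fusion described in the abstract can be illustrated with a minimal sketch. The paper's actual fusion principles are not given here, so the following is an assumption: each sensor reports a timestamped palm position with a tracking confidence, a calibrated delay offset aligns the two streams, and contemporaneous samples are combined by a confidence-weighted average (the names `HandSample`, `fuse`, and the 20 ms gate are all hypothetical).

```python
from dataclasses import dataclass

@dataclass
class HandSample:
    t: float                            # timestamp, seconds
    pos: tuple                          # (x, y, z) palm position in a shared frame, mm
    confidence: float                   # tracking confidence reported by the sensor, 0..1

def fuse(a: HandSample, b: HandSample, delay_b: float = 0.0) -> tuple:
    """Confidence-weighted fusion of two sensors' palm positions.

    delay_b is a calibrated latency offset of sensor B relative to A.
    If the delay-corrected samples are not contemporaneous, the
    higher-confidence sample is used alone instead of fusing.
    """
    if abs((b.t - delay_b) - a.t) > 0.02:        # hypothetical 20 ms gate
        best = a if a.confidence >= b.confidence else b
        return best.pos
    w = a.confidence + b.confidence
    return tuple((a.confidence * pa + b.confidence * pb) / w
                 for pa, pb in zip(a.pos, b.pos))
```

For example, two equally confident, time-aligned samples average to the midpoint, while a stale sample is simply overridden by the fresher, more confident one.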
