Open Access
Adaptive Learning of Hand Movement in Human Demonstration for Robot Action
Author(s) -
Ngoc Hung Pham,
Takashi Yoshimi
Publication year - 2017
Publication title -
Journal of Robotics and Mechatronics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.257
H-Index - 19
eISSN - 1883-8049
pISSN - 0915-3942
DOI - 10.20965/jrm.2017.p0919
Subject(s) - robotics , artificial intelligence , computer vision , computer science , movement , orientation (vector space) , tracking , position , action
This paper describes a process for adaptive learning of hand movements from human demonstration for robot manipulation actions, using the Dynamic Movement Primitives (DMPs) framework. The process comprises 1) tracking the hand movement in the human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for hand movement data observed from human demonstration, including the hand's 3D position, its orientation, and the distance between the fingers. We evaluate the movements generated by the DMPs model, both reproduced without changes and adapted to a change in the movement's goal. The adapted movement data is used to control a robot arm, commanding the spatial position and orientation of its end-effector, which carries a parallel gripper.
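To make the abstract's core idea concrete, the following is a minimal sketch of a one-dimensional discrete DMP rollout of the general kind the paper builds on. It is an illustration of the standard DMP formulation (spring-damper transformation system plus a phase-gated forcing term), not the paper's extended model; all function names, parameter values, and the basis-function layout are illustrative assumptions.

```python
import numpy as np

def dmp_rollout(y0, g, weights, tau=1.0, dt=0.01,
                alpha=25.0, beta=6.25, alpha_x=1.0):
    """Roll out a 1-D discrete Dynamic Movement Primitive.

    Transformation system: tau*dz/dt = alpha*(beta*(g - y) - z) + f(x)*(g - y0)
    Canonical system:      tau*dx/dt = -alpha_x * x
    The forcing term f(x) is a normalized, phase-gated weighted sum of
    Gaussian basis functions; `weights` would normally be learned from
    a demonstrated trajectory (all values here are illustrative).
    """
    n_basis = len(weights)
    # Basis centers spaced along the decaying canonical phase x in (0, 1]
    centers = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
    widths = n_basis ** 1.5 / centers / alpha_x

    y, z, x = float(y0), 0.0, 1.0
    traj = [y]
    for _ in range(int(1.0 / dt)):
        psi = np.exp(-widths * (x - centers) ** 2)
        f = x * (psi @ weights) / psi.sum()          # phase-gated forcing term
        dz = (alpha * (beta * (g - y) - z) + f * (g - y0)) / tau
        dy = z / tau
        dx = -alpha_x * x / tau
        z, y, x = z + dz * dt, y + dy * dt, x + dx * dt
        traj.append(y)
    return np.array(traj)
```

The goal adaptation the abstract mentions falls out of this structure: the same learned `weights` can be rolled out with a different `g`, and the spring-damper term pulls the reproduced movement to the new goal while preserving its shape.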
