Hand Gesture Modeling and Recognition for Human and Robot Interactive Assembly Using Hidden Markov Models
Author(s) -
Fei Chen,
Qiubo Zhong,
Ferdinando Cannella,
Kosuke Sekiyama,
Toshio Fukuda
Publication year - 2015
Publication title -
International Journal of Advanced Robotic Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.394
H-Index - 46
eISSN - 1729-8814
pISSN - 1729-8806
DOI - 10.5772/60044
Subject(s) - computer science , hidden markov model , gesture , task (project management) , gesture recognition , artificial intelligence , robot , computer vision , trajectory , human–robot interaction , motion (physics) , human–computer interaction , physics , management , astronomy , economics
Gesture recognition is essential for human and robot collaboration. Within an industrial hybrid assembly cell, the performance of such a system significantly affects the safety of human workers. This work presents an approach to recognizing hand gestures accurately during an assembly task performed in collaboration with a robot co-worker. We have designed and developed a sensor system for measuring natural human-robot interactions. The position and rotation of a human worker's hands and fingertips are tracked in 3D space while completing a task. A modified chain-code method is proposed to describe the motion trajectories of the measured hands and fingertips. The Hidden Markov Model (HMM) method is adopted to recognize patterns in the data streams and to identify workers' gesture patterns and assembly intentions. The effectiveness of the proposed system is verified by experimental results, which demonstrate that it is able to automatically segment the data streams and recognize the represented gesture patterns with reasonable accuracy.
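The abstract combines two ingredients: a chain-code description of the measured 3D trajectories and per-gesture HMMs for recognition. The Python sketch below illustrates the general pipeline under assumed simplifications; it is not the authors' implementation. The six-direction quantization stands in for the paper's modified chain-code, and the gesture models are hand-specified discrete HMMs scored with the standard forward algorithm; all names and parameters here are illustrative.

```python
# Illustrative sketch only: simplified chain-code quantization of a 3D hand
# trajectory followed by discrete-HMM scoring with the forward algorithm.
# The symbol alphabet, model structure, and parameters are assumptions for
# this example, not the paper's modified chain-code or trained models.
import numpy as np

# Assumed 6-symbol direction alphabet: +x, -x, +y, -y, +z, -z
DIRECTIONS = np.array([
    [1, 0, 0], [-1, 0, 0],
    [0, 1, 0], [0, -1, 0],
    [0, 0, 1], [0, 0, -1],
], dtype=float)

def chain_code(trajectory):
    """Map a (T, 3) array of fingertip positions to a sequence of
    direction symbols by nearest-axis quantization of each displacement."""
    deltas = np.diff(trajectory, axis=0)
    symbols = []
    for d in deltas:
        norm = np.linalg.norm(d)
        if norm < 1e-6:          # skip near-stationary samples
            continue
        symbols.append(int(np.argmax(DIRECTIONS @ (d / norm))))
    return symbols

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm and per-step scaling."""
    alpha = start_p * emit_p[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

def classify(trajectory, models):
    """Pick the gesture whose HMM (dict with 'start', 'trans', 'emit'
    arrays) best explains the chain-coded trajectory."""
    obs = chain_code(trajectory)
    scores = {name: forward_log_likelihood(obs, m["start"], m["trans"], m["emit"])
              for name, m in models.items()}
    return max(scores, key=scores.get)
```

In practice the per-gesture HMM parameters would be learned from labeled demonstration data (e.g., with Baum-Welch), and the paper additionally segments continuous data streams before classification; both steps are omitted in this sketch.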