Movement Operation Interaction System for Mobility Robot Using Finger-Pointing Recognition
Author(s) -
Eichi Tamura,
Yoshihiro Yamashita,
Taisei Yamashita,
Eri Sato-Shimokawara,
Toru Yamaguchi
Publication year - 2017
Publication title -
Journal of Advanced Computational Intelligence and Intelligent Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 20
eISSN - 1343-0130
pISSN - 1883-8014
DOI - 10.20965/jaciii.2017.p0709
Subject(s) - computer science , gesture , computer vision , artificial intelligence , usb , orientation (vector space) , robot , gesture recognition , wearable computer , embedded system , software
Finger pointing is an intuitive method for people to direct a robot to move to a certain location. We propose a system that enables the movement operation of a mobility robot by using finger-pointing gestures for an automatic and intuitive driving experience. We employ a method to recognize gestures by using video images from a USB camera mounted on a wearable device. Our method does not require the use of infrared sensors. Three movement commands for forward motion, turning, and stopping are chosen based on gesture recognition, face orientation detection, and an intelligent safety system. We experimentally demonstrate the usefulness of the system using a scooter-type mobility robot.
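The abstract describes selecting one of three commands (forward, turn, stop) from gesture recognition, face orientation detection, and a safety system. A minimal sketch of such a decision rule is shown below; the function name, angle inputs, and threshold are illustrative assumptions, not the paper's actual implementation.

```python
from enum import Enum


class Command(Enum):
    FORWARD = "forward"
    TURN = "turn"
    STOP = "stop"


def select_command(pointing_detected: bool,
                   pointing_angle_deg: float,
                   face_angle_deg: float,
                   obstacle_ahead: bool,
                   turn_threshold_deg: float = 15.0) -> Command:
    """Hypothetical mapping from recognition results to a movement command.

    The safety check overrides everything else; otherwise a pointing
    gesture roughly aligned with the face orientation drives forward,
    a larger angular offset triggers a turn, and no gesture stops.
    """
    if obstacle_ahead:
        return Command.STOP      # safety system takes priority (assumed behavior)
    if not pointing_detected:
        return Command.STOP      # no gesture recognized: remain stationary
    if abs(pointing_angle_deg - face_angle_deg) <= turn_threshold_deg:
        return Command.FORWARD   # pointing roughly where the face is oriented
    return Command.TURN          # pointing off-axis: rotate toward the target
```

In a real pipeline the angle inputs would come from the USB-camera gesture and face-orientation modules; the threshold value here is an arbitrary placeholder.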