Open Access
Bottom Dressing by a Dual-Arm Robot Using a Clothing State Estimation Based on Dynamic Shape Changes
Author(s) - Kimitoshi Yamazaki, Ryosuke Oya, Kotaro Nagahama, Kei Okada, Masayuki Inaba
Publication year - 2016
Publication title - International Journal of Advanced Robotic Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.394
H-Index - 46
eISSN - 1729-8814
pISSN - 1729-8806
DOI - 10.5772/61930
Subject(s) - clothing , computer science , robot , artificial intelligence , computer vision , humanoid robot , human–computer interaction , simulation , engineering
This paper describes a method by which an autonomous robot dresses a subject in clothing. Our target task is dressing a person in a sitting pose. We focus in particular on the action whereby a robot automatically pulls a pair of trousers up the subject's legs, an action frequently needed in dressing assistance. To avoid injuring the subject's legs, the robot must be able to recognize the state of the manipulated clothing. Therefore, while handling the clothing, the robot is supplied with both visual and tactile sensory information. A dressing failure is detected visually from the behaviour of optical flows extracted from the clothing's movements. The effectiveness of the proposed approach is implemented and validated on a life-sized humanoid robot.
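The abstract's failure-detection idea is that a dressing failure shows up in the optical flow of the clothing region: while the trousers are being pulled up, the cloth should exhibit net upward motion, and a snag leaves the flow near zero. A minimal illustrative sketch of that decision rule is below; it is not the authors' implementation, and the function name, threshold, and flow-extraction step (assumed done upstream, e.g. by a dense optical-flow routine) are all assumptions for illustration.

```python
import numpy as np

def detect_dressing_failure(flow_vectors, min_upward=0.5):
    """Flag a suspected dressing failure from optical flow on the
    clothing region.

    flow_vectors: (N, 2) array of per-pixel flow (dx, dy), assumed
    already extracted from consecutive camera frames. Image y grows
    downward, so the expected upward pull gives negative dy.
    Returns True when mean upward motion falls below `min_upward`
    pixels/frame, i.e. the cloth appears stuck.
    """
    flow = np.asarray(flow_vectors, dtype=float)
    mean_dy = flow[:, 1].mean()
    # A healthy pull-up makes mean_dy clearly negative (upward).
    return mean_dy > -min_upward

# Example flows: cloth moving up vs. a snagged, nearly static cloth.
moving_up = np.array([[0.1, -1.2], [0.0, -0.9], [-0.1, -1.1]])
snagged   = np.array([[0.2,  0.1], [0.1, -0.1], [0.0,  0.0]])
print(detect_dressing_failure(moving_up))  # False (dressing proceeding)
print(detect_dressing_failure(snagged))    # True  (failure suspected)
```

In practice the threshold would be tuned to the robot's pulling speed and camera geometry, and the rule would be combined with the tactile channel the abstract also mentions.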

