Ergodicity reveals assistance and learning from physical human-robot interaction
Author(s) -
Kathleen Fitzsimons,
Ana María Acosta,
Julius P. A. Dewald,
Todd Murphey
Publication year - 2019
Publication title -
Science Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 5.619
H-Index - 50
ISSN - 2470-9476
DOI - 10.1126/scirobotics.aav6079
Subject(s) - ergodicity , measure (data warehouse) , task (project management) , robot , motion (physics) , artificial intelligence , human–computer interaction , computer science , cognitive psychology , psychology , computer vision , mathematics , engineering , statistics , data mining , systems engineering
This paper applies information-theoretic principles to the investigation of physical human-robot interaction. Drawing from the study of human perception and neural encoding, information-theoretic approaches offer a perspective that enables quantitative interpretation of the body as an information channel and of bodily motion as an information-carrying signal. We show that ergodicity, which can be interpreted as the degree to which a trajectory encodes information about a task, correctly predicts changes due to the addition of algorithmic assistance or the reduction of a person's existing deficit. The measure also captures changes resulting from training with robotic assistance. Other common assessment measures failed to capture at least one of these effects. This information-based interpretation of motion can be applied broadly: in the evaluation and design of human-machine interactions, in learning-by-demonstration paradigms, and in human motion analysis.
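As context for the abstract's notion of ergodicity, below is a minimal sketch of the spectral ergodic metric commonly used in this line of work (following Mathew and Mezić), which scores how closely the time-averaged statistics of a trajectory match a task distribution. This is not the authors' implementation; the 1-D workspace, the Gaussian task density, and all function names and parameters are illustrative assumptions.

import numpy as np

def fourier_basis(k, x, L):
    # Cosine basis function on [0, L], L2-normalized so that its squared integral is 1.
    h_k = np.sqrt(L / 2.0) if k > 0 else np.sqrt(L)
    return np.cos(k * np.pi * x / L) / h_k

def ergodic_metric(trajectory, task_pdf, L=1.0, num_modes=20):
    # Spectral ergodicity of a 1-D trajectory with respect to a task distribution:
    # a weighted squared distance between the trajectory's time-averaged Fourier
    # coefficients and the coefficients of the task density (illustrative sketch).
    xs = np.linspace(0.0, L, 500)
    pdf_vals = task_pdf(xs)
    pdf_vals = pdf_vals / np.trapz(pdf_vals, xs)     # normalize to a probability density

    traj = np.asarray(trajectory)
    metric = 0.0
    for k in range(num_modes):
        phi_k = np.trapz(pdf_vals * fourier_basis(k, xs, L), xs)   # distribution coefficient
        c_k = np.mean(fourier_basis(k, traj, L))                   # time-averaged trajectory coefficient
        Lambda_k = 1.0 / (1.0 + k**2)                              # Sobolev-type weight, s=(n+1)/2 with n=1
        metric += Lambda_k * (c_k - phi_k)**2
    return metric

# Example: a trajectory that dwells near the peak of a Gaussian task density.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    traj = np.clip(0.5 + 0.1 * rng.standard_normal(2000), 0.0, 1.0)
    gaussian = lambda x: np.exp(-0.5 * ((x - 0.5) / 0.1) ** 2)
    print("ergodic metric:", ergodic_metric(traj, gaussian))

Lower values indicate a trajectory whose visit frequencies better match the task density; in the paper's framing, such a trajectory encodes more information about the task.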