A context‐awareness model for activity recognition in robot‐assisted scenarios
Author(s) -
Rodríguez Lera Francisco J.,
Martín Rico Francisco,
Guerrero Higueras Ángel Manuel,
Matellán Olivera Vicente
Publication year - 2020
Publication title -
Expert Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.365
H-Index - 38
eISSN - 1468-0394
pISSN - 0266-4720
DOI - 10.1111/exsy.12481
Subject(s) - computer science, context (archaeology), layer (electronics), robot, bayesian network, inference, context awareness, dynamic bayesian network, artificial intelligence, activity recognition, bayesian inference, human–computer interaction, machine learning, bayesian probability, paleontology, linguistics, chemistry, philosophy, organic chemistry, phone, biology
Context awareness in ambient assisted living programmes for the elderly is a cornerstone in the current scenario of noncustomized service robots distributed around the world. This research proposes a context‐awareness system for human–robot scene interpretation based on seven primary contexts and the American Occupational Therapy Association framework. The context‐awareness system defined here proposes an inference mechanism for activity recognition supported by hierarchical Bayesian networks. However, as the information from sensors increases, the associated computational cost also increases. Thus, an evaluation of different Bayesian network models is necessary to decrease their impact on robot performance. Two topological models have been built and tested using the OpenMarkov application: a two‐level approach with an input–observations layer and an activity‐recognition layer, and a three‐layer model separating a primary‐contexts layer, the input–observations layer, and the activity‐recognition layer. The qualitative and quantitative results presented here show better performance, in terms of memory, for the three‐layer model. In addition, its effect on the hybrid architecture of a robotic platform is presented.
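The two topologies described in the abstract can be sketched as minimal discrete Bayesian networks queried by enumeration. This is an illustrative toy only: the activity names, the single "context" variable, and all probability values below are invented, and the paper's actual models (built in OpenMarkov around seven primary contexts and AOTA activity categories) are far larger.

```python
# Toy comparison of the two network topologies from the abstract:
#  - two-layer:   Activity -> observations
#  - three-layer: Activity -> Context -> observations
# All variable names and CPT values are invented for illustration.

ACTIVITIES = ["eating", "sleeping"]
P_ACTIVITY = {"eating": 0.5, "sleeping": 0.5}   # uniform prior over activities

# Two-layer model: observations depend directly on the activity.
P_MOTION = {"eating": 0.8, "sleeping": 0.1}     # P(motion=on | activity)
P_KITCHEN = {"eating": 0.9, "sleeping": 0.05}   # P(location=kitchen | activity)

def posterior_two_layer(motion_on, in_kitchen):
    """P(activity | observations) by direct enumeration."""
    scores = {}
    for a in ACTIVITIES:
        p = P_ACTIVITY[a]
        p *= P_MOTION[a] if motion_on else 1 - P_MOTION[a]
        p *= P_KITCHEN[a] if in_kitchen else 1 - P_KITCHEN[a]
        scores[a] = p
    z = sum(scores.values())
    return {a: p / z for a, p in scores.items()}

# Three-layer model: an intermediate context node ("handling food", yes/no)
# decouples the observation layer from the activity layer, so inference
# must marginalize over it.
P_CONTEXT = {"eating": 0.9, "sleeping": 0.05}   # P(handling_food | activity)
P_MOTION_C = {True: 0.85, False: 0.1}           # P(motion=on | handling_food)
P_KITCHEN_C = {True: 0.9, False: 0.1}           # P(kitchen | handling_food)

def posterior_three_layer(motion_on, in_kitchen):
    """P(activity | observations), marginalizing over the context layer."""
    scores = {}
    for a in ACTIVITIES:
        total = 0.0
        for ctx in (True, False):
            p = P_CONTEXT[a] if ctx else 1 - P_CONTEXT[a]
            p *= P_MOTION_C[ctx] if motion_on else 1 - P_MOTION_C[ctx]
            p *= P_KITCHEN_C[ctx] if in_kitchen else 1 - P_KITCHEN_C[ctx]
            total += p
        scores[a] = P_ACTIVITY[a] * total
    z = sum(scores.values())
    return {a: p / z for a, p in scores.items()}

print(posterior_two_layer(True, True))
print(posterior_three_layer(True, True))
```

The design point the abstract raises is visible even at this scale: in the three-layer model the observation CPTs are conditioned on the small context node rather than on the full activity set, so adding sensors grows the tables against the context layer only, which is what keeps the computational cost lower as sensor information increases.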