Contribution estimation of participants for human interaction recognition
Author(s) - Ji Yanli, Shimada Atsushi, Nagahara Hajime, Taniguchi Rinichiro
Publication year - 2013
Publication title - IEEJ Transactions on Electrical and Electronic Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.254
H-Index - 30
eISSN - 1931-4981
pISSN - 1931-4973
DOI - 10.1002/tee.21850
Subject(s) - computer science , artificial intelligence , machine learning , pattern recognition , human–computer interaction , action recognition , human interaction , contribution estimation , robustness
In this paper, we propose an efficient algorithm for recognizing actions in human interactions. Unlike previous algorithms that use both participants' actions, the proposed algorithm estimates the action contribution of each participant to determine which participant's action is the major action for correct interaction recognition. To estimate this contribution, we construct a contribution interaction model for each interaction category in which both participants carry out major actions. Using these contribution models, we design a method that automatically estimates the contribution of participants and classifies interaction samples into "co-contribution" and "single-contribution" interactions; at the same time, the major actions in the "single-contribution" interactions are determined. We evaluate our method on the UT-interaction dataset and our original interaction dataset (LIMU). The recognition results indicate the robustness of the proposed method and its high estimation accuracy: 96% and 98% in set 1 and set 2 of the UT dataset, respectively, and 97.8% on the LIMU dataset. Based on the estimation results, we extract the major-action information for interaction recognition. Average recognition accuracies of 93.3% in set 1 and 91.7% in set 2 of the UT dataset were obtained, at least 5% better than those of previous algorithms. On the LIMU dataset, recognition accuracy reached 91.1%, which is 8.9% higher than the recognition result obtained without contribution estimation. © 2013 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
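Because the abstract gives no implementation details, the following is only a minimal, illustrative sketch of the contribution-estimation step it describes: a per-category contribution model scores how strongly each participant's motion matches that category's major action, a sample is labelled "co-contribution" when the two scores are comparable, and otherwise "single-contribution" with the higher-scoring participant as the major actor. The ContributionModel class, the cosine-similarity scoring, the ratio_threshold parameter, and all feature names are assumptions made for illustration and do not come from the paper.

```python
# Hypothetical sketch of contribution estimation for two-person interactions.
# All names, features, and the threshold are illustrative, not from the paper.
from dataclasses import dataclass
from typing import Tuple

import numpy as np


@dataclass
class ContributionModel:
    """Toy per-category model: one reference motion template per participant."""
    category: str
    template_a: np.ndarray  # reference feature for participant A's major action
    template_b: np.ndarray  # reference feature for participant B's major action

    def score(self, feat_a: np.ndarray, feat_b: np.ndarray) -> Tuple[float, float]:
        # Higher cosine similarity to the category template = larger contribution.
        sim_a = float(np.dot(feat_a, self.template_a) /
                      (np.linalg.norm(feat_a) * np.linalg.norm(self.template_a) + 1e-9))
        sim_b = float(np.dot(feat_b, self.template_b) /
                      (np.linalg.norm(feat_b) * np.linalg.norm(self.template_b) + 1e-9))
        return sim_a, sim_b


def estimate_contribution(model: ContributionModel,
                          feat_a: np.ndarray,
                          feat_b: np.ndarray,
                          ratio_threshold: float = 0.8) -> Tuple[str, str]:
    """Label a sample 'co-contribution' or 'single-contribution'.

    Returns (contribution_type, major_participant), where major_participant
    is 'both', 'A', or 'B'.
    """
    score_a, score_b = model.score(feat_a, feat_b)
    low, high = sorted((score_a, score_b))
    # Comparable scores -> both participants perform major actions.
    if high <= 0 or low / high >= ratio_threshold:
        return "co-contribution", "both"
    return "single-contribution", "A" if score_a > score_b else "B"


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = ContributionModel("push", rng.standard_normal(16), rng.standard_normal(16))
    # Participant A moves like the 'push' template; B's motion is unrelated.
    feat_a = model.template_a + 0.1 * rng.standard_normal(16)
    feat_b = rng.standard_normal(16)
    kind, major = estimate_contribution(model, feat_a, feat_b)
    print(kind, major)  # typically: single-contribution A
```

In this sketch the major participant's features would then be the ones passed to the interaction classifier, mirroring the idea of extracting major-action information before recognition; the real method in the paper builds its contribution models per interaction category rather than using a single similarity threshold.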