A 3D Dynamic Model of Human Actions for Probabilistic Image Tracking
Author(s) -
Ignasi Rius,
Daniel B. Rowe,
Jordi Gonzàlez,
Xavier Roca
Publication year - 2005
Publication title -
Lecture Notes in Computer Science
Language(s) - English
Resource type - Book series
SCImago Journal Rank - 0.249
H-Index - 400
eISSN - 1611-3349
pISSN - 0302-9743
ISBN - 3-540-26153-2
DOI - 10.1007/11492429_64
Subject(s) - probabilistic logic , computer science , particle filter , artificial intelligence , computer vision , tracking , human motion , action recognition , state space , statistical model , pattern recognition , Kalman filter
In this paper we present a method suitable for use as a temporal prior for human tracking within a particle filtering framework such as CONDENSATION [5]. The method predicts feasible human postures given a reduced set of previous postures, and it drastically reduces the number of particles needed to track a generic, highly articulated object. Given a sequence of preceding postures, this example-driven transition model probabilistically matches the most likely postures from a database of human actions. Each action in the database is defined within a PCA-like space called UaSpace, which is suitable for performing the probabilistic match when searching for similar sequences. Thus, different but feasible postures from the database become the newly predicted poses.
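The abstract outlines an example-driven transition model: recent postures are projected into a PCA-like space (UaSpace), matched probabilistically against subsequences of an action database, and the postures that follow the best matches serve as predictions for the particles. The Python sketch below illustrates only that broad idea; the posture encoding, the construction of the reduced space, the Gaussian matching score, the window length, and all function names are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of an example-driven transition model for a particle
# filter, loosely following the abstract. All details (posture encoding,
# PCA-like projection as a stand-in for UaSpace, Gaussian matching kernel,
# window length) are assumptions, not the paper's actual method.

import numpy as np


def build_reduced_space(action_db, n_components=10):
    """Fit a PCA-like projection from a database of actions, where each
    action is an array of shape (frames, dofs)."""
    all_postures = np.vstack(action_db)                   # (total_frames, dofs)
    mean = all_postures.mean(axis=0)
    _, _, vt = np.linalg.svd(all_postures - mean, full_matrices=False)
    return mean, vt[:n_components]                        # mean, basis (n_components, dofs)


def project(postures, mean, basis):
    """Map raw postures to the reduced coordinates."""
    return (postures - mean) @ basis.T


def predict_next_postures(history, action_db, mean, basis,
                          n_predictions=50, sigma=1.0, rng=None):
    """Match the recent posture history (shape (W, dofs)) against all
    length-W windows of the database and sample the postures that follow
    the best-matching windows as the predicted poses."""
    rng = np.random.default_rng() if rng is None else rng
    w = len(history)
    query = project(history, mean, basis).ravel()

    candidates, scores = [], []
    for action in action_db:
        coords = project(action, mean, basis)
        for t in range(len(action) - w):
            window = coords[t:t + w].ravel()
            d2 = np.sum((query - window) ** 2)
            scores.append(np.exp(-d2 / (2.0 * sigma ** 2)))  # assumed Gaussian similarity
            candidates.append(action[t + w])                 # posture following the window

    probs = np.asarray(scores)
    probs /= probs.sum()
    idx = rng.choice(len(candidates), size=n_predictions, p=probs)
    return np.asarray(candidates)[idx]      # feasible postures used as particle predictions


# Toy usage: two synthetic "actions" of 100 frames with 12 degrees of freedom.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    db = [rng.standard_normal((100, 12)).cumsum(axis=0) for _ in range(2)]
    mean, basis = build_reduced_space(db, n_components=5)
    history = db[0][40:45]                  # last 5 observed postures
    preds = predict_next_postures(history, db, mean, basis, n_predictions=10, rng=rng)
    print(preds.shape)                      # (10, 12)
```

In a CONDENSATION-style tracker, such predictions would replace or bias the particles' propagation step, concentrating samples on feasible postures rather than diffusing them over the full pose space.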