View-based Location and Tracking of Body Parts for Visual Interaction
Author(s) - Antonio S. Micilotta, Richard Bowden
Publication year - 2004
Publication title - CiteSeerX (The Pennsylvania State University)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.18.87
Subject(s) - computer vision , artificial intelligence , computer science , particle filter , tracking , torso , segmentation , biometrics , filter (signal processing)
This paper presents a real-time approach to locating and tracking the upper torso of the human body. Our main interest is not in 3D biometric accuracy, but rather a sufficiently discriminatory representation for visual interaction. The algorithm employs background suppression and a general approximation to body shape, applied within a particle filter framework, making use of integral images to maintain real-time performance. Furthermore, we present a novel method to disambiguate the hands of the subject and to predict the likely positions of the elbows. The final system is demonstrated segmenting multiple subjects from a cluttered scene at above real-time rates.
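The abstract's point about integral images can be illustrated with a minimal sketch: given a binary foreground mask produced by background suppression, an integral image lets the foreground coverage of any hypothesised torso rectangle be scored with four array lookups, independent of the rectangle's size, which is what keeps per-particle evaluation cheap. The function and variable names below are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def integral_image(mask):
        # Cumulative sum over rows and columns of a binary foreground mask.
        return mask.astype(np.int64).cumsum(axis=0).cumsum(axis=1)

    def box_sum(ii, top, left, bottom, right):
        # Sum of mask values inside rows [top, bottom) and cols [left, right),
        # computed in O(1) from at most four integral-image lookups.
        total = ii[bottom - 1, right - 1]
        if top > 0:
            total -= ii[top - 1, right - 1]
        if left > 0:
            total -= ii[bottom - 1, left - 1]
        if top > 0 and left > 0:
            total += ii[top - 1, left - 1]
        return total

    def torso_likelihood(ii, particle):
        # Score a hypothesised torso rectangle (top, left, bottom, right)
        # by the fraction of its area covered by foreground pixels.
        top, left, bottom, right = particle
        area = (bottom - top) * (right - left)
        return box_sum(ii, top, left, bottom, right) / max(area, 1)

In a particle filter, a score of this kind would weight each sampled body-shape hypothesis before resampling, so evaluating hundreds of hypotheses per frame remains feasible at video rate.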
