Particle dynamics and multi-channel feature dictionaries for robust visual tracking
Author(s) - Srikrishna Karanam, Yang Li, Richard J. Radke
Publication year - 2015
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/C.29.183
Subject(s) - robustness , particle filter , computer science , artificial intelligence , computer vision , pattern recognition , feature , tracking , pruning , sparse representation , channel , appearance model , image , kalman filter
We present a novel approach to solve the visual tracking problem in a particle filter framework based on sparse visual representations. Current state-of-the-art trackers use low-resolution image intensity features in target appearance modeling. Such features often fail to capture sufficient visual information about the target. Here, we demonstrate the efficacy of visually richer representation schemes by employing multi-channel feature dictionaries as part of the appearance model. To further mitigate the tracking drift problem, we propose a novel dynamic adaptive state transition model, taking into account the dynamics of the past states. Finally, we demonstrate the computational tractability of using richer appearance modeling schemes by adaptively pruning candidate particles during each sampling step, and using a fast augmented Lagrangian technique to solve the associated optimization problem. Extensive quantitative evaluations and robustness tests on several challenging video sequences demonstrate that our approach substantially outperforms the state of the art, and achieves stable results.
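To make the pipeline concrete, the following is a minimal, hypothetical Python sketch of one particle-filter tracking step with a sparse-coding appearance model over stacked feature channels. It is not the authors' implementation: the intensity-plus-gradient feature extractor, the ISTA solver (standing in for the paper's faster augmented Lagrangian method), the fixed Gaussian transition model (the paper adapts it from the dynamics of past states), and the distance-based pruning rule are all simplifying assumptions made for illustration.

```python
import numpy as np

def extract_features(frame, cx, cy, size=16):
    """Crop a patch around (cx, cy) and stack intensity plus x/y gradient
    channels into a single L2-normalized feature vector (a stand-in for the
    multi-channel feature dictionaries in the paper)."""
    h, w = frame.shape
    x0 = int(np.clip(cx - size // 2, 0, w - size))
    y0 = int(np.clip(cy - size // 2, 0, h - size))
    patch = frame[y0:y0 + size, x0:x0 + size].astype(float)
    gy, gx = np.gradient(patch)
    f = np.concatenate([patch.ravel(), gx.ravel(), gy.ravel()])
    n = np.linalg.norm(f)
    return f / n if n > 0 else f

def sparse_code(y, D, lam=0.05, n_iter=200):
    """L1-regularized coding of y against dictionary D via ISTA, a simple
    proximal-gradient solver (the paper uses an augmented Lagrangian method)."""
    L = np.linalg.norm(D, 2) ** 2 + 1e-8          # Lipschitz constant of the data term
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        x = x - D.T @ (D @ x - y) / L                        # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0)  # soft threshold
    return x

def track_step(frame, particles, D, sigma=4.0, keep=20):
    """One particle-filter iteration: propagate with a Gaussian random walk,
    prune to the 'keep' candidates nearest the particle cloud's mean (a crude
    stand-in for the paper's adaptive pruning), score each survivor by its
    sparse reconstruction error, and return the best state plus a resampled set."""
    particles = particles + np.random.randn(*particles.shape) * sigma
    center = particles.mean(axis=0)
    order = np.argsort(np.linalg.norm(particles - center, axis=1))
    survivors = particles[order[:keep]]

    errors = []
    for cx, cy in survivors:
        y = extract_features(frame, cx, cy)
        x = sparse_code(y, D)
        errors.append(np.linalg.norm(y - D @ x) ** 2)
    errors = np.array(errors)
    weights = np.exp(-(errors - errors.min()) / 0.01)
    weights /= weights.sum()

    best = survivors[int(np.argmin(errors))]
    idx = np.random.choice(len(survivors), size=len(particles), p=weights)
    return best, survivors[idx]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((120, 160))
    # Toy dictionary: feature vectors of patches sampled around the initial target.
    D = np.stack([extract_features(frame, 80 + dx, 60 + dy)
                  for dx in (-2, 0, 2) for dy in (-2, 0, 2)], axis=1)
    particles = np.tile([80.0, 60.0], (50, 1))
    state, particles = track_step(frame, particles, D)
    print("estimated target center:", state)
```

In this sketch the dictionary is built once from the first frame; the full method additionally updates the appearance model over time and adapts both the transition noise and the number of evaluated particles, which is where the computational savings reported in the paper come from.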