Open Access
Robust Multitask Multiview Tracking in Videos
Author(s) -
Xue Mei,
Zhibin Hong,
Danil Prokhorov,
Dacheng Tao
Publication year - 2015
Publication title -
IEEE Transactions on Neural Networks and Learning Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.882
H-Index - 212
eISSN - 2162-2388
pISSN - 2162-237X
DOI - 10.1109/tnnls.2015.2399233
Subject(s) - computing and processing; communication, networking and broadcast technologies; components, circuits, devices and systems; general topics for engineers
Various sparse-representation-based methods have been proposed to solve tracking problems, and most of them employ least squares (LS) criteria to learn the sparse representation. In many tracking scenarios, traditional LS-based methods may not perform well owing to the presence of heavy-tailed noise. In this paper, we present a tracking approach using an approximate least absolute deviation (LAD)-based multitask multiview sparse learning method, which enjoys the robustness of LAD and takes advantage of multiple types of visual features, such as intensity, color, and texture. The proposed method is integrated into a particle filter framework, where learning the sparse representation for each view of a single particle is regarded as an individual task. The underlying relationship between tasks across different views and different particles is jointly exploited in a unified robust multitask formulation based on LAD. In addition, to capture the frequently emerging outlier tasks, we decompose the representation matrix into two collaborative components, which enables a more robust and accurate approximation. We show that the proposed formulation can be effectively approximated by Nesterov's smoothing method and efficiently solved using the accelerated proximal gradient method. The presented tracker is implemented using four types of features and is tested on numerous synthetic sequences and real-world video sequences, including the CVPR2013 tracking benchmark and the ALOV++ data set. Both the qualitative and quantitative results demonstrate the superior performance of the proposed approach compared with several state-of-the-art trackers.
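The smoothing-plus-optimization step mentioned in the abstract can be illustrated in miniature. The sketch below is not the authors' multitask multiview implementation; it shows the single-view building block only: an LAD sparse-coding objective whose absolute-value loss is replaced by its Nesterov-smoothed (Huber-type) surrogate and minimized with a FISTA-style accelerated proximal gradient loop. The dictionary `D`, observation `y`, and the parameters `lam` and `mu` are hypothetical placeholders.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def smoothed_lad_apg(D, y, lam=0.1, mu=0.05, n_iter=500):
    """Minimize  sum_i psi_mu((D w - y)_i) + lam * ||w||_1,
    where psi_mu is the Nesterov-smoothed absolute value
    (quadratic for |r| <= mu, linear beyond), via accelerated
    proximal gradient (FISTA)."""
    n = D.shape[1]
    # Gradient of the smoothed loss is Lipschitz with constant ||D||_2^2 / mu.
    L = np.linalg.norm(D, 2) ** 2 / mu
    w = np.zeros(n)
    z = w.copy()
    t = 1.0
    for _ in range(n_iter):
        r = D @ z - y
        # Gradient of the Huber-type surrogate: residuals are clipped,
        # so large (heavy-tailed) residuals contribute boundedly.
        grad = D.T @ np.clip(r / mu, -1.0, 1.0)
        w_new = soft_threshold(z - grad / L, lam / L)
        # Standard FISTA momentum update.
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w_new + ((t - 1.0) / t_new) * (w_new - w)
        w, t = w_new, t_new
    return w
```

The clipping in the gradient is where the robustness comes from: an outlier residual (say, from occlusion) saturates at magnitude 1 instead of growing linearly as it would under an LS loss, so it cannot dominate the update.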
