Open Access
Multiscale spatially regularised correlation filters for visual tracking
Author(s) - Gu Xiaodong, Huang Xinyu, Tokuta Alade
Publication year - 2017
Publication title - IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2016.0241
Subject(s) - discriminative model , robustness , visual tracking , artificial intelligence , computer science , pattern recognition , benchmark , boundary effects , correlation filters , computer vision , mathematics
Recently, discriminative correlation filter based trackers have achieved highly successful results in many competitions and benchmarks. These methods exploit a periodic assumption on the training samples to learn a classifier efficiently. However, this assumption introduces unwanted boundary effects that severely degrade tracking performance. Correlation filters with limited boundaries and spatially regularised discriminative correlation filters were proposed to reduce boundary effects, but they use a fixed-scale mask or a pre-designed weight function, respectively, which are unsuitable for large scale variation. In this study, the authors propose multiscale spatially regularised correlation filters (MSRCF) for visual tracking. The augmented objective reduces the boundary effect even under large scale variation, leading to a more discriminative model, and the proposed multiscale regularisation matrix gives MSRCF fast convergence. The authors' online tracking algorithm performs favourably against state-of-the-art trackers on the OTB-2013 and OTB-2015 benchmarks in terms of efficiency, accuracy and robustness.
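To make the idea behind the abstract concrete, below is a minimal single-channel NumPy sketch of a spatially regularised correlation filter: the filter is trained to map an image patch to a Gaussian response peak, while a spatial weight surface penalises filter energy far from the target centre (the boundary-effect remedy the abstract refers to). The weight surface, step size, and all function names here are illustrative assumptions for exposition; this is the generic SRDCF-style formulation, not the authors' MSRCF algorithm.

```python
import numpy as np

def gaussian_peak(size, sigma=2.0):
    """Desired correlation output: a Gaussian peak centred in the patch."""
    ys, xs = np.mgrid[0:size, 0:size] - size // 2
    return np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma ** 2))

def spatial_weights(size, inner=0.1, outer=3.0):
    """Illustrative weight surface: cheap near the target centre, expensive
    towards the patch boundary, so filter energy is pushed inward."""
    ys, xs = np.mgrid[0:size, 0:size] - size // 2
    r = np.maximum(np.abs(xs), np.abs(ys)) / (size / 2.0)
    return inner + outer * r ** 2

def train_filter(patch, label, w, lam=0.01, iters=300):
    """Gradient descent on ||patch (*) h - label||^2 + lam * ||w . h||^2,
    where (*) is circular convolution computed via the FFT."""
    X = np.fft.fft2(patch)
    # step size below the inverse curvature of this quadratic objective
    lr = 1.0 / (2.0 * np.max(np.abs(X)) ** 2 + 2.0 * lam * np.max(w) ** 2)
    h = np.zeros_like(patch)
    for _ in range(iters):
        # residual of the correlation response against the Gaussian label
        e = np.fft.ifft2(X * np.fft.fft2(h)).real - label
        # adjoint of circular convolution is circular correlation with patch
        grad = 2.0 * np.fft.ifft2(np.conj(X) * np.fft.fft2(e)).real \
             + 2.0 * lam * (w ** 2) * h
        h -= lr * grad
    return h
```

Gradient descent is used here because the spatial weights break the Fourier-domain diagonality that gives the unregularised filter its closed-form solution; this is the same structural obstacle that leads spatially regularised trackers to iterative solvers.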
