Learning Target‐Adaptive Correlation Filters for Visual Tracking
Author(s) - She Y., Yi Y., Gu J.
Publication year - 2020
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.14153
Subject(s) - computer science , artificial intelligence , computer vision , visual tracking , correlation , filter (signal processing) , adaptive filter , active appearance model , pattern recognition (psychology) , image (mathematics) , mathematics , algorithm
Correlation filters (CF) achieve excellent performance in visual tracking but suffer from undesired boundary effects. A significant number of approaches focus on enlarging the search region to compensate for this shortcoming. However, this introduces excessive background noise and misleads the filter into learning from ambiguous information. In this paper, we propose a novel target‐adaptive correlation filter (TACF) that incorporates context and spatial‐temporal regularizations into the CF framework, thus learning a more robust appearance model in the presence of large appearance variations. Moreover, it can be effectively optimized via the alternating direction method of multipliers (ADMM), yielding a globally optimal solution. Finally, an adaptive updating strategy is presented to identify unreliable samples and alleviate the contamination caused by such training samples. Extensive evaluations on the OTB‐2013, OTB‐2015, VOT‐2016, VOT‐2017 and TC‐128 datasets demonstrate that TACF is very promising in various challenging scenarios compared with several state‐of‐the‐art trackers, while running in real time at 20 frames per second (fps).
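
For orientation, the Python sketch below illustrates the general kind of pipeline the abstract describes: a single-channel correlation filter learned in the Fourier domain with a simple ridge regularizer, and a model update that is applied only when the detection response looks reliable (here gated by the peak-to-sidelobe ratio). It is not the authors' TACF: the context and spatial-temporal regularization terms and the ADMM solver are omitted, and all function names, parameters and thresholds (gaussian_label, eta, psr_min, ...) are illustrative assumptions rather than details taken from the paper.

# Minimal, hypothetical sketch of a CF tracker with an adaptive (reliability-gated) update.
# Not the authors' TACF/ADMM method; single channel, ridge regularizer only.
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Desired correlation response: a Gaussian peak centred on the target."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist2 = (ys - h // 2) ** 2 + (xs - w // 2) ** 2
    return np.fft.fftshift(np.exp(-dist2 / (2.0 * sigma ** 2)))

def learn_filter(patch, y, lam=1e-2):
    """Closed-form ridge-regularized filter in the Fourier domain."""
    X = np.fft.fft2(patch)
    Y = np.fft.fft2(y)
    return (np.conj(X) * Y) / (np.conj(X) * X + lam)

def detect(patch, H):
    """Correlate a search patch with the filter and return the spatial response map."""
    X = np.fft.fft2(patch)
    return np.real(np.fft.ifft2(H * X))

def psr(response):
    """Peak-to-sidelobe ratio: a simple reliability score for the response map."""
    peak = response.max()
    mu, sd = response.mean(), response.std() + 1e-8
    return (peak - mu) / sd

def track_step(H, search_patch, template_patch, y, eta=0.02, psr_min=5.0):
    """One tracking step: locate the target, then update the model only on reliable frames."""
    resp = detect(search_patch, H)
    dy, dx = np.unravel_index(resp.argmax(), resp.shape)   # estimated target shift
    if psr(resp) > psr_min:                                 # skip contaminated samples
        H_new = learn_filter(template_patch, y)
        H = (1.0 - eta) * H + eta * H_new                   # linear interpolation update
    return H, (dy, dx), resp

The gating step is the part that corresponds, loosely, to the abstract's adaptive updating strategy: frames whose responses look unreliable are simply not used to refresh the model, which limits the contamination the paper aims to alleviate.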
