Open Access
Spatial and long–short temporal attention correlation filters for visual tracking
Author(s) - Zhao Jianwei, Wei Fuyuan, Chen NingNing, Zhou Zhenghua
Publication year - 2022
Publication title - IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/ipr2.12535
Abstract Discriminative correlation filters are a fast and effective approach to visual tracking. However, discriminative correlation filter-based methods still suffer from challenges caused by environmental interference, such as the spatial boundary effect, temporal filter degradation, and tracking drift. A novel appearance optimisation model, named the spatial and long–short temporal attention model, is proposed; it learns the correlation filter that localises the target with a new spatial regularisation term and a long–short temporal regularisation term. On the one hand, the proposed method improves the classical spatial regularisation term with a new weight matrix to alleviate the spatial boundary effect. On the other hand, two new temporal regularisation terms are designed: a short temporal regularisation term and a long temporal regularisation term. The short temporal regularisation term strengthens the connection between the current frame and all preceding frames to improve tracking performance, while the long temporal regularisation term counters the influence of occlusion by exploiting the similarity between the initial filter and the current one. Extensive experiments on various benchmarks show that the proposed tracker performs favourably against several related popular trackers.
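To make the model concrete, the following display is a minimal sketch of the kind of learning objective the abstract describes, written in the style of spatially regularised discriminative correlation filters. The symbols (weight matrix w, aggregation weights \beta_k, trade-off parameters \lambda, \mu, \gamma, features x_t, desired response y, filter f_t) and the exact form of each term are assumptions for illustration, not taken from the paper.

\min_{f_t}\; \frac{1}{2}\Big\| y - \sum_{d=1}^{D} x_t^{d} \ast f_t^{d} \Big\|_2^{2}
\;+\; \frac{\lambda}{2}\sum_{d=1}^{D} \big\| w \odot f_t^{d} \big\|_2^{2}
\;+\; \frac{\mu}{2}\Big\| f_t - \sum_{k=1}^{t-1} \beta_k f_k \Big\|_2^{2}
\;+\; \frac{\gamma}{2}\big\| f_t - f_1 \big\|_2^{2}

In this sketch, the second term is the spatial regulariser, where the weight matrix w penalises filter energy near the patch boundary; the third term is a short temporal regulariser that ties the current filter to a weighted aggregate of all foregoing filters; and the fourth term is a long temporal regulariser that keeps the current filter close to the initial filter f_1, which is the mechanism that would counter occlusion-induced drift.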