Open Access
Occlusion‐handling tracker based on discriminative correlation filters
Author(s) - Xie Yue, Zhang Hanling, Li Lijun
Publication year - 2020
Publication title - IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2019.0651
Subject(s) - discriminative model , bittorrent tracker , artificial intelligence , robustness (evolution) , computer vision , eye tracking , occlusion , computer science , active appearance model , pattern recognition (psychology) , video tracking , tracking (education) , correlation , mathematics , object (grammar) , image (mathematics) , medicine , psychology , pedagogy , biochemistry , chemistry , geometry , cardiology , gene
Visual object tracking (VOT) based on discriminative correlation filters (DCF) has received great attention owing to its high computational efficiency and robustness. However, DCF-based methods suffer from model contamination: the tracker drifts into the background because of the uncertainty introduced by shifting among response peaks, which in turn leads to model degradation. To deal with occlusions, a novel occlusion-handling tracker based on discriminative correlation filters (OHDCF) is proposed for online visual object tracking, in which an occlusion-handling strategy is integrated into the spatial-temporal regularized correlation filters (STRCF). The occlusion-handling tracker follows a hybrid approach to handle partial and complete occlusion. Specifically, we first present a function to determine whether occlusion occurs. The proposed filter then uses block-based and feature-matching methods to decide whether the object is partially or completely occluded, and different tracking strategies are applied accordingly. Extensive experiments have been performed on the OTB-100, Temple-Color-128, VOT-2016 and VOT-2018 datasets, and the results show that OHDCF achieves promising performance compared with other state-of-the-art trackers. On VOT-2018, OHDCF significantly outperforms the STRCF baseline from the challenge, with a relative gain of 4.8% in EAO and 4.6% in accuracy.
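The abstract does not spell out the occlusion-indicator function itself. As a rough illustration only, the sketch below shows the kind of response-map confidence test commonly used in DCF trackers, combining the peak value with the average peak-to-correlation energy (APCE); the function names, the history-averaging scheme and the 0.5 ratio threshold are assumptions made here for illustration, not the definition used in OHDCF.

import numpy as np

def apce(response: np.ndarray) -> float:
    """Average peak-to-correlation energy of a DCF response map.
    APCE drops sharply when the response becomes multi-peaked,
    which typically happens under occlusion."""
    peak = response.max()
    trough = response.min()
    energy = np.mean((response - trough) ** 2)
    return float((peak - trough) ** 2 / (energy + 1e-12))

def occlusion_occurred(response: np.ndarray,
                       apce_history: list,
                       peak_history: list,
                       ratio: float = 0.5) -> bool:
    """Flag the current frame as occluded when both the peak value and
    the APCE fall well below their historical averages (hypothetical rule)."""
    if not apce_history or not peak_history:
        return False
    cur_apce = apce(response)
    cur_peak = float(response.max())
    return (cur_apce < ratio * np.mean(apce_history) and
            cur_peak < ratio * np.mean(peak_history))

When such a test fires, a tracker of this kind would switch from the normal filter update to an occlusion-handling branch instead of learning from a contaminated sample.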
