
Occlusion‐robust object tracking based on the confidence of online selected hierarchical features
Author(s) -
Liu Mingjie,
Jin ChengBin,
Yang Bin,
Cui Xuenan,
Kim Hakil
Publication year - 2018
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2018.5454
Subject(s) - computer science, artificial intelligence, computer vision, video tracking, convolutional neural network, pattern recognition, feature selection, redundancy (engineering), active appearance model, visualization
In recent years, convolutional neural networks (CNNs) have been widely used for visual object tracking, especially in combination with correlation filters (CFs). However, increasingly complex CNN models introduce more useless information, which may degrade tracking performance. This study proposes an online feature map selection method that removes noisy and irrelevant feature maps from different convolutional layers of a CNN, reducing computational redundancy and improving tracking accuracy. Furthermore, a novel appearance model update strategy, which exploits feedback from the peak value of the response maps, is developed to avoid model corruption. Finally, an extensive evaluation of the proposed method was conducted on the OTB‐2013 and OTB‐2015 datasets against different kinds of trackers, including deep learning‐based and CF‐based trackers. The results demonstrate that the proposed method achieves highly satisfactory performance.
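The two ideas in the abstract, selecting a confident subset of CNN feature maps and gating model updates on the response-map peak, can be illustrated with a minimal sketch. The scoring rule (per-channel response energy) and the peak-ratio threshold below are illustrative assumptions, not the paper's actual formulas:

```python
import numpy as np

def select_feature_maps(feature_maps, keep_ratio=0.5):
    """Keep the top fraction of channels ranked by a confidence score.

    feature_maps: array of shape (C, H, W), one channel per feature map.
    Here the score is the per-channel L2 energy -- an illustrative
    stand-in for the paper's online confidence measure.
    Returns the indices of the kept channels, highest score first.
    """
    energies = np.square(feature_maps).sum(axis=(1, 2))
    order = np.argsort(energies)[::-1]          # descending by energy
    n_keep = max(1, int(len(order) * keep_ratio))
    return order[:n_keep]

def should_update(peak_value, peak_history, ratio=0.6):
    """Gate the appearance-model update on the CF response-map peak.

    Update only when the current peak is at least `ratio` times the
    running average of past peaks; a low peak suggests occlusion or
    drift, so the model is left untouched to avoid corruption.
    The 0.6 ratio is an assumed hyperparameter, not from the paper.
    """
    if not peak_history:
        return True
    return peak_value >= ratio * (sum(peak_history) / len(peak_history))
```

In a tracking loop, `select_feature_maps` would be re-run periodically on the features extracted around the target, and `should_update` consulted each frame before blending the new template into the CF model.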