Compressive tracking combined with sample weights and adaptive learning factor
Author(s) - Jin Yong, Li Hongying, Zhang Dandan, Wu Jing, Li Maozhen
Publication year - 2017
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.4398
Subject(s) - bhattacharyya distance , computer science , robustness (evolution) , artificial intelligence , tracking (education) , classifier (uml) , parametric statistics , stability (learning theory) , pattern recognition (psychology) , algorithm , computer vision , machine learning , mathematics , statistics , psychology , pedagogy , biochemistry , chemistry , gene
Summary The compressive tracking algorithm introduces compressive sensing theory into the field of target tracking and achieves good real-time performance. However, the original compressive tracking algorithm ignores the fact that individual samples contribute differently to the target model, and its learning factor is an empirical value that remains constant when the template is updated. As a result, adverse factors (such as noise) and errors can infiltrate the parametric model during updating when the object is occluded or disturbed by external factors, leading to tracking drift. To address these problems, samples are weighted according to their distance from the target when training the Naive Bayesian classifier, which improves tracking stability. Meanwhile, the Bhattacharyya coefficient is introduced to adjust the learning factor, allowing the parameters to adapt effectively. Experimental results show that the improved tracking algorithm adapts better to target appearance variations, illumination changes, occlusion, and so on, and is more robust than the original algorithm.
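
The sketch below illustrates the two ideas described in the summary on top of the standard compressive tracking parameter recursion: distance-based sample weights when estimating the Gaussian Naive Bayes statistics, and a learning factor driven by the Bhattacharyya coefficient between the candidate and template feature distributions. The function names, the exponential weighting, and the linear mapping from the Bhattacharyya coefficient to the learning factor are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def sample_weights(sample_centers, target_center, alpha=0.5):
    """Weight each training sample by its distance to the tracked target.

    Closer samples contribute more to the classifier parameters; the
    exponential form and alpha are illustrative choices.
    """
    d = np.linalg.norm(sample_centers - target_center, axis=1)
    return np.exp(-alpha * d / (d.max() + 1e-12))

def bhattacharyya_coefficient(p, q):
    """Similarity of two normalized feature histograms (1 = identical)."""
    return np.sum(np.sqrt(p * q))

def adaptive_learning_factor(bc, lam_min=0.75, lam_max=0.95):
    """Map the Bhattacharyya coefficient to a learning factor (assumed mapping).

    Low bc (e.g. occlusion or interference) keeps lam near lam_max so the old
    parameters dominate and drift is suppressed; high bc allows a faster update.
    """
    return lam_max - (lam_max - lam_min) * bc

def update_classifier(mu_old, sigma_old, features, weights, lam):
    """Weighted update of the Gaussian Naive Bayes parameters.

    features: (n_samples, n_dims) compressive features of the samples
    weights:  per-sample weights from sample_weights()
    lam:      learning factor from adaptive_learning_factor()
    """
    w = weights / weights.sum()
    mu_new = w @ features                      # weighted feature mean
    var_new = w @ (features - mu_new) ** 2     # weighted feature variance
    # Standard compressive tracking recursion, with lam now adaptive.
    mu = lam * mu_old + (1.0 - lam) * mu_new
    sigma = np.sqrt(lam * sigma_old ** 2 + (1.0 - lam) * var_new
                    + lam * (1.0 - lam) * (mu_old - mu_new) ** 2)
    return mu, sigma
```

In use, one would compute the Bhattacharyya coefficient between the current candidate's feature histogram and the template histogram each frame, derive the learning factor from it, and feed the distance-based weights into the positive-sample parameter update.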
