Multi‐band joint local sparse tracking via wavelet transforms
Author(s) -
Han Guang,
Luo Heng,
Liu Jixin,
Sun Ning,
Du Kun,
Li Xiaofei
Publication year - 2016
Publication title -
IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2016.0079
Subject(s) - artificial intelligence, wavelet transform, computer science, wavelet, video tracking, pattern recognition, computer vision, sparse approximation, object, block, frame, mathematics, telecommunications, geometry
A novel multi‐band joint local sparse tracking algorithm based on wavelet transforms is proposed in this study. Since the object image may contain rich information of different types, the authors first use wavelet transforms to decompose the object image into several sub‐band images, which extracts the object's information in different frequency ranges. The same block operation is then executed on all the sub‐band images. The l2,1 mixed norm is used to describe the multi‐band joint local sparse representation on each patch; it effectively extracts the structural information in different frequency ranges, so a more accurate object appearance model can be established. Second, the coefficients on the diagonal of the coefficient matrix are extracted as the confidence degrees of the candidate objects in each band, and the confidence results of all the bands are then fused to determine the best candidate object in the current frame, which effectively alleviates object drift. Finally, both qualitative and quantitative evaluations on 15 challenging video sequences demonstrate that the proposed tracking algorithm achieves better tracking performance than other state‐of‐the‐art algorithms.
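The pipeline described in the abstract — wavelet decomposition into sub-bands, the same block (patch) operation on every band, and an l2,1 mixed norm that couples the bands' sparse codes — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a one-level Haar transform, 4x4 non-overlapping patches, and the standard row-wise definition of the l2,1 norm; all function names here are hypothetical.

```python
import numpy as np

def haar_decompose(img):
    """One-level 2-D Haar wavelet transform: returns the four
    sub-band images (LL, LH, HL, HH), each half the input size."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0   # approximation (low frequencies)
    lh = (a - b + c - d) / 2.0   # horizontal detail
    hl = (a + b - c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def extract_patches(band, patch=4, step=4):
    """Apply the same block operation to a sub-band image:
    slice it into patches and stack them, one vectorised
    patch per column."""
    h, w = band.shape
    cols = [band[i:i + patch, j:j + patch].ravel()
            for i in range(0, h - patch + 1, step)
            for j in range(0, w - patch + 1, step)]
    return np.stack(cols, axis=1)

def l21_norm(C):
    """l_{2,1} mixed norm: the sum of the l2 norms of the rows
    of the coefficient matrix C. Penalising it drives whole rows
    to zero, so the patches of all bands select the same sparse
    set of templates -- the 'joint' part of the representation."""
    return np.sum(np.linalg.norm(C, axis=1))

# Stand-in 32x32 object image (random values for illustration).
obj = np.random.rand(32, 32)
bands = haar_decompose(obj)                    # four 16x16 sub-bands
patches = [extract_patches(b) for b in bands]  # same block layout per band
print([p.shape for p in patches])
```

Because every band is blocked identically, the candidate's patches occupy the same columns in every band's patch matrix, which is what allows the per-band confidence degrees (the diagonal coefficients in the paper) to be fused coherently across bands.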
