Open Access
Particle filter with occlusion handling for visual tracking
Author(s) -
Shinfeng D. Lin,
JiaJen Lin,
ChihYao Chuang
Publication year - 2015
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2014.0666
Subject(s) - computer vision , artificial intelligence , particle filter , robustness , computer science , weighting , occlusion , feature extraction , video tracking , filter (signal processing) , pattern recognition , video processing
Visual tracking is widely used in many computer vision applications such as surveillance, traffic monitoring, robot vision, and human behaviour analysis, and has therefore attracted much attention in recent years. However, several challenges remain, including illumination variation, scale variation, scene change, cluttered background, similar appearance, occlusion, and real-time performance. To address some of these issues, the authors propose a visual tracking method using a particle filter with occlusion handling. The method contains three major parts: feature extraction, particle weighting, and occlusion handling. A patch-based appearance model is presented for occlusion handling, which combines two main features: colour and motion vectors. To recover from tracking failure, the authors also propose an error-recovery scheme based on speeded-up robust features (SURF). In addition, the occlusion-detection and model-updating procedures make the tracking more robust. Experimental results on challenging sequences demonstrate the robustness and efficiency of the method, and a performance comparison with other trackers is provided.
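The abstract only outlines the pipeline, so the following is a minimal sketch of the general idea: a particle filter that weights candidate locations by patch colour similarity, declares occlusion when even the best candidate matches the appearance model poorly, and freezes model updating while occluded. The histogram feature, thresholds, blend rate, and random-walk motion model are illustrative assumptions, not taken from the paper; the authors' motion-vector feature and SURF-based error recovery are omitted for brevity.

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Quantised RGB histogram of an image patch, L1-normalised."""
    h, _ = np.histogramdd(patch.reshape(-1, 3),
                          bins=(bins, bins, bins),
                          range=((0, 256),) * 3)
    h = h.ravel().astype(np.float64)
    return h / (h.sum() + 1e-12)

def bhattacharyya(p, q):
    """Similarity between two normalised histograms (1 = identical)."""
    return np.sum(np.sqrt(p * q))

class PatchParticleFilter:
    """Colour-based particle filter with a simple occlusion test (a sketch,
    not the authors' implementation)."""

    def __init__(self, frame, box, n_particles=200, sigma_pos=8.0):
        x, y, w, h = box                      # initial target box
        self.n = n_particles
        self.sigma = sigma_pos
        self.size = (w, h)
        self.model = color_histogram(frame[y:y + h, x:x + w])
        # Particles are candidate top-left corners of the target box.
        self.particles = np.tile([x, y], (self.n, 1)).astype(np.float64)
        self.weights = np.full(self.n, 1.0 / self.n)

    def step(self, frame, occ_thresh=0.55):
        H, W = frame.shape[:2]
        w, h = self.size
        # 1) Propagate particles with a random-walk motion model.
        self.particles += np.random.randn(self.n, 2) * self.sigma
        self.particles[:, 0] = np.clip(self.particles[:, 0], 0, W - w)
        self.particles[:, 1] = np.clip(self.particles[:, 1], 0, H - h)
        # 2) Weight each particle by colour similarity to the model.
        sims = np.empty(self.n)
        for i, (px, py) in enumerate(self.particles.astype(int)):
            cand = color_histogram(frame[py:py + h, px:px + w])
            sims[i] = bhattacharyya(cand, self.model)
        self.weights = np.exp(20.0 * (sims - sims.max()))
        self.weights /= self.weights.sum()
        # 3) Occlusion handling: if even the best candidate matches poorly,
        #    declare occlusion and freeze the appearance model.
        occluded = sims.max() < occ_thresh
        est = self.particles.T @ self.weights   # weighted mean state
        if not occluded:
            ex, ey = est.astype(int)
            new = color_histogram(frame[ey:ey + h, ex:ex + w])
            self.model = 0.95 * self.model + 0.05 * new  # conservative update
        # 4) Systematic resampling to avoid weight degeneracy.
        pos = (np.arange(self.n) + np.random.rand()) / self.n
        idx = np.searchsorted(np.cumsum(self.weights), pos)
        self.particles = self.particles[np.clip(idx, 0, self.n - 1)]
        self.weights.fill(1.0 / self.n)
        return (int(est[0]), int(est[1]), w, h), occluded

# Usage sketch: initialise on the first frame, then step per frame.
# tracker = PatchParticleFilter(first_frame, box=(120, 80, 40, 60))
# box, occluded = tracker.step(next_frame)
```

A full implementation of the paper's pipeline would additionally fuse motion-vector similarity into the particle weights and re-detect the target via SURF keypoint matching after tracking failure.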
