STL_Siam: Real-time Visual Tracking based on reinforcement guided network
Author(s) -
Shijia Huang,
Luping Wang
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1684/1/012060
Subject(s) - bittorrent tracker , computer science , tracking (education) , artificial intelligence , reliability (semiconductor) , eye tracking , track (disk drive) , baseline (sea) , algorithm , computer vision , reinforcement learning , psychology , pedagogy , power (physics) , physics , quantum mechanics , operating system , oceanography , geology
In recent years, deep visual tracking algorithms based on Siamese networks have made great breakthroughs in both speed and accuracy. However, because Siamese networks depend heavily on the target template, these trackers are prone to drift or even lose the target in complex tracking environments. In this work, we guide the Siamese network with an STLNet model and propose the STL_Siam method. The STLNet, trained offline on a dataset augmented with the ROMIX method, is introduced into the SiamGrad algorithm to infer the movement of the target during online tracking and to guide the SiamGrad network. To evaluate the reliability of the proposed algorithm, we conducted experiments on two benchmarks, OTB2013 and VOT2016. The algorithm achieves excellent performance, and results on sequences with complex occlusion interference are slightly improved. Experiments show that the proposed architecture runs at over 40 fps and reaches a precision of 0.865 on OTB2013, which is higher than that of ACT. Meanwhile, compared with the baseline algorithm ACT, the A, R, and EAO of the proposed approach increase by 5.6%, 1%, and 0.3% respectively on VOT2016.
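
The following is a minimal sketch, in Python, of the guided tracking loop the abstract describes: an offline-trained motion-inference module supplies a movement estimate that steers the Siamese tracker's search at each frame. All class and method names (STLNet, SiamGradTracker, predict_motion, track) and the constant-velocity motion guess are hypothetical placeholders introduced for illustration; the paper does not publish this API, and the real STLNet and SiamGrad networks are learned models rather than the simple stand-ins used here.

import numpy as np

class STLNet:
    """Placeholder for the offline-trained motion-inference model."""
    def predict_motion(self, history):
        # Infer the target's displacement from recent box centers.
        # Assumption: a naive constant-velocity guess stands in for the
        # learned movement prediction.
        if len(history) < 2:
            return np.zeros(2)
        return history[-1] - history[-2]

class SiamGradTracker:
    """Placeholder Siamese tracker whose search is shifted by the guidance."""
    def __init__(self, init_box):
        self.box = np.asarray(init_box, dtype=float)  # [cx, cy, w, h]
    def track(self, frame, guidance):
        # In the real tracker, a Siamese network correlates the target
        # template with a search region centered on the guided position.
        self.box[:2] += guidance
        return self.box.copy()

def run(frames, init_box):
    stl, tracker = STLNet(), SiamGradTracker(init_box)
    history = [np.asarray(init_box[:2], dtype=float)]
    results = []
    for frame in frames:
        guidance = stl.predict_motion(history)   # STLNet infers the movement
        box = tracker.track(frame, guidance)     # guided Siamese tracking
        history.append(box[:2])
        results.append(box)
    return results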
