Open Access
Online Multiple Object Tracking with Reid Feature Extraction Network and Similarity Matrix Network
Author(s) - Qingge Ji, Haoqiang Yu
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1544/1/012147
Subject(s) - similarity (geometry) , artificial intelligence , computer science , pattern recognition (psychology) , matching (statistics) , euclidean distance , object (grammar) , feature extraction , video tracking , association (psychology) , feature (linguistics) , data mining , frame (networking) , tracking (education) , image (mathematics) , mathematics , statistics , psychology , telecommunications , pedagogy , philosophy , linguistics , epistemology
In multiple object tracking (MOT), data association is a crucial step. By constructing a similarity cost matrix between trajectories and detections, the two sets can be matched using the Hungarian algorithm. However, the similarity cost is typically obtained by computing the Euclidean distance, or another handcrafted distance metric, between features extracted from objects, which may not be robust enough and can lead to matching inaccuracy. In this paper, we propose a novel MOT method that applies deep learning to both feature extraction and data association. We first design an appearance feature extraction network (AFN) that learns effective features by training on a large-scale person re-identification (ReID) dataset. We then propose a similarity matrix estimation network (SMN) that produces reliable similarities, trained on the public MOT Challenge dataset MOT17. Additionally, the similarity matrix output by the SMN includes dummy objects, which handle the association problems of objects disappearing and newly appearing between frames. Finally, our proposed MOT method is evaluated on MOT15 and MOT17, and an ablation study is carried out.
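To illustrate the association step the abstract describes, here is a minimal sketch of Hungarian matching on a trajectory-detection cost matrix, with dummy rows and columns standing in for the paper's dummy objects so a trajectory can go unmatched (object missing) or a detection can start a new track (object appearing). The `unmatched_cost` threshold and the padding scheme are illustrative assumptions, not the paper's exact formulation, and SciPy's solver stands in for whatever Hungarian implementation the authors used.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian algorithm solver

def associate(cost, unmatched_cost=0.5):
    """Match T trajectories to D detections given a T x D cost matrix.

    The matrix is padded to (T+D) x (T+D) with dummy entries priced at
    `unmatched_cost`, so the solver may assign a track or detection to a
    dummy instead of forcing a bad real match. (Hypothetical padding
    scheme for illustration; not the paper's exact construction.)
    """
    T, D = cost.shape
    padded = np.full((T + D, T + D), unmatched_cost)
    padded[:T, :D] = cost      # real track-to-detection costs
    padded[T:, D:] = 0.0       # dummy-to-dummy assignments are free
    rows, cols = linear_sum_assignment(padded)

    matches, unmatched_tracks, unmatched_dets = [], [], []
    for r, c in zip(rows, cols):
        if r < T and c < D:
            matches.append((r, c))        # real pair matched
        elif r < T:
            unmatched_tracks.append(r)    # track lost (object missing)
        elif c < D:
            unmatched_dets.append(c)      # new detection (object appearing)
    return matches, unmatched_tracks, unmatched_dets

# Three tracks, two detections: track 2 has no cheap match, so it goes
# to a dummy column rather than forcing a high-cost assignment.
cost = np.array([[0.1, 0.9],
                 [0.8, 0.2],
                 [0.95, 0.9]])
m, lost, new = associate(cost)
```

In this sketch, `m` pairs tracks 0 and 1 with detections 0 and 1, while track 2 falls to a dummy and is reported in `lost`; the SMN in the paper plays the role of producing the `cost` entries instead of a handcrafted distance.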
