
Global Contrastive Person Re-identification
Author(s) -
Shengyu Pei,
Xiaoping Fan
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1757/1/012035
Subject(s) - pedestrian , computer science , identification (biology) , artificial intelligence , feature (linguistics) , object (grammar) , embedding , computer vision , pedestrian detection , pattern recognition (psychology) , geography , linguistics , philosophy , botany , archaeology , biology
Solving the problem of pedestrians being occluded by objects is extremely challenging. Describing pedestrian images with part-level features provides fine-grained information, but attending only to local body features discards global pedestrian information, and such networks are costly in time and memory. To address these problems, we propose a new person re-identification network. The network uses a global contrastive module to obtain pedestrian features. By effectively exploiting a pedestrian's global features, together with the pedestrian's identity information and global contrastive information, the network provides a reliable feature embedding even when the pedestrian is occluded by objects. Our model is evaluated on the Market1501, DukeMTMC-reID, CUHK03, and MSMT17 datasets. The experimental results show that our method is effective for occluded person re-identification.
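The abstract does not specify how the global contrastive module aggregates part-level features. As a rough illustration of the general idea, the sketch below (all names, feature sizes, and the mean-pooling aggregation are assumptions, not the paper's method) builds a global descriptor from part-level features and compares identities by cosine similarity, so that an occluded view of the same pedestrian still scores higher than a different pedestrian.

```python
import numpy as np

def global_feature(part_features):
    # Aggregate part-level features into one global descriptor.
    # Mean pooling is a placeholder; the paper's aggregation is not specified.
    g = part_features.mean(axis=0)
    return g / np.linalg.norm(g)

def contrastive_score(anchor, other):
    # Cosine similarity between L2-normalised global embeddings.
    return float(anchor @ other)

rng = np.random.default_rng(0)
parts_a = rng.normal(size=(6, 128))                   # 6 parts, 128-d (hypothetical sizes)
parts_b = parts_a + 0.05 * rng.normal(size=(6, 128))  # same identity, slight variation
parts_c = rng.normal(size=(6, 128))                   # a different identity

ga, gb, gc = map(global_feature, (parts_a, parts_b, parts_c))
# The matching pair should score higher than the non-matching pair.
assert contrastive_score(ga, gb) > contrastive_score(ga, gc)
```

In a real re-identification model the part features would come from a CNN backbone and the contrastive signal would be a training loss over batches, not a single pairwise comparison.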