Open Access
SAN-GAL: Spatial Attention Network Guided by Attribute Label for Person Re-identification
Author(s) - Shaoqi Hou, Chunhui Liu, Kangning Yin, Yiyin Ding, Zhiguo Wang, Guangqiang Yin
Publication year - 2021
Publication title - Wireless Communications and Mobile Computing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.42
H-Index - 64
eISSN - 1530-8677
pISSN - 1530-8669
DOI - 10.1155/2021/7557361
Subject(s) - computer science , attention network , adaptability , rank , artificial intelligence , identification , flexibility , data mining , similarity , matching , pattern recognition , machine learning , image
Person Re-identification (Re-ID) aims to match the same pedestrian across different times and places. Under cross-device conditions, different pedestrians may appear highly similar, so matching on global pedestrian features alone often performs poorly. To address these problems, we design a Spatial Attention Network Guided by Attribute Label (SAN-GAL), a dual-trace network that performs both attribute classification and Re-ID. Unlike previous approaches that simply add an attribute binary-classification branch, SAN-GAL proceeds in two connected steps. First, with attribute labels as guidance, we generate an Attribute Attention Heat map (AAH) via the Grad-CAM algorithm to accurately locate fine-grained attribute regions of pedestrians. Then, an Attribute Spatial Attention Module (ASAM) is constructed from the AAH, which serves as prior knowledge and is introduced into the Re-ID network to assist discrimination in the Re-ID task. Notably, SAN-GAL integrates local attribute information and global ID information of pedestrians without requiring additional attribute-region annotation, giving it good flexibility and adaptability. Experiments on Market1501 and DukeMTMC-reID show that SAN-GAL achieves good results, reaching 85.8% Rank-1 accuracy on the DukeMTMC-reID dataset, which is clearly competitive with most Re-ID algorithms.
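The abstract describes two steps but gives no implementation details, so the following is only an illustrative sketch: a standard Grad-CAM heatmap (channel activations weighted by spatially averaged gradients of an attribute score, then ReLU), and a hypothetical residual-style reweighting standing in for the ASAM combination. The function names, the `1 + heatmap` weighting scheme, and the toy shapes are assumptions, not the authors' actual architecture.

```python
import numpy as np

def grad_cam_heatmap(activations, gradients):
    """Grad-CAM: weight each channel's activation map A_k by the
    spatially averaged gradient alpha_k of the attribute score,
    then apply ReLU.  activations, gradients: shape (C, H, W)."""
    weights = gradients.mean(axis=(1, 2))        # alpha_k, shape (C,)
    cam = np.tensordot(weights, activations, 1)  # sum_k alpha_k * A_k -> (H, W)
    return np.maximum(cam, 0.0)                  # ReLU

def attribute_spatial_attention(features, heatmap):
    """Hypothetical ASAM-style combination: normalize the attribute
    heat map to [0, 1] and reweight the Re-ID feature map in residual
    style, emphasizing attribute regions while keeping global cues."""
    h = heatmap - heatmap.min()
    if h.max() > 0:
        h = h / h.max()
    return features * (1.0 + h[None, :, :])

# Toy example: activations peak at the center cell (an attribute region),
# so the resulting AAH boosts the Re-ID features there.
acts = np.zeros((4, 3, 3))
acts[:, 1, 1] = 1.0
grads = np.ones((4, 3, 3))
aah = grad_cam_heatmap(acts, grads)
out = attribute_spatial_attention(np.ones((8, 3, 3)), aah)
```

Because the heatmap is normalized before reweighting, attended regions are at most doubled and unattended regions pass through unchanged, which matches the abstract's claim that local attribute cues are integrated without discarding global ID information.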
