Relation Aware Attention for Person Re-identification
Author(s) -
Junyu Song,
Kaifang Li,
Guancheng Hui,
Miaohui Zhang
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2010/1/012130
Subject(s) - discriminative model , computer science , artificial intelligence , feature (linguistics) , identification (biology) , representation (politics) , relation (database) , benchmark (surveying) , pattern recognition (psychology) , matching (statistics) , machine learning , mathematics , data mining , geography , philosophy , linguistics , botany , politics , political science , law , biology , statistics , geodesy
Person re-identification (Re-ID) aims to match people across camera views deployed at different locations. It is challenging because person images suffer from background clutter, pose variations, illumination changes, etc. Attention mechanisms have become attractive for person re-identification algorithms because they strengthen discriminative features, which accords with the purpose of person Re-ID, i.e., learning discriminative features for different pedestrians. Previous approaches mostly learn attention with local convolutions, which have limited receptive fields. In this paper, a Relation Aware Attention (RAA) module is proposed to address this issue. RAA infers attention maps along two dimensions, channel and spatial, which are then multiplied with the feature to produce the output map. For each feature position, RAA harvests its pairwise relationships with all other positions as its response. Furthermore, to capture both the global structure information and the local appearance information, we stack the relations and the feature and learn the final attention with a convolutional model. We design experiments and compare the module with existing benchmarks. The results show that our attention module increases the ability of the feature representation.
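
The abstract describes, for the spatial branch, collecting each position's pairwise relations with all other positions, stacking them with an embedding of the local feature, and learning the attention with a convolutional model. The PyTorch sketch below only illustrates that idea; it is not the authors' implementation, and all module names, channel reductions, and feature-map sizes are assumptions.

import torch
import torch.nn as nn


class SpatialRelationAwareAttention(nn.Module):
    """Illustrative sketch of a spatial relation-aware attention block.

    For every spatial position, its pairwise affinities with all other
    positions (the "relations") are stacked with an embedding of the local
    feature and fed to 1x1 convolutions that predict a per-position attention
    weight, which is multiplied back onto the input feature.
    Reduction ratio and fixed spatial size are assumptions.
    """

    def __init__(self, in_channels, height, width, reduction=8):
        super().__init__()
        self.n = height * width                      # number of spatial positions
        inter = max(in_channels // reduction, 1)

        # embeddings used to build the pairwise relation matrix
        self.theta = nn.Conv2d(in_channels, inter, kernel_size=1)
        self.phi = nn.Conv2d(in_channels, inter, kernel_size=1)

        # embedding of the original feature that is stacked with the relations
        self.embed_feat = nn.Conv2d(in_channels, inter, kernel_size=1)

        # convolutional model mapping [relations ; feature] -> one attention value
        stacked = 2 * self.n + inter
        self.to_attention = nn.Sequential(
            nn.Conv2d(stacked, stacked // reduction, kernel_size=1),
            nn.BatchNorm2d(stacked // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(stacked // reduction, 1, kernel_size=1),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w

        # pairwise relations: relation[i, j] = <theta_i, phi_j>
        t = self.theta(x).flatten(2)                 # (b, inter, n)
        p = self.phi(x).flatten(2)                   # (b, inter, n)
        relation = torch.bmm(t.transpose(1, 2), p)   # (b, n, n)

        # keep both directions for each position: relations from i and to i
        rel = torch.cat([relation, relation.transpose(1, 2)], dim=2)  # (b, n, 2n)
        rel = rel.transpose(1, 2).reshape(b, 2 * n, h, w)

        # stack relations with the embedded local feature
        feat = self.embed_feat(x)                    # (b, inter, h, w)
        attn = torch.sigmoid(self.to_attention(torch.cat([rel, feat], dim=1)))

        # per-position attention in [0, 1], broadcast over channels
        return x * attn


# usage with an assumed 256-channel, 24x8 feature map from a Re-ID backbone
if __name__ == "__main__":
    feat = torch.randn(2, 256, 24, 8)
    raa = SpatialRelationAwareAttention(256, height=24, width=8)
    print(raa(feat).shape)  # torch.Size([2, 256, 24, 8])

A channel-attention branch can be sketched symmetrically by treating channels as the positions whose pairwise relations are gathered; the paper's actual architecture and hyperparameters are not reproduced here.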
