Open Access
Mask‐guided class activation mapping network for person re‐identification
Author(s) -
Lian Sicheng,
Hu Haifeng
Publication year - 2020
Publication title - Electronics Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.375
H-Index - 146
ISSN - 1350-911X
DOI - 10.1049/el.2020.1643
Subject(s) - computer science, pooling, artificial intelligence, invariant (physics), identification (biology), channel (broadcasting), binary number, pattern recognition (psychology), convolution (computer science), class (philosophy), algorithm, artificial neural network, mathematics, arithmetic, computer network, botany, mathematical physics, biology
In this Letter, the authors propose a novel mask‐guided class activation mapping (MCAM) network for person re‐identification, which learns background‐invariant and view‐invariant features. Specifically, a novel loss function named mask‐guided mapping loss is formulated to exploit human binary masks, which contain useful body‐shape information, as the reference standard, thereby guiding the model to place more emphasis on human body regions. Moreover, they propose a new weighted channel attention (WCA) module, which replaces global average pooling with a global depthwise convolution layer. By virtue of this WCA module, the feature information distributed across the spatial dimensions can be individually weighted and dynamically compressed into a more precise channel attention map. Extensive experiments have been carried out on three widely used re‐identification datasets. Compared with the baseline model, MCAM achieves rank‐1 accuracy improvements of 2.0% on Market‐1501, 6.0% on DukeMTMC‐reID, and 7.5% on CUHK03‐NP, confirming its effectiveness.
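To make the WCA idea from the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation. It follows only what the abstract states: global average pooling is replaced by a global depthwise convolution (one learnable kernel per channel covering the whole spatial extent), whose output is squeezed into a channel attention map. The SE-style bottleneck, the reduction ratio, the feature-map sizes, and the binary-cross-entropy form of the mask-guided mapping loss are all assumptions made here for illustration, since the Letter's exact formulas are not given in this record.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightedChannelAttention(nn.Module):
    """Sketch of a weighted channel attention (WCA) block.

    Instead of global average pooling, a global depthwise convolution
    (kernel size = full spatial extent, groups = channels) learns a
    per-location weight for every channel, so spatial information is
    individually weighted before being compressed into a channel
    attention map. The SE-style bottleneck below is an assumption.
    """

    def __init__(self, channels, spatial_size, reduction=16):
        super().__init__()
        # Global depthwise convolution: one learnable HxW kernel per channel,
        # producing a 1x1 output per channel (a weighted spatial pooling).
        self.global_dw_conv = nn.Conv2d(
            channels, channels, kernel_size=spatial_size, groups=channels, bias=False
        )
        # Bottleneck that maps the pooled vector to per-channel attention weights.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        pooled = self.global_dw_conv(x).view(b, c)   # weighted spatial compression
        attn = self.fc(pooled).view(b, c, 1, 1)      # channel attention map
        return x * attn                              # re-weight input features


def mask_guided_mapping_loss(cam, mask):
    """Hypothetical stand-in for the mask-guided mapping loss: penalise
    disagreement between the normalised class activation map and the
    binary body mask so activations concentrate on body regions."""
    cam = torch.sigmoid(cam)                                   # squash CAM to [0, 1]
    mask = F.interpolate(mask, size=cam.shape[-2:], mode="nearest")
    return F.binary_cross_entropy(cam, mask)


# Usage example with assumed sizes (e.g. a 2048-channel 24x8 backbone feature map).
feat = torch.randn(4, 2048, 24, 8)
wca = WeightedChannelAttention(2048, spatial_size=(24, 8))
out = wca(feat)   # same shape as feat, channel-reweighted
```

Because the global depthwise kernel spans the full spatial grid, it reduces to global average pooling when all its weights are equal; the learnable weights are what let spatial locations contribute unequally to the channel descriptor.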
