Open Access
A two‐branch network with pyramid‐based local and spatial attention global feature learning for vehicle re‐identification
Author(s) - Yang Jucheng, Xing Di, Hu Zhiqiang, Yao Tong
Publication year - 2021
Publication title - CAAI Transactions on Intelligence Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.613
H-Index - 15
ISSN - 2468-2322
DOI - 10.1049/cit2.12001
Subject(s) - pooling , pyramid (geometry) , artificial intelligence , feature (linguistics) , discriminative model , pattern recognition (psychology) , identification (biology) , computer science , feature extraction , feature learning , computer vision , mathematics , linguistics , philosophy , botany , geometry , biology
Abstract In recent years, vehicle re‐identification has attracted increasing attention. Learning discriminative information from multi‐view vehicle images remains one of the challenging problems in the vehicle re‐identification field: when the viewpoint changes, features visible in one image may be absent from another. A two‐branch network with pyramid‐based local and spatial attention global feature learning (PSA) is proposed for vehicle re‐identification to address this issue. Specifically, one branch learns local features at different scales by building a coarse‐to‐fine pyramid, and the other branch learns attentive global features through a spatial attention module. Pooling is then applied, using global maximum pooling (GMP) for the local features and global average pooling (GAP) for the global feature. Finally, the local feature vectors and the global feature vector extracted from the last pooling layers are employed for identity re‐identification. Experimental results demonstrate that the proposed method achieves state‐of‐the‐art results on the VeRi‐776 and VehicleID datasets.
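The abstract describes the overall two‐branch structure: a pyramid branch that splits the feature map into progressively finer local parts pooled with GMP, and a global branch that applies spatial attention before GAP. Below is a minimal PyTorch sketch of that structure. The backbone choice (ResNet‐50), the pyramid levels (1, 2 and 4 horizontal parts), the attention layout, and all class and parameter names are assumptions for illustration only, not the authors' implementation; only the branch/pooling arrangement comes from the abstract.

```python
# Minimal sketch of a PSA-style two-branch re-ID network (assumptions noted above).
import torch
import torch.nn as nn
from torchvision.models import resnet50

class SpatialAttention(nn.Module):
    """Simple spatial attention: a 1-channel mask built from channel statistics (assumed form)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)           # channel-wise average map
        mx, _ = x.max(dim=1, keepdim=True)          # channel-wise max map
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                             # re-weight spatial locations

class PSASketch(nn.Module):
    def __init__(self, num_ids=576, levels=(1, 2, 4)):
        super().__init__()
        backbone = resnet50(weights=None)
        self.stem = nn.Sequential(*list(backbone.children())[:-2])  # keep the conv feature map
        self.levels = levels                        # pyramid: coarse -> fine part counts
        self.attn = SpatialAttention()
        feat_dim = 2048
        n_parts = sum(levels)                       # one classifier per local part + one global
        self.classifiers = nn.ModuleList(
            nn.Linear(feat_dim, num_ids) for _ in range(n_parts + 1))

    def forward(self, x):
        fmap = self.stem(x)                         # (B, 2048, H, W)
        feats = []
        # Local branch: split the map into horizontal stripes at each pyramid level,
        # then apply global maximum pooling (GMP) to each stripe.
        for n in self.levels:
            for stripe in fmap.chunk(n, dim=2):
                feats.append(torch.amax(stripe, dim=(2, 3)))
        # Global branch: spatial attention followed by global average pooling (GAP).
        feats.append(self.attn(fmap).mean(dim=(2, 3)))
        logits = [clf(f) for clf, f in zip(self.classifiers, feats)]
        return feats, logits                        # feature vectors for re-ID, logits for training

# Usage: at test time the pooled vectors would typically be concatenated as the re-ID descriptor.
model = PSASketch()
vectors, _ = model(torch.randn(2, 3, 224, 224))
descriptor = torch.cat(vectors, dim=1)              # (2, 2048 * 8) under the assumed pyramid levels
```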
