
Category Relevance Redirection Network for Few-Shot Classification
Author(s) -
Xiangtao Tian,
Zhengli Zhao,
Peng Jiang,
Jianxin Wu,
Guoqiang Zhong
Publication year - 2022
Publication title -
IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/access.2022.3199003
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
Abstract - Metric learning has been successfully applied to few-shot learning by learning a deep embedding metric space. However, many methods ignore the relationship between the support set and the query set. In this paper, we propose the category relevance redirection network (CRRNet), a novel metric learning-based network for few-shot classification. CRRNet is trained in two stages. In the category relevance pre-training stage, the convolutional block attention module (CBAM) and the classifier are trained jointly with the center loss and the cross-entropy loss. In the category redirection training stage, the meta fusion block and the weight generator produce weighted query-set features, which are then fed into the spatial attention generator to obtain discriminative features for classification. CRRNet not only learns the category-relevant characteristics of the support-set samples but also emphasizes the category-relevant characteristics of the query samples in the current task. Experiments on three datasets show that CRRNet substantially improves the embedding quality and classification accuracy of metric-based few-shot classification. In the 5-way 1-shot and 5-way 5-shot settings on the miniImageNet dataset, CRRNet yields 9.68% and 12.42% improvements over traditional methods, respectively.
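To make the two-stage procedure described in the abstract more concrete, the following is a minimal PyTorch-style sketch. It is not the authors' implementation: the module names (SpatialAttentionGenerator, episode_logits), the backbone's return_map flag, the loss weight lambda_center, and the prototype-based "weight generator" stand-in are all illustrative assumptions about how such a pipeline could be wired together.

```python
# A minimal sketch of the two training stages, under assumed module designs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CenterLoss(nn.Module):
    """Pulls each embedding toward a learnable center of its class."""
    def __init__(self, num_classes, feat_dim):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features, labels):
        return ((features - self.centers[labels]) ** 2).sum(dim=1).mean()


class SpatialAttentionGenerator(nn.Module):
    """Produces a per-location attention map over a feature map (assumed design)."""
    def __init__(self, in_channels):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, feat_map):                       # feat_map: (B, C, H, W)
        attn = torch.sigmoid(self.conv(feat_map))      # (B, 1, H, W)
        return feat_map * attn                         # re-weighted features


# Stage 1: jointly pre-train the embedding (backbone with CBAM-style attention)
# and a linear classifier using cross-entropy loss plus center loss.
def pretrain_step(backbone, classifier, center_loss, images, labels,
                  optimizer, lambda_center=0.1):       # lambda_center is assumed
    feats = backbone(images)                           # (B, feat_dim)
    logits = classifier(feats)
    loss = F.cross_entropy(logits, labels) + lambda_center * center_loss(feats, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Stage 2 (sketch): weight the query features by their relevance to the support
# classes, then apply spatial attention before nearest-prototype classification.
def episode_logits(backbone, attention_gen, support_imgs, support_labels,
                   query_imgs, n_way):
    support_maps = backbone(support_imgs, return_map=True)  # (N*K, C, H, W); flag assumed
    query_maps = backbone(query_imgs, return_map=True)      # (Q, C, H, W)

    # Class prototypes from pooled support features.
    support_vecs = support_maps.mean(dim=(2, 3))             # (N*K, C)
    protos = torch.stack([support_vecs[support_labels == c].mean(0)
                          for c in range(n_way)])            # (N, C)

    # Stand-in for the weight generator: soft relevance of each query to each class.
    query_vecs = query_maps.mean(dim=(2, 3))                 # (Q, C)
    weights = F.softmax(-torch.cdist(query_vecs, protos), dim=1)  # (Q, N)

    # Fuse the relevant class direction into each query map, then apply
    # spatial attention to obtain discriminative features.
    fused = query_maps + (weights @ protos)[:, :, None, None]
    discriminative = attention_gen(fused).mean(dim=(2, 3))   # (Q, C)
    return -torch.cdist(discriminative, protos)              # higher = closer
```

In this sketch, the prototype-distance softmax plays the role the abstract assigns to the meta fusion block and weight generator; the paper's actual modules are learned components and may differ considerably from this simplification.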