Article
Version 1 · Preserved in Portico · This version is not peer-reviewed
Improving RENet by Introducing Modified Cross Attention for Few-Shot Classification
Version 1: Received: 28 May 2022 / Approved: 6 June 2022 / Online: 6 June 2022 (09:42:13 CEST)
How to cite: Chang, C.-H.; Yu, T.-L. Improving RENet by Introducing Modified Cross Attention for Few-Shot Classification. Preprints 2022, 2022060079. https://doi.org/10.20944/preprints202206.0079.v1
Abstract
Few-shot classification is challenging because the goal is to classify unlabeled samples when only a handful of labeled samples are provided. It has been shown that cross attention helps generate more discriminative features for few-shot learning. This paper extends that idea and proposes two cross attention modules: the cross scaled attention (CSA) and the cross aligned attention (CAA). Specifically, CSA scales different feature maps so they are better matched, while CAA applies principal component analysis to further align features from different images. Experiments show that both CSA and CAA achieve consistent improvements over state-of-the-art methods on four widely used few-shot classification benchmarks (miniImageNet, tieredImageNet, CIFAR-FS, and CUB-200-2011); CSA is slightly faster, while CAA achieves higher accuracy.
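To make the cross-attention idea the abstract builds on concrete, here is a minimal NumPy sketch of generic cross attention between the feature maps of two images, with an optional scaling factor standing in for the kind of rescaling CSA motivates. All function names and shapes here are illustrative assumptions; the exact CSA/CAA formulations are defined in the paper itself.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feat, support_feat, scale=None):
    """Generic cross attention between two images' feature maps.

    query_feat, support_feat: (HW, C) flattened spatial feature maps.
    scale: correlation scaling factor; defaults to 1/sqrt(C).
           (Illustrative only -- CSA's actual scaling differs.)
    """
    c = query_feat.shape[-1]
    if scale is None:
        scale = 1.0 / np.sqrt(c)
    # Correlation between every spatial position of the two maps
    corr = query_feat @ support_feat.T * scale   # (HW_q, HW_s)
    attn = softmax(corr, axis=-1)
    # Reweight support features by how well they match each query position
    return attn @ support_feat                   # (HW_q, C)

# Toy example: two 4x4 feature maps with 8 channels each
rng = np.random.default_rng(0)
q = rng.standard_normal((16, 8))
s = rng.standard_normal((16, 8))
out = cross_attention(q, s)
print(out.shape)  # (16, 8)
```

The output keeps the query's spatial layout but mixes in support-image features wherever the two maps correlate, which is what makes the resulting features more discriminative for matching a query against few labeled supports.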
Keywords
few-shot classification; attention
Subject
Computer Science and Mathematics, Computer Science
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.