Boosting Few-Shot Classification with Lie Group Contrastive Learning

Cited by: 0
Authors
He, Feihong [1 ]
Li, Fanzhang [1 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Few-shot learning; Contrastive learning; Lie group;
DOI
10.1007/978-3-031-44207-0_9
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Few-shot learning can alleviate the issue of sample scarcity; however, a certain degree of overfitting remains. Prior work addresses this problem by combining contrastive learning with few-shot learning, usually constructing sample pairs through traditional data augmentation. Traditional augmentation methods, however, struggle to fit the real sample distribution. In this paper, our method employs Lie group transformations for data augmentation, leading the model to learn more discriminative feature representations. Moreover, we exploit the congruence between contrastive learning and few-shot learning with respect to their classification objectives, and we incorporate an attention mechanism into the model: the attention module obtained through contrastive learning improves few-shot performance. Inspired by the contrastive loss, we add a penalty term to the few-shot classification loss that regulates the similarity between in-class and out-of-class samples. We conduct experiments with two different feature extraction networks on the standard few-shot image classification benchmarks miniImageNet and tieredImageNet. The results show that the proposed method effectively improves few-shot classification performance.
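The abstract's central idea, augmenting samples with Lie group transformations rather than ad hoc perturbations, can be illustrated with a minimal sketch. The record does not specify which group the authors use, so the example below assumes rotations from SO(2) acting on 2-D points; the function names `so2_rotation` and `lie_group_augment` are hypothetical illustrations, not the paper's code. A group element is obtained from the exponential map of the Lie algebra generator, which for so(2) has the familiar closed-form rotation matrix.

```python
import numpy as np

def so2_rotation(theta):
    """Element of the Lie group SO(2) via the exponential map exp(theta * G),
    where G = [[0, -1], [1, 0]] generates the Lie algebra so(2).
    For so(2) the matrix exponential reduces to the closed-form rotation matrix."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def lie_group_augment(points, rng):
    """Augment a batch of 2-D points (shape (N, 2)) by a random group
    action x -> R @ x, sampling the algebra parameter theta uniformly."""
    theta = rng.uniform(-np.pi, np.pi)
    return points @ so2_rotation(theta).T
```

In a contrastive setup, two such augmentations of the same sample would form a positive pair; the penalty term the abstract mentions would then act analogously, pushing in-class similarities up and out-of-class similarities down in the few-shot loss.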
Pages: 99-111
Page count: 13