Local feature graph neural network for few-shot learning

Cited by: 1
Authors
Weng P. [1 ]
Dong S. [1 ]
Ren L. [2 ]
Zou K. [1 ]
Affiliations
[1] Zhongshan Institute, University of Electronic Science and Technology of China, Zhongshan, Guangdong
[2] Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan
Funding
National Natural Science Foundation of China
Keywords
Few-shot learning; Graph neural network; Local feature
DOI
10.1007/s12652-023-04545-5
Abstract
The few-shot learning method based on local feature attention can suppress irrelevant distractions in the global information and extract discriminative features. However, empirically defining the relationships between local features cannot fully exploit the power of local feature attention. This paper proposes a local feature graph neural network model (LFGNN), which uses a GNN to automatically extract and aggregate the relationships between different local parts, yielding features with stronger expressive ability for classification. Specifically, a sparse hierarchical connectivity graph is proposed to describe the relationships between features, in which the global features of all samples in the support and query sets are connected pairwise, while the local features of each sample are connected only to the corresponding global feature. Further, a multiple node-edge aggregation strategy is developed to learn a similarity metric. By integrating the edge loss with the classification loss, LFGNN learns a better classifier for distinguishing samples of novel classes. We conducted extensive experiments under the 5-way 1-shot and 5-way 5-shot settings on two benchmark datasets, miniImageNet and tieredImageNet. Experimental results demonstrate that the proposed approach effectively boosts the performance of meta-learning-based few-shot classification. © 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
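Below is a minimal sketch (assuming PyTorch) of the sparse hierarchical connectivity graph described in the abstract: the global nodes of all support and query samples are connected pairwise, while each sample's local-feature nodes attach only to their own global node. The function name build_adjacency, the node layout, and the episode sizes are illustrative assumptions, not details taken from the paper.

    import torch

    def build_adjacency(n_samples: int, n_local: int) -> torch.Tensor:
        """Binary adjacency over (1 + n_local) * n_samples nodes.

        Hypothetical per-sample layout: one global node followed by
        its n_local local-feature nodes.
        """
        stride = 1 + n_local
        n_nodes = stride * n_samples
        adj = torch.zeros(n_nodes, n_nodes)
        global_idx = torch.arange(n_samples) * stride
        # Global features of all support/query samples: pairwise connections.
        adj[global_idx.unsqueeze(1), global_idx.unsqueeze(0)] = 1.0
        for g in global_idx.tolist():
            # Local features attach only to their own sample's global node.
            local_idx = torch.arange(g + 1, g + stride)
            adj[g, local_idx] = 1.0
            adj[local_idx, g] = 1.0
        adj.fill_diagonal_(0.0)  # drop the self-loops introduced above
        return adj

    # Example: a 5-way 1-shot episode plus one query image (6 samples),
    # each contributing 4 local features -> 6 * (1 + 4) = 30 nodes.
    A = build_adjacency(n_samples=6, n_local=4)
    assert A.shape == (30, 30) and torch.equal(A, A.t())  # sparse, symmetric

The abstract only states that the edge loss is integrated with the classification loss; a common formulation (assumed here, not confirmed by the abstract) would be a weighted sum, e.g. loss = cls_loss + lam * edge_loss.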
Pages: 4343 - 4354
Page count: 11
Related papers
50 records in total
  • [41] FCGCN: Feature Correlation Graph Convolution Network for Few-Shot Individual Identification
    Feng, Zhongming
    Zha, Haoran
    Xu, Congan
    He, Yuanzhi
    Lin, Yun
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (01) : 2848 - 2860
  • [42] GRAPH AFFINITY NETWORK FOR FEW-SHOT SEGMENTATION
    Luo, Xiaoliu
    Zhang, Taiping
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 609 - 613
  • [43] Word Embedding Distribution Propagation Graph Network for Few-Shot Learning
    Zhu, Chaoran
    Wang, Ling
    Han, Cheng
    SENSORS, 2022, 22 (07)
  • [44] Graph Few-shot Learning with Attribute Matching
    Wang, Ning
    Luo, Minnan
    Ding, Kaize
    Zhang, Lingling
    Li, Jundong
    Zheng, Qinghua
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 1545 - 1554
  • [45] Multi-level Attention Feature Network for Few-shot Learning
    Wang Ronggui
    Han Mengya
    Yang Juan
    Xue Lixia
    Hu Min
    JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2020, 42 (03) : 772 - 778
  • [47] Tensor feature hallucination for few-shot learning
    Lazarou, Michalis
    Stathaki, Tania
    Avrithis, Yannis
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 2050 - 2060
  • [48] An Adaptive Dual-channel Multi-modal graph neural network for few-shot learning
    Yang, Jieyi
    Dong, Yihong
    Li, Guoqing
    KNOWLEDGE-BASED SYSTEMS, 2025, 310
  • [49] Scale-Aware Graph Neural Network for Few-Shot Semantic Segmentation
    Xie, Guo-Sen
    Liu, Jie
    Xiong, Huan
    Shao, Ling
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 5471 - 5480
  • [50] Research Progress of Few-Shot Learning Methods Based on Graph Neural Networks
    Yang J.
    Dong Y.
    Qian J.
Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2024, 61 (04): 856 - 876