Transductive Information Maximization For Few-Shot Learning

Citations: 0
Authors
Boudiaf, Malik [1 ]
Masud, Ziko Imtiaz [1 ]
Rony, Jerome [1 ]
Dolz, Jose [1 ]
Piantanida, Pablo [2 ]
Ben Ayed, Ismail [1 ]
Affiliations
[1] ETS Montreal, Montreal, PQ, Canada
[2] Univ Paris Saclay, Cent Supelec CNRS, Gif Sur Yvette, France
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
We introduce Transductive Information Maximization (TIM) for few-shot learning. Our method maximizes the mutual information between the query features and their label predictions for a given few-shot task, in conjunction with a supervision loss based on the support set. Furthermore, we propose a new alternating-direction solver for our mutual-information loss, which substantially speeds up transductive-inference convergence over gradient-based optimization while yielding similar accuracy. TIM inference is modular: it can be used on top of any base-training feature extractor. Following standard transductive few-shot settings, our comprehensive experiments demonstrate that TIM significantly outperforms state-of-the-art methods across various datasets and networks when used on top of a fixed feature extractor trained with simple cross-entropy on the base classes, without resorting to complex meta-learning schemes. It consistently brings a 2% to 5% improvement in accuracy over the best-performing method, not only on all the well-established few-shot benchmarks but also in more challenging scenarios with domain shifts and larger numbers of classes.
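The mutual-information term described in the abstract decomposes into a marginal-entropy term over the averaged query predictions (encouraging balanced class assignments) minus a conditional-entropy term (encouraging confident per-sample predictions), combined with cross-entropy on the labelled support set. The sketch below is a minimal PyTorch rendering of such an objective; the function name tim_loss and the weights alpha and lam are illustrative placeholders, not the authors' exact formulation or hyper-parameters.

```python
# Minimal sketch of a TIM-style transductive objective (PyTorch).
# Assumption: names (tim_loss, alpha, lam) are illustrative, not from the paper's code.
import torch
import torch.nn.functional as F

def tim_loss(support_logits, support_labels, query_logits, alpha=0.1, lam=1.0):
    """Support cross-entropy minus a weighted mutual-information estimate
    over the query predictions; the returned scalar is minimized."""
    # Supervision term: cross-entropy on the labelled support samples.
    ce = F.cross_entropy(support_logits, support_labels)

    # Soft predictions for the unlabelled query samples, shape (n_query, n_classes).
    probs = query_logits.softmax(dim=1)

    # Conditional entropy H(Y|X): mean entropy of the per-sample predictions.
    cond_ent = -(probs * torch.log(probs + 1e-12)).sum(dim=1).mean()

    # Marginal entropy H(Y): entropy of the average prediction over the query set.
    marginal = probs.mean(dim=0)
    marg_ent = -(marginal * torch.log(marginal + 1e-12)).sum()

    # Mutual information I(X; Y) ≈ H(Y) - H(Y|X); subtracting it from the loss
    # maximizes it during minimization.
    return lam * ce - (marg_ent - alpha * cond_ent)
```

In a gradient-based variant, a loss of this form would be minimized over the task-specific classifier weights while the pretrained feature extractor stays fixed; the alternating-direction solver mentioned in the abstract replaces that gradient loop to speed up convergence at similar accuracy.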
Pages: 13
Related Papers
Total: 50 records
  • [1] Transductive distribution calibration for few-shot learning
    Li, Gang
    Zheng, Changwen
    Su, Bing
    Neurocomputing, 2022, 500 : 604 - 615
  • [2] Realistic Evaluation of Transductive Few-Shot Learning
    Veilleux, Olivier
    Boudiaf, Malik
    Piantanida, Pablo
    Ben Ayed, Ismail
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [3] Adaptive multi-scale transductive information propagation for few-shot learning
    Fu, Sichao
    Liu, Baodi
    Liu, Weifeng
    Zou, Bin
    You, Xinhua
    Peng, Qinmu
    Jing, Xiao-Yuan
    KNOWLEDGE-BASED SYSTEMS, 2022, 249
  • [4] EASE: Unsupervised Discriminant Subspace Learning for Transductive Few-Shot Learning
    Zhu, Hao
    Koniusz, Piotr
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 9068 - 9078
  • [5] Relation fusion propagation network for transductive few-shot learning
    Huang, Yixiang
    Hao, Hongyu
    Ge, Weichao
    Cao, Yang
    Wu, Ming
    Zhang, Chuang
    Guo, Jun
    PATTERN RECOGNITION, 2024, 151
  • [6] A transductive learning method to leverage graph structure for few-shot learning
    Wang, Yaning
    Liu, Zijian
    Luo, Yang
    Luo, Chunbo
    PATTERN RECOGNITION LETTERS, 2022, 159 : 189 - 195
  • [7] Adaptive anchor label propagation for transductive few-shot learning
    Lazarou, Michalis
    Avrithis, Yannis
    Ren, Guangyu
    Stathaki, Tania
    2023 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2023, : 331 - 335
  • [8] Transductive clustering optimization learning for few-shot image classification
    Wang, Yi
    Bian, Xiong
    Zhu, Songhao
    JOURNAL OF ELECTRONIC IMAGING, 2023, 32 (04)
  • [9] Transductive Relation-Propagation Network for Few-shot Learning
    Ma, Yuqing
    Bai, Shihao
    An, Shan
    Liu, Wei
    Liu, Aishan
    Zhen, Xiantong
    Liu, Xianglong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 804 - 810