Trapezoidal Step Scheduler for Model-Agnostic Meta-Learning in Medical Imaging

Cited by: 0
Authors
Affiliations
[1] Voon, Wingates
[2] Hum, Yan Chai
[3] Tee, Yee Kai
[4] Yap, Wun-She
[5] Lai, Khin Wee
[6] Nisar, Humaira
[7] Mokayed, Hamam
DOI
10.1016/j.patcog.2024.111316
Abstract
Model-Agnostic Meta-Learning (MAML) is a widely adopted few-shot learning (FSL) method designed to mitigate the dependency of deep learning-based methods on large, labeled datasets in medical imaging analysis. However, MAML's reliance on a fixed number of gradient descent (GD) steps for task adaptation results in computational inefficiency and task-level overfitting. To address this issue, we introduce Tra-MAML, which balances model adaptation capacity against computational cost through a trapezoidal step scheduler (TRA). The TRA scheduler dynamically adjusts the number of GD steps in the inner optimization loop: it first increases the step count uniformly to reduce variance, then holds the maximum number of steps to enhance adaptation capacity, and finally decreases the step count uniformly to mitigate overfitting. Our evaluation of Tra-MAML against selected FSL methods across four medical imaging datasets demonstrates its superior performance. Notably, Tra-MAML outperforms MAML by 13.36% on the BreaKHis40X dataset in the 3-way 10-shot scenario. © 2024 Elsevier Ltd
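The ramp-up / plateau / ramp-down schedule described in the abstract can be sketched as a simple function mapping the meta-training epoch to an inner-loop step count. This is a minimal illustrative sketch, not the paper's implementation: the function name, the linear ramps, and the parameters (`min_steps`, `max_steps`, `warmup_frac`, `cooldown_frac`) are all assumptions; the paper's exact schedule may differ.

```python
def trapezoidal_steps(epoch, total_epochs, min_steps=1, max_steps=5,
                      warmup_frac=0.3, cooldown_frac=0.3):
    """Hypothetical trapezoidal scheduler for inner-loop GD steps.

    Ramps the step count up linearly over the first `warmup_frac` of
    meta-training, holds it at `max_steps` over the middle plateau, and
    ramps it back down over the final `cooldown_frac` of training.
    """
    warmup_end = warmup_frac * total_epochs
    cooldown_start = (1.0 - cooldown_frac) * total_epochs
    if epoch < warmup_end:
        # Rising edge: uniform increase from min_steps toward max_steps.
        frac = epoch / warmup_end
        return round(min_steps + frac * (max_steps - min_steps))
    if epoch < cooldown_start:
        # Plateau: maximum adaptation capacity.
        return max_steps
    # Falling edge: uniform decrease back toward min_steps.
    frac = (total_epochs - 1 - epoch) / (total_epochs - cooldown_start)
    return max(min_steps, round(min_steps + frac * (max_steps - min_steps)))
```

At each meta-training iteration, the returned value would replace MAML's fixed inner-loop step count; e.g. `trapezoidal_steps(0, 100)` gives the minimum at the start, the maximum mid-training, and the minimum again at the end.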
Related papers
50 items in total
  • [1] Probabilistic Model-Agnostic Meta-Learning
    Finn, Chelsea
    Xu, Kelvin
    Levine, Sergey
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [2] Bayesian Model-Agnostic Meta-Learning
    Yoon, Jaesik
    Kim, Taesup
    Dia, Ousmane
    Kim, Sungwoong
    Bengio, Yoshua
    Ahn, Sungjin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [3] Theoretical Convergence of Multi-Step Model-Agnostic Meta-Learning
    Ji, Kaiyi
    Yang, Junjie
    Liang, Yingbin
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [4] Knowledge Distillation for Model-Agnostic Meta-Learning
    Zhang, Min
    Wang, Donglin
    Gai, Sibo
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 1355 - 1362
  • [5] Meta weight learning via model-agnostic meta-learning
    Xu, Zhixiong
    Chen, Xiliang
    Tang, Wei
    Lai, Jun
    Cao, Lei
    NEUROCOMPUTING, 2021, 432 : 124 - 132
  • [6] Combining Model-Agnostic Meta-Learning and Transfer Learning for Regression
    Satrya, Wahyu Fadli
    Yun, Ji-Hoon
    SENSORS, 2023, 23 (02)
  • [7] Task-Robust Model-Agnostic Meta-Learning
    Collins, Liam
    Mokhtari, Aryan
    Shakkottai, Sanjay
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning
    Raymond, Christian
    Chen, Qi
    Xue, Bing
    Zhang, Mengjie
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (11) : 13699 - 13714
  • [9] Uncertainty in Model-Agnostic Meta-Learning using Variational Inference
    Nguyen, Cuong
    Do, Thanh-Toan
    Carneiro, Gustavo
    2020 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2020, : 3079 - 3089