Continual meta-learning algorithm

Cited by: 0
Authors
Mengjuan Jiang
Fanzhang Li
Li Liu
Affiliations
[1] Soochow University, School of Computer Science and Technology
Source
Applied Intelligence | 2022 / Vol. 52
Keywords
Continual meta-learning algorithm; Deep learning; Neural network; Catastrophic forgetting; Meta-learning
DOI
Not available
Abstract
Deep learning has achieved impressive results in many fields. However, these achievements rely on vast amounts of labeled data, and when labeled data are insufficient, over-fitting occurs. Moreover, the real world is so non-stationary that neural networks cannot learn continually the way humans do: learning new tasks causes a significant drop in performance on old tasks. In response to these problems, this paper proposes a new meta-learning-based algorithm, CMLA (Continual Meta-Learning Algorithm). CMLA can not only extract the key features of a sample but also optimize the task-gradient update by introducing a cosine-similarity judgment mechanism. The algorithm is evaluated on miniImageNet and Fewshot-CIFAR100 (Canadian Institute For Advanced Research), and the results clearly demonstrate the effectiveness and superiority of CMLA over other state-of-the-art methods. In particular, compared with MAML (Model-Agnostic Meta-Learning) using the standard four-layer convolutional backbone, accuracy improves by 15.4% on 1-shot and 16.91% on 5-shot tasks under the 5-way setting on miniImageNet. CMLA not only reduces the instability of the adaptation process but also alleviates the stability-plasticity dilemma to a certain extent, achieving the goal of continual learning.
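The cosine-similarity judgment mechanism described in the abstract can be pictured as a gate on per-task gradients in a MAML-style inner loop. The PyTorch sketch below is an illustrative reconstruction, not the authors' published CMLA implementation: the threshold tau, the running-average direction avg_dir, and the accept/skip update rule are all assumptions made for exposition.

```python
# Minimal sketch of a MAML-style inner loop with a cosine-similarity
# "judgment mechanism" on task gradients. Hypothetical reconstruction:
# the threshold `tau` and the gating rule are illustrative assumptions,
# not the published CMLA implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def flat_grad(loss, params):
    """Flatten the gradient of `loss` w.r.t. `params` into one vector."""
    grads = torch.autograd.grad(loss, params)
    return torch.cat([g.reshape(-1) for g in grads])

def cosine_gated_update(model, tasks, inner_lr=0.01, tau=0.0):
    """For each task, compute the support-set gradient and apply it only
    when its direction is consistent (cosine similarity > tau) with the
    running average of previously accepted task gradients, limiting
    interference between new and old tasks."""
    params = [p for p in model.parameters() if p.requires_grad]
    avg_dir = None  # running mean of accepted gradient directions
    for support_x, support_y in tasks:
        loss = F.cross_entropy(model(support_x), support_y)
        g = flat_grad(loss, params)
        if avg_dir is None or F.cosine_similarity(g, avg_dir, dim=0) > tau:
            # Accepted: take a gradient step and fold g into the average.
            offset = 0
            with torch.no_grad():
                for p in params:
                    n = p.numel()
                    p -= inner_lr * g[offset:offset + n].view_as(p)
                    offset += n
            avg_dir = g if avg_dir is None else 0.9 * avg_dir + 0.1 * g
        # Rejected gradients are skipped; a variant could project or
        # damp them instead of discarding them outright.

# Toy usage on a 5-way classifier head with random support sets:
model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 5))
tasks = [(torch.randn(25, 64), torch.randint(0, 5, (25,))) for _ in range(4)]
cosine_gated_update(model, tasks)
```

In this sketch a task gradient is applied only when it is directionally consistent with previously accepted gradients, which is one plausible way such a mechanism could reduce interference between new and old tasks and thus ease the stability-plasticity trade-off.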
Pages: 4527-4542
Page count: 15
Related papers
50 records in total
  • [1] Continual meta-learning algorithm
    Jiang, Mengjuan
    Li, Fanzhang
    Liu, Li
    [J]. APPLIED INTELLIGENCE, 2022, 52 (04) : 4527 - 4542
  • [2] Meta-Learning Representations for Continual Learning
    Javed, Khurram
    White, Martha
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [3] Variational Continual Bayesian Meta-Learning
    Zhang, Qiang
    Fang, Jinyuan
    Meng, Zaiqiao
    Liang, Shangsong
    Yilmaz, Emine
[J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [4] On the Stability-Plasticity Dilemma in Continual Meta-Learning: Theory and Algorithm
    Chen, Qi
    Shui, Changjian
    Han, Ligong
    Marchand, Mario
[J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [5] Visual Tracking by Adaptive Continual Meta-Learning
    Choi, Janghoon
    Baik, Sungyong
    Choi, Myungsub
    Kwon, Junseok
    Lee, Kyoung Mu
    [J]. IEEE ACCESS, 2022, 10 : 9022 - 9035
  • [6] Efficient Meta-Learning for Continual Learning with Taylor Expansion Approximation
    Zou, Xiaohan
    Lin, Tong
[J]. 2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [7] Towards Continual Reinforcement Learning through Evolutionary Meta-Learning
    Grbic, Djordje
    Risi, Sebastian
[J]. PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION (GECCO'19 COMPANION), 2019, : 119 - 120
  • [8] Reconciling meta-learning and continual learning with online mixtures of tasks
    Jerfel, Ghassen
    Grant, Erin
    Griffiths, Thomas L.
    Heller, Katherine
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [9] Continual Quality Estimation with Online Bayesian Meta-Learning
    Obamuyide, Abiola
    Fomicheva, Marina
    Specia, Lucia
    [J]. ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021, : 190 - 197
  • [10] Large-Scale Meta-Learning with Continual Trajectory Shifting
    Shin, JaeWoong
    Lee, Hae Beom
    Gong, Boqing
    Hwang, Sung Ju
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139