Meta-Learning Representations for Continual Learning

Cited by: 0
Authors
Javed, Khurram [1 ]
White, Martha [1 ]
Affiliations
[1] Univ Alberta, Dept Comp Sci, Edmonton, AB T6G 1P8, Canada
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
A continual learning agent should be able to build on existing knowledge to learn quickly from new data while minimizing forgetting. Current intelligent systems based on neural network function approximators arguably do the opposite: they are highly prone to forgetting and are rarely trained to facilitate future learning. One reason for this poor behavior is that they learn from a representation that is not explicitly trained for these two goals. In this paper, we propose OML, an objective that directly minimizes catastrophic interference by learning representations that accelerate future learning and are robust to forgetting under online updates in continual learning. We show that it is possible to learn naturally sparse representations that are more effective for online updating. Moreover, our algorithm is complementary to existing continual learning strategies, such as MER and GEM. Finally, we demonstrate that a basic online updating strategy on representations learned by OML is competitive with rehearsal-based methods for continual learning.
Pages: 11
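
The abstract describes the OML objective only at a high level. Below is a minimal, hypothetical sketch of the kind of meta-objective it refers to, written in a PyTorch style: a representation network (RLN) is meta-learned in the outer loop, while a prediction network (PLN) is updated online over a task trajectory in the inner loop, and the outer loss backpropagates through those online updates. The architecture split follows the paper's description; all shapes, learning rates, names, and the data interface here are illustrative assumptions, not the authors' code.

```python
# Hypothetical OML-style meta-objective sketch (not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Representation network (RLN): meta-learned in the outer loop,
# held fixed during the inner (online) updates.
rln = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                    nn.Linear(256, 128), nn.ReLU())
# Prediction network (PLN): here a single linear layer, updated online.
pln_w = (0.01 * torch.randn(10, 128)).requires_grad_()

meta_opt = torch.optim.Adam(list(rln.parameters()) + [pln_w], lr=1e-4)
inner_lr = 0.01

def meta_step(trajectory, remember_x, remember_y):
    """One meta-update: online SGD on the PLN over a correlated task
    trajectory, then an outer loss on held-out data that penalizes both
    slow learning and forgetting; its gradient flows through the inner
    updates into the RLN."""
    fast_w = pln_w
    for x, y in trajectory:  # one sample (or small batch) at a time
        loss = F.cross_entropy(F.linear(rln(x), fast_w), y)
        (grad,) = torch.autograd.grad(loss, fast_w, create_graph=True)
        fast_w = fast_w - inner_lr * grad  # differentiable SGD step
    meta_loss = F.cross_entropy(F.linear(rln(remember_x), fast_w), remember_y)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
    return meta_loss.item()
```

Meta-training would repeatedly call meta_step on trajectories sampled from a task distribution; at continual-learning time, only the PLN is updated online while the meta-learned RLN stays fixed, which is what makes the resulting representation robust to interference.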