Continual Adaptation of Visual Representations via Domain Randomization and Meta-learning

Cited by: 34
Authors
Volpi, Riccardo [1 ]
Larlus, Diane [1 ]
Rogez, Gregory [1 ]
Affiliation
[1] NAVER LABS Europe, Meylan, France
DOI
10.1109/CVPR46437.2021.00442
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Most standard learning approaches lead to fragile models which are prone to drift when sequentially trained on samples of a different nature: the well-known catastrophic forgetting issue. In particular, when a model consecutively learns from different visual domains, it tends to forget the past domains in favor of the most recent ones. In this context, we show that one way to learn models that are inherently more robust against forgetting is domain randomization: for vision tasks, randomizing the current domain's distribution with heavy image manipulations. Building on this result, we devise a meta-learning strategy where a regularizer explicitly penalizes any loss associated with transferring the model from the current domain to different "auxiliary" meta-domains, while also easing adaptation to them. Such meta-domains are also generated through randomized image manipulations. We empirically demonstrate in a variety of experiments, spanning from classification to semantic segmentation, that our approach results in models that are less prone to catastrophic forgetting when transferred to new domains.
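The abstract's core mechanism, generating auxiliary "meta-domains" by applying heavy random image manipulations to samples from the current domain, can be sketched as follows. This is a minimal illustrative sketch in NumPy under stated assumptions: the particular transforms (grayscale collapse, photometric inversion, strong additive noise) and the name `randomize_domain` are chosen for illustration and are not claimed to be the paper's exact augmentation set.

```python
import numpy as np

def randomize_domain(image, rng):
    """Apply one randomly chosen heavy manipulation to an HxWx3 float image in [0, 1].

    Each call draws a sample from a randomized "meta-domain" of the input
    distribution, in the spirit of domain randomization.
    """
    choice = rng.integers(3)
    if choice == 0:
        # Collapse colour information: grayscale replicated over channels.
        gray = image.mean(axis=2, keepdims=True)
        out = np.repeat(gray, 3, axis=2)
    elif choice == 1:
        # Photometric inversion.
        out = 1.0 - image
    else:
        # Strong additive Gaussian noise.
        out = image + rng.normal(scale=0.3, size=image.shape)
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))          # stand-in for a normalized image
aug = randomize_domain(img, rng)     # one sample from a randomized meta-domain
```

In the meta-learning strategy described above, such randomized samples would populate the auxiliary meta-domains on which the transfer-loss regularizer is evaluated during training.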
Pages: 4441-4451
Page count: 11