Fast Context Adaptation via Meta-Learning

Citations: 0
Authors
Zintgraf, Luisa [1 ]
Shiarlis, Kyriacos [1 ,2 ]
Kurin, Vitaly [1 ,2 ]
Hofmann, Katja [3 ]
Whiteson, Shimon [1 ,2 ]
Affiliations
[1] Univ Oxford, Oxford, England
[2] Latent Logic, Oxford, England
[3] Microsoft Res, Redmond, WA USA
Funding
European Research Council;
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose CAVIA for meta-learning, a simple extension to MAML that is less prone to meta-overfitting, easier to parallelise, and more interpretable. CAVIA partitions the model parameters into two parts: context parameters that serve as additional input to the model and are adapted on individual tasks, and shared parameters that are meta-trained and shared across tasks. At test time, only the context parameters are updated, leading to a low-dimensional task representation. We show empirically that CAVIA outperforms MAML for regression, classification, and reinforcement learning. Our experiments also highlight weaknesses in current benchmarks, in that the amount of adaptation needed in some cases is small.
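The adaptation scheme the abstract describes can be illustrated with a toy sketch. This is an illustrative reconstruction, not the authors' code: a linear model receives a scalar context parameter `phi` as an extra input, and the per-task inner loop takes gradient steps on `phi` alone while the shared parameters `theta` stay frozen (in the full method, `theta` would be meta-trained across tasks). All names and the regression setup here are assumptions for the sketch.

```python
import numpy as np

def predict(theta, phi, x):
    # theta = (w_x, w_phi, b): shared parameters; phi: per-task context input.
    w_x, w_phi, b = theta
    return w_x * x + w_phi * phi + b

def adapt_context(theta, x, y, lr=0.1, steps=50):
    """Inner loop: gradient descent on the context parameter phi only."""
    phi = 0.0                                  # context reset to zero per task
    w_x, w_phi, b = theta
    for _ in range(steps):
        err = predict(theta, phi, x) - y       # residuals on this task's data
        grad_phi = 2.0 * np.mean(err * w_phi)  # d(MSE)/d(phi)
        phi -= lr * grad_phi                   # shared theta is never touched
    return phi

# Two tasks that differ only by an offset the context can absorb.
theta = (1.0, 1.0, 0.0)                        # pretend these were meta-trained
x = np.linspace(-1.0, 1.0, 20)
for offset in (2.0, -3.0):
    y = x + offset                             # task: y = x + offset
    phi = adapt_context(theta, x, y)
    mse = np.mean((predict(theta, phi, x) - y) ** 2)
    print(f"offset={offset:+.1f}  adapted phi={phi:+.3f}  mse={mse:.6f}")
```

After adaptation, `phi` converges to each task's offset, so the single context scalar acts as the low-dimensional task representation the abstract mentions.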
Pages: 10
Related Papers
(50 results)
  • [1] Fast Adaptation of Radar Detection via Online Meta-learning
    Khan, Zareen
    Jiang, Wei
    Haimovich, Alexander
    Govoni, Mark
    Garner, Timothy
    Simeone, Osvaldo
    2022 56TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2022, : 580 - 585
  • [2] Fast Network Alignment via Graph Meta-Learning
    Zhou, Fan
    Cao, Chengtai
    Trajcevski, Goce
    Zhang, Kunpeng
    Zhong, Ting
    Geng, Ji
    IEEE INFOCOM 2020 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2020, : 686 - 695
  • [3] Fast Power Control Adaptation via Meta-Learning for Random Edge Graph Neural Networks
    Nikoloska, Ivana
    Simeone, Osvaldo
    SPAWC 2021: 2021 IEEE 22ND INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC 2021), 2021, : 146 - 150
  • [4] Fast Adaptation for Cold-start Collaborative Filtering with Meta-learning
    Wei, Tianxin
    Wu, Ziwei
    Li, Ruirui
    Hu, Ziniu
    Feng, Fuli
    He, Xiangnan
    Sun, Yizhou
    Wang, Wei
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 661 - 670
  • [5] Meta-learning for fast incremental learning
    Oohira, T
    Yamauchi, K
    Omori, T
    ARTIFICIAL NEURAL NETWORKS AND NEURAL INFORMATION PROCESSING - ICANN/ICONIP 2003, 2003, 2714 : 157 - 164
  • [6] Meta-Learning for Fast Cross-Lingual Adaptation in Dependency Parsing
    Langedijk, Anna
    Dankers, Verna
    Lippe, Phillip
    Bos, Sander
    Guevara, Bryan Cardenas
    Yannakoudakis, Helen
    Shutova, Ekaterina
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 8503 - 8520
  • [7] Curvature-Adaptive Meta-Learning for Fast Adaptation to Manifold Data
    Gao, Zhi
    Wu, Yuwei
    Harandi, Mehrtash
    Jia, Yunde
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (02) : 1545 - 1562
  • [8] Meta-learning via Language Model In-context Tuning
    Chen, Yanda
    Zhong, Ruiqi
    Zha, Sheng
    Karypis, George
    He, He
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 719 - 730
  • [9] Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
    Finn, Chelsea
    Abbeel, Pieter
    Levine, Sergey
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [10] Fast Sparse Connectivity Network Adaption via Meta-Learning
    Jin, Bo
    Cheng, Ke
    Qu, Yue
    Zhang, Liang
    Xiao, Keli
    Lu, Xinjiang
    Wei, Xiaopeng
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 232 - 241