Context Adaptive Metric Model for Meta-learning

Cited: 0
Authors
Wang, Zhe [1 ]
Li, Fanzhang [1 ]
Affiliations
[1] Soochow Univ, Prov Key Lab Comp Informat Proc Technol, Suzhou 215006, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Context adaptive; Metric model; Meta-learning; Few-shot learning;
DOI
10.1007/978-3-030-61609-0_31
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Metric-based meta-learning is effective for solving few-shot problems. Typically, a metric model learns a task-agnostic embedding function that maps instances into a low-dimensional embedding space, then classifies unlabeled examples by similarity comparison. However, different classification tasks have their own discriminative characteristics, while previous approaches are constrained to a single set of features for all possible tasks. In this work, we introduce the Context Adaptive Metric Model (CAMM), which can adaptively extract key features and can be applied to most metric models. Our extension consists of two parts: a context-parameter module and a self-evaluation module. The context is interpreted as a task representation that modulates the behavior of the feature extractor. CAMM fine-tunes the context parameters via the self-evaluation module to generate task-specific embedding functions. We demonstrate that our approach is competitive with recent state-of-the-art systems and improves performance considerably (4%-6% relative) over baselines on the mini-ImageNet benchmark. Our code is publicly available at https://github.com/Jorewang/CAMM.
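The pipeline the abstract describes (embed support and query instances with a task-conditioned feature extractor, then classify by similarity to class representatives) can be sketched in a few lines. The sketch below is illustrative only and not the paper's implementation: it uses a hypothetical element-wise modulation as the context-conditioned embedding and mean prototypes with Euclidean distance for the similarity comparison; all function names are ours.

```python
import numpy as np

def embed(x, context):
    """Task-conditioned embedding: element-wise modulation of raw features
    by a context vector. This is a toy stand-in for the paper's
    context-parameter module; the real model modulates a deep network."""
    return x * context

def class_prototypes(support_x, support_y, context):
    """Mean embedding per class (prototypical-network style)."""
    protos = {}
    for c in sorted(set(support_y)):
        idx = [i for i, y in enumerate(support_y) if y == c]
        protos[c] = embed(support_x[idx], context).mean(axis=0)
    return protos

def classify(query_x, protos, context):
    """Assign the query to the nearest prototype by Euclidean distance."""
    q = embed(query_x, context)
    return min(protos, key=lambda c: np.linalg.norm(q - protos[c]))

# Toy 2-way, 2-shot episode: only the second feature separates the classes.
support_x = np.array([[1.0, 0.0], [1.0, 0.2], [1.0, 5.0], [1.0, 5.2]])
support_y = [0, 0, 1, 1]
context = np.array([0.0, 1.0])  # context suppresses the uninformative feature

protos = class_prototypes(support_x, support_y, context)
pred = classify(np.array([1.0, 4.9]), protos, context)
```

In the full model, the context vector would itself be produced per task (here, fine-tuned via the paper's self-evaluation module) rather than fixed by hand as in this toy episode.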
Pages: 393-405
Page count: 13
Related Papers
50 records
  • [1] Meta-Learning with Adaptive Hyperparameters
    Baik, Sungyong
    Choi, Myungsub
    Choi, Janghoon
    Kim, Heewon
    Lee, Kyoung Mu
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [2] Meta-learning via Language Model In-context Tuning
    Chen, Yanda
    Zhong, Ruiqi
    Zha, Sheng
    Karypis, George
    He, He
    [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 719 - 730
  • [3] Variational Metric Scaling for Metric-Based Meta-Learning
    Chen, Jiaxin
    Zhan, Li-Ming
    Wu, Xiao-Ming
    Chung, Fu-lai
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 3478 - 3485
  • [4] SPEAKER ADAPTIVE TRAINING USING MODEL AGNOSTIC META-LEARNING
    Klejch, Ondrej
    Fainberg, Joachim
    Bell, Peter
    Renals, Steve
    [J]. 2019 IEEE AUTOMATIC SPEECH RECOGNITION AND UNDERSTANDING WORKSHOP (ASRU 2019), 2019, : 881 - 888
  • [5] Adaptive Code Completion with Meta-learning
    Fang, Liyu
    Huang, Zhiqiu
    Zhou, Yu
    Chen, Taolue
    [J]. THE 12TH ASIA-PACIFIC SYMPOSIUM ON INTERNETWARE, INTERNETWARE 2020, 2021, : 116 - 125
  • [6] Meta-learning for Adaptive Image Segmentation
    Sellaouti, Aymen
    Jaafra, Yasmina
    Hamouda, Atef
    [J]. IMAGE ANALYSIS AND RECOGNITION, ICIAR 2014, PT I, 2014, 8814 : 187 - 197
  • [7] Meta-learning with an Adaptive Task Scheduler
    Yao, Huaxiu
    Wang, Yu
    Wei, Ying
    Zhao, Peilin
    Mahdavi, Mehrdad
    Lian, Defu
    Finn, Chelsea
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [8] Tracking Context Changes through Meta-Learning
    Widmer, Gerhard
    [J]. Machine Learning, 1997, 27 : 259 - 286
  • [9] Meta-AF: Meta-Learning for Adaptive Filters
    Casebeer, Jonah
    Bryan, Nicholas J.
    Smaragdis, Paris
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 355 - 370
  • [10] Fast Context Adaptation via Meta-Learning
    Zintgraf, Luisa
    Shiarlis, Kyriacos
    Kurin, Vitaly
    Hofmann, Katja
    Whiteson, Shimon
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97