Data Augmentation for Meta-Learning

Cited by: 0
Authors
Ni, Renkun [1 ]
Goldblum, Micah [1 ]
Sharaf, Amr [2 ]
Kong, Kezhi [1 ]
Goldstein, Tom [1 ]
Affiliations
[1] Univ Maryland, Dept Comp Sci, College Pk, MD 20742 USA
[2] Microsoft, Redmond, WA USA
Funding
National Science Foundation (USA);
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Conventional image classifiers are trained by randomly sampling mini-batches of images. To achieve state-of-the-art performance, practitioners use sophisticated data augmentation schemes to expand the amount of training data available for sampling. In contrast, meta-learning algorithms sample support data, query data, and tasks on each training step. In this complex sampling scenario, data augmentation can be used not only to expand the number of images available per class, but also to generate entirely new classes/tasks. We systematically dissect the meta-learning pipeline and investigate the distinct ways in which data augmentation can be integrated at both the image and class levels. Our proposed meta-specific data augmentation significantly improves the performance of meta-learners on few-shot classification benchmarks.
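The abstract distinguishes image-level augmentation (more examples per class) from class-level augmentation (entirely new classes, and thus new tasks). As an illustrative sketch only, not the paper's specific method: the snippet below shows one common way to realize class-level augmentation in an episodic sampler, where each base class spawns a synthetic "rotated" class from which support/query sets can be drawn. All names (`rotate90`, `sample_episode`) and the toy image representation are assumptions for illustration.

```python
import random

def rotate90(image_2d):
    # Rotate a 2D list "image" 90 degrees clockwise: a stand-in for a
    # real image transform (e.g. torchvision's functional rotate).
    return [list(row) for row in zip(*image_2d[::-1])]

def sample_episode(dataset, n_way=2, k_shot=1, q_queries=1, augment_classes=True):
    """Sample one few-shot episode (task).

    dataset: dict mapping class name -> list of images.
    With augment_classes=True, each base class also contributes a
    synthetic rotated class, so sampled tasks can mix base classes
    with augmentation-generated ones (class-level augmentation).
    """
    pool = dict(dataset)
    if augment_classes:
        for name, imgs in dataset.items():
            pool[name + "_rot90"] = [rotate90(im) for im in imgs]
    classes = random.sample(sorted(pool), n_way)
    support, query = [], []
    for label, name in enumerate(classes):
        imgs = random.sample(pool[name], k_shot + q_queries)
        support += [(im, label) for im in imgs[:k_shot]]
        query += [(im, label) for im in imgs[k_shot:]]
    return support, query
```

Image-level augmentation would instead transform the sampled support/query images without introducing new class labels; the two can be combined in one sampler.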
Pages: 10
Related Papers
50 records total
  • [1] Meta-Learning Requires Meta-Augmentation
    Rajendran, Janarthanan
    Irpan, Alex
    Jang, Eric
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [2] Learning Meta-Learning (LML) dataset: Survey data of meta-learning parameters
    Corraya, Sonia
    Al Mamun, Shamim
    Kaiser, M. Shamim
    [J]. DATA IN BRIEF, 2023, 51
  • [3] MEDA: Meta-Learning with Data Augmentation for Few-Shot Text Classification
    Sun, Pengfei
    Ouyang, Yawen
    Zhang, Wenming
    Dai, Xin-yu
    [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 3929 - 3935
  • [4] Data Efficiency of Meta-learning
    Al-Shedivat, Maruan
    Li, Liam
    Xing, Eric
    Talwalkar, Ameet
    [J]. 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [5] Improving Generalization in Meta-learning via Task Augmentation
    Yao, Huaxiu
    Huang, Long-Kai
    Zhang, Linjun
    Wei, Ying
    Tian, Li
    Zou, James
    Huang, Junzhou
    Li, Zhenhui
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [6] Meta-learning: Data, architecture, and both
    Binz, Marcel
    Dasgupta, Ishita
    Jagadish, Akshay
    Botvinick, Matthew
    Wang, Jane X.
    Schulz, Eric
    [J]. BEHAVIORAL AND BRAIN SCIENCES, 2024, 47
  • [7] Meta-learning Enhancements by Data Partitioning
    Merk, Beata
    Bratu, Camelia Vidrighin
    Potolea, Rodica
    [J]. 2009 IEEE 5TH INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING, PROCEEDINGS, 2009, : 59 - 62
  • [8] On sensitivity of meta-learning to support data
    Agarwal, Mayank
    Yurochkin, Mikhail
    Sun, Yuekai
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [9] Online Continual Learning via the Meta-learning update with Multi-scale Knowledge Distillation and Data Augmentation
    Han, Ya-nan
    Liu, Jian-wei
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2022, 113
  • [10] Automated Data Cleansing through Meta-Learning
    Gemp, Ian
    Theocharous, Georgios
    Ghavamzadeh, Mohammad
    [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 4760 - 4761