Generalization Bounds for Meta-Learning: An Information-Theoretic Analysis

Cited by: 0
Authors
Chen, Qi [1 ]
Shui, Changjian [2 ]
Marchand, Mario [1 ]
Affiliations
[1] Univ Laval, Dept Comp Sci & Software Engn, Quebec City, PQ, Canada
[2] Univ Laval, Dept Elect Engn & Comp Engn, Quebec City, PQ, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
MODEL; BIAS;
DOI
None
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We derive a novel information-theoretic analysis of the generalization properties of meta-learning algorithms. Concretely, our analysis provides a generic understanding of both the conventional learning-to-learn framework [1] and the modern model-agnostic meta-learning (MAML) algorithms [2]. Moreover, we provide a data-dependent generalization bound for a stochastic variant of MAML, which is non-vacuous for deep few-shot learning. Compared to previous bounds that depend on the squared norm of gradients, empirical validations on both simulated data and a well-known few-shot benchmark show that the proposed bound is orders of magnitude tighter in most situations.
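The MAML-style bilevel optimization analyzed in the abstract (an inner adaptation step per task, followed by an outer meta-update) can be sketched on a toy problem. Everything below is illustrative, not from the paper: a hypothetical family of 1-D linear regression tasks, a single scalar parameter, and a first-order MAML update.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task():
    """Sample a toy task y = a * x with slope a ~ U(0.5, 1.5).

    Returns a (support, query) pair of small datasets, as in few-shot
    learning. The task family and sizes are illustrative assumptions.
    """
    a = rng.uniform(0.5, 1.5)
    x_s, x_q = rng.normal(size=5), rng.normal(size=5)
    return (x_s, a * x_s), (x_q, a * x_q)

def loss(w, x, y):
    """Squared loss 0.5 * mean((w*x - y)^2) for the scalar model w*x."""
    return 0.5 * np.mean((w * x - y) ** 2)

def grad(w, x, y):
    """Gradient of the squared loss with respect to w."""
    return np.mean((w * x - y) * x)

# First-order MAML: take one inner gradient step on the support set,
# then apply the query-set gradient at the adapted parameters directly
# to the meta-parameters (ignoring second-order terms).
w_meta, alpha, beta = 0.0, 0.1, 0.05
for step in range(500):
    (x_s, y_s), (x_q, y_q) = make_task()
    w_task = w_meta - alpha * grad(w_meta, x_s, y_s)  # inner adaptation
    w_meta = w_meta - beta * grad(w_task, x_q, y_q)   # outer meta-update
```

Because the task slopes are centered at 1, the meta-parameter drifts toward an initialization from which one inner step adapts well to any sampled task; this is the "learning-to-learn" behavior whose generalization error the paper's information-theoretic bounds control.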
Pages: 13
Related Papers
50 records in total
  • [1] Jose, Sharu Theresa; Simeone, Osvaldo. Information-Theoretic Generalization Bounds for Meta-Learning and Applications. ENTROPY, 2021, 23(01): 1-28.
  • [2] Segrera, Saddys; Pinho, Joel; Moreno, Maria N. Information-Theoretic Measures for Meta-learning. HYBRID ARTIFICIAL INTELLIGENCE SYSTEMS, 2008, 5271: 458-465.
  • [3] Jose, Sharu Theresa; Park, Sangwoo; Simeone, Osvaldo. Information-Theoretic Analysis of Epistemic Uncertainty in Bayesian Meta-learning. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, 2022, 151.
  • [4] Jose, Sharu Theresa; Simeone, Osvaldo. An Information-Theoretic Analysis of the Impact of Task Similarity on Meta-Learning. 2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2021: 1534-1539.
  • [5] Aminian, Gholamali; Toni, Laura; Rodrigues, Miguel R. D. Information-Theoretic Bounds on the Moments of the Generalization Error of Learning Algorithms. 2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2021: 682-687.
  • [6] Wu, Xuetong; Manton, Jonathan H.; Aickelin, Uwe; Zhu, Jingge. On the Generalization for Transfer Learning: An Information-Theoretic Analysis. IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70(10): 7089-7124.
  • [7] Issa, Ibrahim; Esposito, Amedeo Roberto; Gastpar, Michael. Strengthened Information-theoretic Bounds on the Generalization Error. 2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019: 582-586.
  • [8] Barnes, Leighton Pate; Dytso, Alex; Poor, Harold Vincent. Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning. ENTROPY, 2022, 24(09).
  • [9] Yagli, Semih; Dytso, Alex; Poor, H. Vincent. Information-Theoretic Bounds on the Generalization Error and Privacy Leakage in Federated Learning. PROCEEDINGS OF THE 21ST IEEE INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC2020), 2020.
  • [10] Harutyunyan, Hrayr; Raginsky, Maxim; Ver Steeg, Greg; Galstyan, Aram. Information-theoretic generalization bounds for black-box learning algorithms. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34.