Exact Gap between Generalization Error and Uniform Convergence in Random Feature Models

Cited by: 0
Authors:
Yang, Zitong [1 ]
Bai, Yu [2 ]
Mei, Song [3 ]
Affiliations:
[1] Univ Calif Berkeley, Dept Elect Engn & Comp Sci, Berkeley, CA 94720 USA
[2] Univ Calif Berkeley, Salesforce Res, Berkeley, CA 94720 USA
[3] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Keywords: (none listed)
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Recent work showed that there can be a large gap between the classical uniform convergence bound and the actual test error of zero-training-error predictors (interpolators) such as deep neural networks. To better understand this gap, we study uniform convergence in the nonlinear random feature model and perform a precise theoretical analysis of how uniform convergence depends on the sample size and the number of parameters. We derive and prove analytical expressions for three quantities in this model: 1) the classical uniform convergence over norm balls, 2) uniform convergence over interpolators in the norm ball (recently proposed by Zhou et al. (2020)), and 3) the risk of the minimum-norm interpolator. We show that, in the setting where the classical uniform convergence bound is vacuous (diverges to ∞), uniform convergence over the interpolators still gives a non-trivial bound on the test error of interpolating solutions. We also showcase a different setting where the classical uniform convergence bound is non-vacuous, but uniform convergence over interpolators can give an improved sample complexity guarantee. Our results provide the first exact comparison between the test errors and uniform convergence bounds for interpolators beyond simple linear models.
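The minimum-norm interpolator analyzed in the abstract can be illustrated numerically. The sketch below is a minimal, hypothetical setup (the dimensions, the ReLU feature map, and the synthetic linear target are illustrative choices, not taken from the paper): in the overparameterized regime, where the number of random features exceeds the sample size, the least-norm least-squares solution fits the training data exactly while minimizing the parameter norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper):
n, d, N = 100, 20, 400   # samples, input dimension, number of random features

# Synthetic data from a noisy linear target (hypothetical setup)
X = rng.standard_normal((n, d)) / np.sqrt(d)
beta = rng.standard_normal(d)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Random feature map phi(x) = relu(W x), with W drawn once and held fixed
W = rng.standard_normal((N, d)) / np.sqrt(d)
Phi = np.maximum(X @ W.T, 0.0)           # (n, N) random feature matrix

# Minimum-norm interpolator: among all a with Phi a = y (solvable since
# N > n and Phi generically has full row rank), pick the one minimizing ||a||.
# lstsq returns exactly this least-norm solution for underdetermined systems.
a = np.linalg.lstsq(Phi, y, rcond=None)[0]

# The training error of the interpolator is (numerically) zero
train_err = np.mean((Phi @ a - y) ** 2)
print(train_err < 1e-8)   # True: the interpolator fits the training set exactly
```

The test error of this zero-training-error solution, evaluated on fresh draws of X, is the quantity the paper compares against the two uniform convergence bounds.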
Pages: 12
Related Papers (18 items)
  • [1] The Slow Deterioration of the Generalization Error of the Random Feature Model
    Ma, Chao
    Wu, Lei
    Weinan, E.
    MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 107, 2020, 107: 373+
  • [2] Conditioning of random Fourier feature matrices: double descent and generalization error
    Chen, Zhijun
    Schaeffer, Hayden
    INFORMATION AND INFERENCE-A JOURNAL OF THE IMA, 2024, 13 (02)
  • [3] Asymptotic generalization errors in the online learning of random feature models
    Worschech, Roman
    Rosenow, Bernd
    PHYSICAL REVIEW RESEARCH, 2024, 6 (02)
  • [4] Generalization error of random feature and kernel methods: Hypercontractivity and kernel matrix concentration
    Mei, Song
    Misiakiewicz, Theodor
    Montanari, Andrea
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2022, 59 : 3 - 84
  • [5] Studying Generalization Performance of Random Feature Model through Equivalent Models
    Demir, Samet
    Dogan, Zafer
    32ND IEEE SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU 2024, 2024
  • [6] Uniform Random Sampling Product Configurations of Feature Models That Have Numerical Features
    Munoz, Daniel-Jesus
    Oh, Jeho
    Pinto, Monica
    Fuentes, Lidia
    Batory, Don
    SPLC'19: PROCEEDINGS OF THE 23RD INTERNATIONAL SYSTEMS AND SOFTWARE PRODUCT LINE CONFERENCE, VOL A, 2020, : 289 - 301
  • [7] NON-UNIFORM CONVERGENCE RATES FOR DISTRIBUTIONS OF ERROR VARIANCE ESTIMATES IN LINEAR MODELS
    Zhao, Lincheng
    Chen, Xiru
    SCIENCE CHINA MATHEMATICS, 1982, (10): 1042 - 1055
  • [8] NON-UNIFORM CONVERGENCE RATES FOR DISTRIBUTIONS OF ERROR VARIANCE ESTIMATES IN LINEAR MODELS
    Zhao, Lincheng
    Chen, Xiru
    SCIENCE IN CHINA, SER. A, 1982, (10): 1042 - 1055
  • [9] RANDOM ISING-MODELS - CONVERGENCE OF SUCCESSIVE-APPROXIMATIONS TOWARDS EXACT RESULTS
    BENYOUSSEF, A
    BOCCARA, N
    JOURNAL OF APPLIED PHYSICS, 1984, 55 (06) : 2419 - 2420
  • [10] More Data Can Expand the Generalization Gap Between Adversarially Robust and Standard Models
    Chen, Lin
    Min, Yifei
    Zhang, Mingrui
    Karbasi, Amin
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,