Generalization bounds for sparse random feature expansions

Cited by: 21
Authors
Hashemi, Abolfazl [1 ]
Schaeffer, Hayden [2 ]
Shi, Robert [3 ]
Topcu, Ufuk [3 ]
Tran, Giang [4 ]
Ward, Rachel [3 ]
Affiliations
[1] Purdue University, West Lafayette, IN, USA
[2] Carnegie Mellon University, Pittsburgh, PA 15213, USA
[3] University of Texas at Austin, Austin, TX, USA
[4] University of Waterloo, Waterloo, ON, Canada
Funding
US National Science Foundation (NSF); Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Random features; Sparse optimization; Generalization error; Compressive sensing; SIGNAL RECOVERY; APPROXIMATION; PROJECTION; SUBSPACE;
DOI
10.1016/j.acha.2022.08.003
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
Random feature methods have been successful in various machine learning tasks, are easy to compute, and come with theoretical accuracy bounds. They serve as an alternative approach to standard neural networks since they can represent similar function spaces without a costly training phase. However, for accuracy, random feature methods require more measurements than trainable parameters, limiting their use in data-scarce applications. We introduce the sparse random feature expansion to obtain parsimonious random feature models. We leverage ideas from compressive sensing to generate random feature expansions with theoretical guarantees even in the data-scarce setting. We provide generalization bounds for functions in a certain class, depending on the number of samples and the distribution of features. By introducing sparse features, i.e., features with random sparse weights, we provide improved bounds for low-order functions. We show that our method outperforms shallow networks in several scientific machine learning tasks. © 2022 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
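To make the construction concrete, here is a minimal sketch of a sparse random feature regression in the spirit of the abstract: cosine features whose random weight vectors have only q nonzero entries, with coefficients recovered by an l1-regularized least-squares (LASSO) solve standing in for the compressive-sensing step. The feature map, the sparsity pattern, and the solver choice are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

def sparse_random_weights(n_features, dim, q, rng):
    """Draw weight vectors with at most q nonzero Gaussian entries each."""
    W = np.zeros((n_features, dim))
    for j in range(n_features):
        support = rng.choice(dim, size=q, replace=False)
        W[j, support] = rng.standard_normal(q)
    return W

def feature_matrix(X, W, b):
    """Random feature map: A[i, j] = cos(<w_j, x_i> + b_j)."""
    return np.cos(X @ W.T + b)

# Synthetic data-scarce setting: few samples of a low-order target,
# i.e. one that depends on only a few input coordinates at a time.
n, dim = 200, 10
X = rng.uniform(-1.0, 1.0, size=(n, dim))
y = np.sin(2.0 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]

# Overparameterized dictionary: many more random features than samples.
N_feat, q = 2000, 2  # q = sparsity of each random weight vector
W = sparse_random_weights(N_feat, dim, q, rng)
b = rng.uniform(0.0, 2.0 * np.pi, size=N_feat)
A = feature_matrix(X, W, b)

# Sparse coefficient recovery; the l1 penalty selects few active features.
model = Lasso(alpha=1e-3, max_iter=50_000).fit(A, y)
print("active features:", np.count_nonzero(model.coef_), "of", N_feat)
```

Even though N_feat far exceeds n here, the l1 penalty keeps the fitted expansion parsimonious, which is the regime the paper's generalization bounds address.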
Pages: 310-330
Page count: 21