Generalization bounds for sparse random feature expansions

Cited by: 21
Authors
Hashemi, Abolfazl [1 ]
Schaeffer, Hayden [2 ]
Shi, Robert [3 ]
Topcu, Ufuk [3 ]
Tran, Giang [4 ]
Ward, Rachel [3 ]
Affiliations
[1] Purdue Univ, W Lafayette, IN USA
[2] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[3] Univ Texas Austin, Austin, TX USA
[4] Univ Waterloo, Waterloo, ON, Canada
Funding
US National Science Foundation; Natural Sciences and Engineering Research Council of Canada;
Keywords
Random features; Sparse optimization; Generalization error; Compressive sensing; Signal recovery; Approximation; Projection; Subspace;
DOI
10.1016/j.acha.2022.08.003
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Random feature methods have been successful in various machine learning tasks, are easy to compute, and come with theoretical accuracy bounds. They serve as an alternative approach to standard neural networks since they can represent similar function spaces without a costly training phase. However, for accuracy, random feature methods require more measurements than trainable parameters, limiting their use for data-scarce applications. We introduce the sparse random feature expansion to obtain parsimonious random feature models. We leverage ideas from compressive sensing to generate random feature expansions with theoretical guarantees even in the data-scarce setting. We provide generalization bounds for functions in a certain class depending on the number of samples and the distribution of features. By introducing sparse features, i.e., features with random sparse weights, we provide improved bounds for low-order functions. We show that our method outperforms shallow networks in several scientific machine learning tasks. © 2022 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
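To make the abstract's method concrete, the following is a minimal sketch (not the authors' implementation) of a sparse random feature fit in Python: random Fourier features with sparse random weight vectors, combined with an l1-regularized least-squares fit, with scikit-learn's Lasso standing in for the compressive-sensing recovery step. The target function, feature count, sparsity level q, and regularization weight alpha are all illustrative assumptions, not values from the paper.

    # Sketch of a sparse random feature expansion (illustrative only).
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)

    # Assumed low-order target function and a scarce training set.
    f = lambda x: np.sin(2 * x[:, 0]) + 0.5 * x[:, 1] ** 2
    n_samples, dim, n_features = 60, 2, 500   # far more features than samples
    X = rng.uniform(-1, 1, size=(n_samples, dim))
    y = f(X)

    # Random feature map: phi_j(x) = cos(<w_j, x> + b_j).
    # Each weight vector w_j is made sparse (at most q nonzero entries),
    # mimicking the paper's sparse random weights for low-order functions.
    q = 1
    W = rng.standard_normal((n_features, dim))
    mask = np.zeros_like(W)
    for j in range(n_features):
        mask[j, rng.choice(dim, size=q, replace=False)] = 1.0
    W *= mask
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    Phi = np.cos(X @ W.T + b)

    # l1-regularized fit selects few features: a parsimonious model.
    model = Lasso(alpha=1e-3, max_iter=50_000).fit(Phi, y)
    print("nonzero coefficients:", np.count_nonzero(model.coef_))

    # Held-out error of the sparse expansion.
    X_test = rng.uniform(-1, 1, size=(200, dim))
    pred = np.cos(X_test @ W.T + b) @ model.coef_ + model.intercept_
    rmse = np.sqrt(np.mean((pred - f(X_test)) ** 2))
    print(f"test RMSE: {rmse:.3f}")

The point of the sketch is the regime the abstract describes: the number of random features can far exceed the number of samples, and sparsity of the fitted coefficient vector, rather than the feature count, is what controls generalization.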
Pages: 310-330
Page count: 21
Related Papers
50 items in total
  • [31] Stable recovery of sparse vectors from random sinusoidal feature maps
    Soltani, Mohammadreza
    Hegde, Chinmay
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 6384 - 6388
  • [32] Representation of sparse Legendre expansions
    Peter, Thomas
    Plonka, Gerlind
    Rosca, Daniela
    JOURNAL OF SYMBOLIC COMPUTATION, 2013, 50 : 159 - 169
  • [33] Contourlets and sparse image expansions
    Do, MN
    WAVELETS: APPLICATIONS IN SIGNAL AND IMAGE PROCESSING X, PTS 1 AND 2, 2003, 5207 : 560 - 570
  • [34] Generalization error of random feature and kernel methods: Hypercontractivity and kernel matrix concentration
    Mei, Song
    Misiakiewicz, Theodor
    Montanari, Andrea
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2022, 59 : 3 - 84
  • [35] Exact Gap between Generalization Error and Uniform Convergence in Random Feature Models
    Yang, Zitong
    Bai, Yu
    Mei, Song
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [36] On Generalization Bounds for Projective Clustering
    Bucarelli, Maria Sofia
    Larsen, Matilde Fjeldsø
    Schwiegelshohn, Chris
    Toftrup, Mads Bech
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS, 2023, 36
  • [37] New bounds for correct generalization
    Mattera, D
    Palmieri, F
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997, : 1051 - 1055
  • [38] Generalization bounds for averaged classifiers
    Freund, Y
    Mansour, Y
    Schapire, RE
    ANNALS OF STATISTICS, 2004, 32 (04): 1698 - 1722
  • [39] Generalization bounds of incremental SVM
    Zeng, Jingjing
    Zou, Bin
    Qin, Yimo
    Xu, Jie
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2024, 22 (06)
  • [40] Comparing Comparators in Generalization Bounds
    Hellstrom, Fredrik
    Guedj, Benjamin
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238