Generalization error bounds for iterative recovery algorithms unfolded as neural networks

Cited by: 1
Authors
Schnoor, Ekkehard [1 ]
Behboodi, Arash [2 ]
Rauhut, Holger [1 ]
Affiliations
[1] RWTH Aachen University, Chair for Mathematics of Information Processing, Aachen, Germany
[2] RWTH Aachen University, Institute for Theoretical Information Technology, Aachen, Germany
Keywords
compressive sensing; neural networks; iterative soft thresholding; generalization; Rademacher complexity; sample complexity; overcomplete dictionaries; sparse
DOI
10.1093/imaiai/iaad023
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
Motivated by the learned iterative soft thresholding algorithm (LISTA), we introduce a general class of neural networks suitable for sparse reconstruction from few linear measurements. By allowing a wide range of degrees of weight-sharing between the layers, we enable a unified analysis for very different neural network types, ranging from recurrent ones to networks more similar to standard feedforward neural networks. Based on training samples, via empirical risk minimization, we aim to learn the optimal network parameters and thereby the optimal network that reconstructs signals from their low-dimensional linear measurements. We derive generalization bounds by analyzing the Rademacher complexity of hypothesis classes consisting of such deep networks that also take the thresholding parameters into account. We obtain estimates of the sample complexity that depend essentially linearly on the number of parameters and on the depth. We apply our main result to obtain specific generalization bounds for several practical examples, including different algorithms for (implicit) dictionary learning and convolutional neural networks.
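To make the unrolling idea in the abstract concrete, the following is a minimal Python/NumPy sketch (not the authors' code) of ISTA unrolled into a network with fully shared weights; the names soft_threshold, lista_forward, W1, W2, lam and tau are illustrative assumptions, and the per-layer-weights variant mentioned in the abstract would simply use a separate (W1, W2, lam) triple for each layer.

import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding S_lam(x) = sign(x) * max(|x| - lam, 0),
    # the nonlinearity used in (L)ISTA.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lista_forward(y, W1, W2, lam, n_layers=10):
    # Unrolled ISTA: each layer computes x <- S_lam(W1 @ y + W2 @ x).
    # Classical ISTA for min 0.5*||A x - y||^2 + reg*||x||_1 fixes
    # W1 = tau * A.T and W2 = I - tau * A.T @ A with step size tau and
    # lam = tau * reg; LISTA instead treats (W1, W2, lam) as trainable,
    # shared across layers (recurrent-like) or per layer (feedforward-like).
    x = np.zeros(W2.shape[0])
    for _ in range(n_layers):
        x = soft_threshold(W1 @ y + W2 @ x, lam)
    return x

# Toy usage with the classical ISTA weights as an initialization:
rng = np.random.default_rng(0)
m, n = 20, 50                                  # few measurements, signal in R^n
A = rng.standard_normal((m, n)) / np.sqrt(m)   # measurement matrix
x_true = np.zeros(n)
x_true[rng.choice(n, 3, replace=False)] = 1.0  # 3-sparse ground truth
y = A @ x_true
tau = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1 / ||A||_2^2
x_hat = lista_forward(y, tau * A.T, np.eye(n) - tau * A.T @ A,
                      lam=0.01 * tau, n_layers=100)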
Pages: 33
Related papers
50 records in total
  • [21] Error Performance Bounds for Routing Algorithms in Wireless Cooperative Networks
    Sheng, Zhengguo
    Ding, Zhiguo
    Leung, Kin K.
    2010 5TH INTERNATIONAL ICST CONFERENCE ON COMMUNICATIONS AND NETWORKING IN CHINA (CHINACOM), 2010
  • [22] Graph Neural Networks Inspired by Classical Iterative Algorithms
    Yang, Yongyi
    Liu, Tang
    Wang, Yangkun
    Zhou, Jinjing
    Gan, Quan
    Wei, Zhewei
    Zhang, Zheng
    Huang, Zengfeng
    Wipf, David
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [23] Generalization error bounds for aggregate classifiers
    Blanchard, G
    NONLINEAR ESTIMATION AND CLASSIFICATION, 2003, 171 : 357 - 367
  • [24] Theoretical Investigation of Generalization Bounds for Adversarial Learning of Deep Neural Networks
    Gao, Qingyi
    Wang, Xiao
    JOURNAL OF STATISTICAL THEORY AND PRACTICE, 2021, 15
  • [25] Generalization Error Analysis of Neural Networks with Gradient Based Regularization
    Li, Lingfeng
    Tai, Xue-Cheng
    Yang, Jiang
    COMMUNICATIONS IN COMPUTATIONAL PHYSICS, 2022, 32 (04) : 1007 - 1038
  • [26] Localized generalization error model for Multilayer Perceptron Neural Networks
    Yang, Fei
    Ng, Wing W. Y.
    Tsang, Eric C. C.
    Zeng, Xiao-Qin
    Yeung, Daniel S.
    PROCEEDINGS OF 2008 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2008: 794+
  • [27] High-dimensional dynamics of generalization error in neural networks
    Advani, Madhu S.
    Saxe, Andrew M.
    Sompolinsky, Haim
    NEURAL NETWORKS, 2020, 132 : 428 - 446
  • [28] Propositional lower bounds: Generalization and algorithms
    Cadoli, M
    Palopoli, L
    Scarcello, F
    LOGICS IN ARTIFICIAL INTELLIGENCE, 1998, 1489 : 355 - 367
  • [29] A tight upper bound on the generalization error of feedforward neural networks
    Sarraf, Aydin
    NEURAL NETWORKS, 2020, 127 : 1 - 6
  • [30] Generalization Bounds for Uniformly Stable Algorithms
    Feldman, Vitaly
    Vondrak, Jan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31