Generalization error bounds for iterative recovery algorithms unfolded as neural networks

Cited by: 1
Authors
Schnoor, Ekkehard [1 ]
Behboodi, Arash [2 ]
Rauhut, Holger [1 ]
Affiliations
[1] RWTH Aachen University, Chair for Mathematics of Information Processing, Aachen, Germany
[2] RWTH Aachen University, Institute for Theoretical Information Technology, Aachen, Germany
Keywords
compressive sensing; neural networks; iterative soft thresholding; generalization; Rademacher complexity; sample complexity; overcomplete dictionaries; sparse
DOI
10.1093/imaiai/iaad023
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Motivated by the learned iterative soft thresholding algorithm (LISTA), we introduce a general class of neural networks suitable for sparse reconstruction from few linear measurements. By allowing a wide range of degrees of weight-sharing between the layers, we enable a unified analysis for very different neural network types, ranging from recurrent ones to networks more similar to standard feedforward neural networks. Based on training samples, via empirical risk minimization, we aim at learning the optimal network parameters and thereby the optimal network that reconstructs signals from their low-dimensional linear measurements. We derive generalization bounds by analyzing the Rademacher complexity of hypothesis classes consisting of such deep networks, which also takes the thresholding parameters into account. We obtain estimates of the sample complexity that depend essentially only linearly on the number of parameters and on the depth. We apply our main result to obtain specific generalization bounds for several practical examples, including different algorithms for (implicit) dictionary learning and convolutional neural networks.
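To make the unrolling construction described in the abstract concrete, the following is a minimal NumPy sketch (not the authors' code) of an unfolded iterative soft-thresholding network. The class name UnfoldedISTA, the initialization at the classical ISTA matrices, and the share_weights flag are illustrative assumptions: the flag only shows the two extremes of weight-sharing (fully tied weights, recurrent-style, versus per-layer weights, feedforward-style), whereas the paper considers a whole range of degrees of weight-sharing in between. In LISTA, the matrices and thresholds below would be learned from training pairs of signals and measurements via empirical risk minimization.

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft-thresholding operator S_tau(x) = sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

class UnfoldedISTA:
    """ISTA unrolled into `depth` layers; weights may be shared across layers
    (recurrent-style) or kept separate per layer (feedforward-style)."""

    def __init__(self, A, depth=20, tau=0.05, share_weights=True):
        m, n = A.shape
        self.A = A
        self.depth = depth
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/||A||^2 keeps plain ISTA convergent
        # Initialize the learnable matrices at the classical ISTA values:
        # W1 = step * A^T acts on the measurements, W2 = I - step * A^T A on the iterate.
        W1, W2 = step * A.T, np.eye(n) - step * A.T @ A
        copies = 1 if share_weights else depth
        self.W1 = [W1.copy() for _ in range(copies)]
        self.W2 = [W2.copy() for _ in range(copies)]
        self.tau = [tau] * copies                 # per-layer thresholds are parameters too

    def forward(self, y):
        """Reconstruct a sparse vector x from measurements y = A x (possibly noisy)."""
        x = np.zeros(self.A.shape[1])
        for k in range(self.depth):
            i = 0 if len(self.W1) == 1 else k     # pick shared or layer-specific weights
            x = soft_threshold(self.W2[i] @ x + self.W1[i] @ y, self.tau[i])
        return x

# Usage: recover a 5-sparse vector in R^200 from 50 random Gaussian measurements.
rng = np.random.default_rng(0)
m, n, s = 50, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ x_true
net = UnfoldedISTA(A, depth=50, tau=0.01, share_weights=True)
x_hat = net.forward(y)
print("relative reconstruction error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```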
Pages: 33
Related Articles
50 items in total
  • [1] Generalization Error Bounds for Noisy, Iterative Algorithms
    Pensia, Ankit
    Jog, Varun
    Loh, Po-Ling
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 546 - 550
  • [2] Generalization Error Bounds for Noisy, Iterative Algorithms via Maximal Leakage
    Issa, Ibrahim
    Esposito, Amedeo Roberto
    Gastpar, Michael
    THIRTY SIXTH ANNUAL CONFERENCE ON LEARNING THEORY, VOL 195, 2023, 195
  • [3] Generalization error bounds for Bayesian mixture algorithms
    Meir, R
    Zhang, T
    JOURNAL OF MACHINE LEARNING RESEARCH, 2004, 4 (05) : 839 - 860
  • [4] Generalization Error Bounds for Optimization Algorithms via Stability
    Meng, Qi
    Wang, Yue
    Chen, Wei
    Wang, Taifeng
    Ma, Zhi-Ming
    Liu, Tie-Yan
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2336 - 2342
  • [5] Regression with Deep Neural Networks: Generalization Error Guarantees, Learning Algorithms, and Regularizers
    Amjad, Jaweria
    Lyu, Zhaoyan
    Rodrigues, Miguel R. D.
    29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021), 2021, : 1481 - 1485
  • [6] On Generalization Bounds of a Family of Recurrent Neural Networks
    Chen, Minshuo
    Li, Xingguo
    Zhao, Tuo
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 1233 - 1242
  • [7] Generalization and risk bounds for recurrent neural networks
    Cheng, Xuewei
    Huang, Ke
    Ma, Shujie
    NEUROCOMPUTING, 2025, 616
  • [8] Upper Bounds on the Generalization Error of Private Algorithms for Discrete Data
    Rodriguez-Galvez, Borja
    Bassi, German
    Skoglund, Mikael
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2021, 67 (11) : 7362 - 7379
  • [9] On the Error Bounds for ReLU Neural Networks
    Katende, Ronald
    Kasumba, Henry
    Kakuba, Godwin
    Mango, John
    IAENG INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS, 2024, 54 (12) : 2602 - 2611
  • [10] Error bounds for approximation with neural networks
    Burger, M
    Neubauer, A
    JOURNAL OF APPROXIMATION THEORY, 2001, 112 (02) : 235 - 250