On the stable recovery of deep structured linear networks under sparsity constraints

Cited: 0
Authors
Malgouyres, Francois [1 ,2 ,3 ,4 ]
Affiliations
[1] Inst Math Toulouse, F-31062 Toulouse 9, France
[2] Univ Toulouse, UMR5219, F-31062 Toulouse 9, France
[3] CNRS, UPS IMT, F-31062 Toulouse 9, France
[4] Inst Rech Technol St Exupery, Toulouse, France
Source
MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, 2020, VOL 107
Keywords
Stable recovery; deep structured linear networks; convolutional linear networks; feature robustness
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider deep structured linear networks under sparsity constraints. We study sharp conditions guaranteeing the stability of the optimal parameters defining the network. More precisely, we provide sharp conditions on the network architecture and the sample under which the error on the parameters defining the network scales linearly with the reconstruction error (i.e., the risk). Under these conditions, the weights obtained by a successful algorithm are well defined and depend only on the architecture of the network and the sample, and the features in the latent spaces are stably defined. This stability property is required in order to interpret the features defined in the latent spaces; it can also lead to a guarantee on the statistical risk. This is what motivates the study. The analysis is based on the recently proposed Tensorial Lifting. The particularity of this paper is to consider a sparsity prior, which leads to a better stability constant. As an illustration, we detail the analysis and provide sharp stability guarantees for convolutional linear networks under a sparsity prior. In this analysis, we distinguish the roles of the network architecture and the sample input, which highlights the requirements on the data in connection with parameter stability.
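To make the setting concrete, here is a minimal Python sketch, assuming the network is the end-to-end map X = M_K ... M_1 in which each factor M_k has a fixed support (the architecture) and its nonzero entries are the learnable parameters. The sizes, the random supports, and the names below (network, supports) are illustrative assumptions, not the paper's notation or method.

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 3, 8                                              # depth and width (illustrative)
supports = [rng.random((n, n)) < 0.3 for _ in range(K)]  # fixed sparsity patterns

def network(params):
    """End-to-end linear map X = M_K @ ... @ M_1, with each factor's
    nonzero entries placed on its fixed support."""
    X = np.eye(n)
    for S, w in zip(supports, params):
        M = np.zeros((n, n))
        M[S] = w                                         # parameters live only on the support
        X = M @ X
    return X

# Two nearby parameter sets and their end-to-end maps.
p1 = [rng.standard_normal(int(S.sum())) for S in supports]
p2 = [w + 1e-3 * rng.standard_normal(w.shape) for w in p1]
recon_err = np.linalg.norm(network(p1) - network(p2))    # reconstruction error
param_err = sum(np.linalg.norm(a - b) for a, b in zip(p1, p2))
# Stability, in the sense of the abstract, means param_err is bounded by a
# constant times recon_err, up to the network's invariances (e.g., rescaling
# weights between consecutive layers), under conditions on the architecture
# and the sample.
print(recon_err, param_err)
```

The point of such conditions is to rule out architectures and samples for which parameter sets that differ beyond these invariances still produce nearly identical end-to-end maps, in which case the latent features would not be stably defined.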
Pages
107 - 127 (21 pages)
Related Papers
50 records in total
  • [41] Domain decomposition methods for linear inverse problems with sparsity constraints
    Fornasier, Massimo
    INVERSE PROBLEMS, 2007, 23 (06) : 2505 - 2526
  • [42] MAXIMUM LIKELIHOOD ESTIMATION UNDER PARTIAL SPARSITY CONSTRAINTS
    Routtenberg, Tirza
    Eldar, Yonina C.
    Tong, Lang
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013 : 6421 - 6425
  • [43] Deep convolutional network with locality and sparsity constraints for texture classification
    Bu, Xingyuan
    Wu, Yuwei
    Gao, Zhi
    Jia, Yunde
    PATTERN RECOGNITION, 2019, 91 : 34 - 46
  • [44] Multi-party Speech Recovery Exploiting Structured Sparsity Models
    Asaei, Afsaneh
    Taghizadeh, Mohammad J.
    Bourlard, Herve
    Cevher, Volkan
    12TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2011 (INTERSPEECH 2011), VOLS 1-5, 2011 : 192 - 195
  • [45] Structured and Deep Similarity Matching via Structured and Deep Hebbian Networks
    Obeid, Dina
    Ramambason, Hugo
    Pehlevan, Cengiz
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [46] Recovery algorithms for vector-valued data with joint sparsity constraints
    Fornasier, Massimo
    Rauhut, Holger
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2008, 46 (02) : 577 - 613
  • [47] Information Theoretic Limits for Linear Prediction with Graph-Structured Sparsity
    Barik, Adarsh
    Honorio, Jean
    Tawarmalani, Mohit
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017, : 2348 - 2352
  • [48] Beyond convergence rates: exact recovery with the Tikhonov regularization with sparsity constraints
    Lorenz, D. A.
    Schiffler, S.
    Trede, D.
    INVERSE PROBLEMS, 2011, 27 (08)
  • [49] A Nearly-Linear Time Framework for Graph-Structured Sparsity
    Hegde, Chinmay
    Indyk, Piotr
    Schmidt, Ludwig
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 928 - 937
  • [50] Deep belief networks with self-adaptive sparsity
    Qiao, Chen
    Yang, Lan
    Shi, Yan
    Fang, Hanfeng
    Kang, Yanmei
    APPLIED INTELLIGENCE, 2022, 52 (01) : 237 - 253