On the stable recovery of deep structured linear networks under sparsity constraints

Citations: 0
Authors
Malgouyres, Francois [1 ,2 ,3 ,4 ]
Affiliations
[1] Inst Math Toulouse, F-31062 Toulouse 9, France
[2] Univ Toulouse, UMR5219, F-31062 Toulouse 9, France
[3] CNRS, UPS IMT, F-31062 Toulouse 9, France
[4] Inst Rech Technol St Exupery, Toulouse, France
Source
MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 107 | 2020 / Vol. 107
Keywords
Stable recovery; deep structured linear networks; convolutional linear networks; feature robustness
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider a deep structured linear network under sparsity constraints. We study sharp conditions guaranteeing the stability of the optimal parameters defining the network. More precisely, we provide sharp conditions on the network architecture and the sample under which the error on the parameters defining the network scales linearly with the reconstruction error (i.e. the risk). Under these conditions, the weights obtained with a successful algorithm are therefore well defined and depend only on the architecture of the network and the sample, and the features in the latent spaces are stably defined. This stability property is required in order to interpret the features defined in the latent spaces, and it can also lead to a guarantee on the statistical risk; this is what motivates the study. The analysis is based on the recently proposed Tensorial Lifting. The particularity of this paper is to consider a sparsity prior, which leads to a better stability constant. As an illustration, we detail the analysis and provide sharp stability guarantees for convolutional linear networks under a sparsity prior. In this analysis, we distinguish the role of the network architecture from that of the sample input. This highlights the requirements on the data in connection to parameter stability.
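The objects in the abstract can be made concrete with a small numerical sketch. The following is a hypothetical illustration (not the paper's construction): a depth-2 structured linear network whose end-to-end map is the product of two factors, each constrained to a fixed sparsity support playing the role of the architecture. It computes the reconstruction error between two networks alongside the error on their parameters; stable recovery, in the sense studied in the paper, asks that a small reconstruction error force a proportionally small parameter error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical depth-2 structured linear network: the end-to-end
# operator is X = M2 @ M1, and each factor is constrained to a fixed
# sparsity support (the "architecture" in the abstract's terminology).
n = 6
support1 = rng.random((n, n)) < 0.3  # sparsity pattern of layer 1
support2 = rng.random((n, n)) < 0.3  # sparsity pattern of layer 2

M1 = rng.standard_normal((n, n)) * support1
M2 = rng.standard_normal((n, n)) * support2
X = M2 @ M1  # end-to-end linear map defined by the network

# Perturb the parameters while staying on the same supports.
eps = 1e-3
M1p = M1 + eps * rng.standard_normal((n, n)) * support1
M2p = M2 + eps * rng.standard_normal((n, n)) * support2
Xp = M2p @ M1p

# Error on the parameters vs. error on the reconstructed operator.
param_err = np.linalg.norm(M1p - M1) + np.linalg.norm(M2p - M2)
recon_err = np.linalg.norm(Xp - X)
print(param_err, recon_err)
```

A small parameter perturbation always yields a small reconstruction error; the paper's stability conditions concern the converse direction, which can fail for a badly chosen architecture or sample (and holds only up to the inherent scale ambiguity between consecutive layers, since rescaling M1 by c and M2 by 1/c leaves X unchanged).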
Pages: 107 - 127
Number of pages: 21