On the stable recovery of deep structured linear networks under sparsity constraints

Citations: 0
Author
Malgouyres, Francois [1 ,2 ,3 ,4 ]
Affiliations
[1] Inst Math Toulouse, F-31062 Toulouse 9, France
[2] Univ Toulouse, UMR5219, F-31062 Toulouse 9, France
[3] CNRS, UPS IMT, F-31062 Toulouse 9, France
[4] Inst Rech Technol St Exupery, Toulouse, France
Source
MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, 2020, Vol. 107
Keywords
Stable recovery; deep structured linear networks; convolutional linear networks; feature robustness
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We consider a deep structured linear network under sparsity constraints. We study sharp conditions guaranteeing the stability of the optimal parameters defining the network. More precisely, we provide sharp conditions on the network architecture and the sample under which the error on the parameters defining the network scales linearly with the reconstruction error (i.e., the risk). Under these conditions, the weights obtained with a successful algorithm are therefore well defined and depend only on the architecture of the network and the sample, and the features in the latent spaces are stably defined. This stability property is required in order to interpret the features defined in the latent spaces, and it can also lead to a guarantee on the statistical risk; this is what motivates the study. The analysis is based on the recently proposed Tensorial Lifting. The particularity of this paper is to consider a sparsity prior, which leads to a better stability constant. As an illustration, we detail the analysis and provide sharp stability guarantees for convolutional linear networks under a sparsity prior. In this analysis, we distinguish the role of the network architecture from that of the sample input. This highlights the requirements on the data in connection with parameter stability.
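To make the abstract's setting concrete, here is a minimal NumPy sketch of a depth-3 structured linear network whose factors are sparse, together with the two quantities the stability result relates: the error on the parameters (factors) and the reconstruction error on a sample. The random supports, Frobenius norms, and dimensions below are illustrative assumptions of mine, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_sparse_factor(rng, shape, density=0.2):
    """Draw a matrix that is zero outside a random support (assumed sparsity pattern)."""
    mask = rng.random(shape) < density
    return mask * rng.standard_normal(shape)

# A "deep structured linear network" computes x -> W_L @ ... @ W_1 @ x,
# with each factor W_k constrained to be sparse.
L, n = 3, 8
factors = [random_sparse_factor(rng, (n, n)) for _ in range(L)]

# End-to-end linear map of the network: the product of the sparse factors.
W = np.linalg.multi_dot(factors)

# A sample X with targets generated by the true network.
X = rng.standard_normal((n, 100))
Y = W @ X

# Perturb each factor on its support. The paper's result says that, under its
# conditions on the architecture and the sample, the parameter error is
# controlled linearly by the reconstruction error (the risk) on the sample.
eps = 1e-3
perturbed = [Wk + eps * (Wk != 0) * rng.standard_normal(Wk.shape) for Wk in factors]
W_pert = np.linalg.multi_dot(perturbed)

recon_err = np.linalg.norm(W_pert @ X - Y)                         # risk on the sample
param_err = sum(np.linalg.norm(Wp - Wk)                            # error on the weights
                for Wp, Wk in zip(perturbed, factors))
```

The point of the paper, in these terms, is to characterize when `param_err` (up to the inherent scale ambiguities between factors) cannot be large while `recon_err` is small.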
Pages: 107 - 127 (21 pages)