Information theoretic limits of learning a sparse rule

Cited by: 0
Authors
Luneau, Clement [1 ]
Macris, Nicolas [1 ]
Barbier, Jean [2 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Lausanne, Switzerland
[2] Abdus Salam Int Ctr Theoret Phys, Trieste, Italy
Keywords
MUTUAL INFORMATION; TIGHT BOUNDS; SHARP BOUNDS; CAPACITY; ERROR
DOI
N/A
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We consider generalized linear models in regimes where the number of nonzero components of the signal and the number of accessible data points are sublinear with respect to the size of the signal. We prove a variational formula for the asymptotic mutual information per sample as the system size grows to infinity. This result allows us to derive an expression for the minimum mean-square error (MMSE) of the Bayesian estimator when the signal entries have a discrete distribution with finite support. We find that, for such signals and suitable vanishing scalings of the sparsity and sampling rate, the MMSE is a nonincreasing, piecewise-constant function of the sampling rate. In specific instances the MMSE even displays an all-or-nothing phase transition, that is, the MMSE jumps sharply from its maximum value to zero at a critical sampling rate. The all-or-nothing phenomenon had previously been shown to occur in high-dimensional linear regression. Our analysis goes beyond the linear case and applies to learning the weights of a perceptron with a general activation function in a teacher-student scenario. In particular, we discuss an all-or-nothing phenomenon for the generalization error with a sublinear set of training examples.
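For orientation, here is a minimal sketch of the teacher-student generalized linear model the abstract describes, in standard notation; the symbols n, k, m, W, \varphi and the 1/\sqrt{k} normalization are illustrative assumptions, not taken from this record. A hidden signal X^\star \in \mathbb{R}^n with k nonzero entries generates m observations through

    Y_\mu = \varphi\!\left( \frac{1}{\sqrt{k}} \sum_{i=1}^{n} W_{\mu i} X_i^\star,\ A_\mu \right), \qquad \mu = 1, \dots, m,

where W has i.i.d. standard Gaussian entries, A_\mu is channel noise, and both k and m grow sublinearly in n. In this notation the all-or-nothing transition reads

    \lim_{n \to \infty} \mathrm{MMSE}(\alpha) =
    \begin{cases}
      \mathrm{MMSE}_{\max}, & \alpha < \alpha_c, \\
      0, & \alpha > \alpha_c,
    \end{cases}

with \alpha a suitably normalized sampling rate and \alpha_c its critical value. Linear regression corresponds to \varphi(x, a) = x + a, while a noiseless perceptron teacher corresponds to \varphi(x, a) = \mathrm{sign}(x).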
Pages: 12
Related Papers
50 records in total
  • [1] Information theoretic limits of learning a sparse rule
    Luneau, Clement
    Macris, Nicolas
    Barbier, Jean
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2022, 2022 (04)
  • [2] On Support Recovery With Sparse CCA: Information Theoretic and Computational Limits
    Laha, Nilanjana
    Mukherjee, Rajarshi
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2023, 69 (03) : 1695 - 1738
  • [3] On the Information-Theoretic Limits of Noisy Sparse Phase Retrieval
    Truong, Lan V.
    Scarlett, Jonathan
    2019 IEEE INFORMATION THEORY WORKSHOP (ITW), 2019, : 329 - 333
  • [4] On the Information Theoretic Limits of Learning Ising Models
    Shanmugam, Karthikeyan
    Tandon, Rashish
    Dimakis, Alexandros G.
    Ravikumar, Pradeep
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [5] Information-theoretic limits on sparse support recovery: Dense versus sparse measurements
    Wang, Wei
    Wainwright, Martin J.
    Ramchandran, Kannan
    2008 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-6, 2008, : 2197 - 2201
  • [6] An information theoretic sparse kernel algorithm for online learning
    Fan, Haijin
    Song, Qing
    Xu, Zhao
    EXPERT SYSTEMS WITH APPLICATIONS, 2014, 41 (09) : 4349 - 4359
  • [8] Information Theoretic Limits on Learning Stochastic Differential Equations
    Bento, Jose
    Ibrahimi, Morteza
    Montanari, Andrea
    2011 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2011, : 855 - 859
  • [9] Information Theoretic Limits of Data Shuffling for Distributed Learning
    Attia, Mohamed Adel
    Tandon, Ravi
    2016 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2016
  • [10] Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices
    Wang, Wei
    Wainwright, Martin J.
    Ramchandran, Kannan
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2010, 56 (06) : 2967 - 2979