Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities

Cited: 0
Authors
Zadik, Ilias [3 ]
Polyanskiy, Yury [1 ]
Thrampoulidis, Christos [2 ]
Affiliations
[1] MIT, Dept EECS, 77 Massachusetts Ave, Cambridge, MA 02139 USA
[2] UC Santa Barbara, Dept ECE, Santa Barbara, CA USA
[3] MIT, Operat Res Ctr, Cambridge, MA 02139 USA
DOI
10.1109/isit.2019.8849764
CLC number: TP [Automation technology; computer technology]
Discipline code: 0812
Abstract
We consider the Gaussian multiple-access channel with two critical departures from the classical asymptotics: (a) the number of users is proportional to the block-length, and (b) each user sends a fixed number of data bits. We provide improved bounds on the trade-off between the user density and the energy-per-bit. Interestingly, in this information-theoretic problem we rely on Gordon's lemma from Gaussian process theory. From the engineering standpoint, we discover a surprising new effect: good coded-access schemes can achieve perfect multi-user interference cancellation at low user density. In addition, by a similar method we analyze the limits of false discovery in the binary sparse regression problem, in the asymptotic regime where the number of measurements goes to infinity at fixed ratios to the problem dimension, sparsity, and noise level. Our rigorous bound matches the formal replica-method prediction, for some range of parameters, to imperceptible numerical precision.
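As a pointer to the kind of Gaussian-process tool the abstract invokes, Gordon's comparison inequality yields the classical non-asymptotic bound E[σ_min(G)] ≥ √m − √n for an m×n matrix G with i.i.d. standard normal entries (m ≥ n). A minimal Monte Carlo check of that consequence (the dimensions and trial count below are illustrative choices, not parameters from the paper):

```python
import numpy as np

# Gordon's comparison inequality implies E[sigma_min(G)] >= sqrt(m) - sqrt(n)
# for an m x n matrix G with i.i.d. N(0, 1) entries, m >= n.
rng = np.random.default_rng(0)
m, n, trials = 200, 50, 200  # illustrative sizes, not from the paper

# Empirical mean of the smallest singular value over independent draws.
sigma_min_mean = np.mean([
    np.linalg.svd(rng.standard_normal((m, n)), compute_uv=False)[-1]
    for _ in range(trials)
])

gordon_lower_bound = np.sqrt(m) - np.sqrt(n)  # about 7.07 for these sizes
print(sigma_min_mean, gordon_lower_bound)
assert sigma_min_mean > gordon_lower_bound
```

The empirical mean sits slightly above the √m − √n line, as the inequality predicts; the paper uses Gordon's lemma in a much more refined way to bound the user-density/energy-per-bit trade-off.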
Pages: 430-434
Page count: 5
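The abstract's second result concerns false discovery in binary sparse regression, where a k-sparse 0/1 signal is estimated from noisy Gaussian measurements. A toy sketch of that estimation problem, assuming exhaustive maximum-likelihood search over supports (all dimensions and the noise level below are arbitrary illustrations, and brute force is only feasible at this tiny scale, nothing like the paper's asymptotic regime):

```python
import numpy as np
from itertools import combinations

# Toy binary sparse regression: y = X @ beta + sigma * z, where beta is a
# k-sparse vector with 0/1 entries. Since nonzero coefficients equal 1,
# maximum likelihood reduces to picking the k-subset of columns whose sum
# best fits y in least squares.
rng = np.random.default_rng(1)
n, k, m, sigma = 12, 2, 30, 0.05  # illustrative toy scale

support = set(rng.choice(n, size=k, replace=False).tolist())
beta = np.zeros(n)
beta[list(support)] = 1.0

X = rng.standard_normal((m, n)) / np.sqrt(m)  # columns of roughly unit norm
y = X @ beta + sigma * rng.standard_normal(m)

# Exhaustive ML: minimize the residual norm over all k-subsets of columns.
best = min(combinations(range(n), k),
           key=lambda S: np.linalg.norm(y - X[:, list(S)].sum(axis=1)))
false_discoveries = len(set(best) - support)
print(false_discoveries)
```

At this low noise level the ML estimate recovers the support exactly (zero false discoveries); the paper's contribution is a rigorous bound on false discovery in the proportional regime, matching the replica prediction.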
Related papers (50 total)
  • [1] Sparse Additive Gaussian Process Regression
    Luo, Hengrui; Nattino, Giovanni; Pratola, Matthew T.
    Journal of Machine Learning Research, 2022, 23
  • [2] Sparse greedy Gaussian process regression
    Smola, AJ; Bartlett, P
    Advances in Neural Information Processing Systems 13, 2001, 13: 619-625
  • [3] Sparse Spectrum Gaussian Process Regression
    Lazaro-Gredilla, Miguel; Quinonero-Candela, Joaquin; Rasmussen, Carl Edward; Figueiras-Vidal, Anibal R.
    Journal of Machine Learning Research, 2010, 11: 1865-1881
  • [4] Sparse Multivariate Gaussian Mixture Regression
    Weruaga, Luis; Via, Javier
    IEEE Transactions on Neural Networks and Learning Systems, 2015, 26 (05): 1098-1108
  • [5] Recursive estimation for sparse Gaussian process regression
    Schuerch, Manuel; Azzimonti, Dario; Benavoli, Alessio; Zaffalon, Marco
    Automatica, 2020, 120
  • [6] Sparse Inverse Kernel Gaussian Process Regression
    Das, Kamalika; Srivastava, Ashok N.
    Statistical Analysis and Data Mining, 2013, 6 (03): 205-220
  • [7] Efficient Optimization for Sparse Gaussian Process Regression
    Cao, Yanshuai; Brubaker, Marcus A.; Fleet, David J.; Hertzmann, Aaron
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37 (12): 2415-2427
  • [8] Incremental Variational Sparse Gaussian Process Regression
    Cheng, Ching-An; Boots, Byron
    Advances in Neural Information Processing Systems 29 (NIPS 2016), 2016, 29
  • [9] Moderate deviations inequalities for Gaussian process regression
    Li, Jialin; Ryzhov, Ilya O.
    Journal of Applied Probability, 2024, 61 (01): 172-197