Lossy Compression via Sparse Linear Regression: Performance Under Minimum-Distance Encoding

Cited by: 9
Authors
Venkataramanan, Ramji [1 ]
Joseph, Antony [2 ]
Tatikonda, Sekhar [3 ]
Affiliations
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
[2] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94704 USA
[3] Yale Univ, Dept Elect Engn, New Haven, CT 06511 USA
Funding
US National Science Foundation;
Keywords
Lossy compression; Gaussian sources; squared error distortion; rate-distortion function; error exponent; sparse regression; THEORETIC LIMITS; ERROR EXPONENT; CODES; RECOVERY;
DOI
10.1109/TIT.2014.2313085
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
We study a new class of codes for lossy compression with the squared-error distortion criterion, designed using the statistical framework of high-dimensional linear regression. Codewords are linear combinations of subsets of columns of a design matrix. Called a sparse superposition or sparse regression codebook, this structure is motivated by an analogous construction proposed recently by Barron and Joseph for communication over an additive white Gaussian noise channel. For independent and identically distributed (i.i.d.) Gaussian sources and minimum-distance encoding, we show that such a code can attain the Shannon rate-distortion function with the optimal error exponent, for all distortions below a specified value. It is also shown that sparse regression codes are robust in the following sense: a codebook designed to compress an i.i.d. Gaussian source of variance σ² with (squared-error) distortion D can compress any ergodic source of variance less than σ² to within distortion D. Thus, the sparse regression ensemble retains many of the good covering properties of the i.i.d. random Gaussian ensemble, while having a compact representation in terms of a matrix whose size is a low-order polynomial in the block length.
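As a concrete illustration of the codebook structure described in the abstract, the following is a minimal Python sketch (not taken from the paper) of a sparse regression codebook with brute-force minimum-distance encoding. The design matrix is split into L sections of M columns each, a codeword is a scaled sum of one column per section, and the encoder picks the codeword closest in squared error to the source block. The parameter values, the N(0, 1) entry distribution, and the scaling c are assumptions chosen for readability, not the authors' exact construction.

# Minimal sketch of a sparse regression (sparse superposition) codebook with
# brute-force minimum-distance encoding. Illustration only: parameter values,
# the N(0, 1) entry distribution, and the scaling c are assumptions, not the
# paper's exact construction.
import itertools
import numpy as np

rng = np.random.default_rng(0)

n, L, M = 24, 4, 8              # block length, number of sections, columns per section (toy sizes)
sigma2, D = 1.0, 0.5            # source variance and target squared-error distortion
R = L * np.log(M) / n           # rate in nats per source sample

# Design matrix with i.i.d. Gaussian entries; each codeword is c times the sum
# of one column chosen from each of the L sections.
A = rng.normal(0.0, 1.0, size=(n, L * M))
c = np.sqrt((sigma2 - D) / L)   # so each codeword entry has variance roughly sigma2 - D

x = rng.normal(0.0, np.sqrt(sigma2), size=n)   # i.i.d. Gaussian source block

# Minimum-distance encoding: exhaustive search over all M**L codewords.
# Exponential in L, so only feasible at toy sizes; the paper analyses this
# optimal encoder, not a computationally efficient one, and its guarantees
# are asymptotic in the block length.
best_dist, best_choice = np.inf, None
for choice in itertools.product(range(M), repeat=L):
    codeword = c * sum(A[:, sec * M + col] for sec, col in enumerate(choice))
    dist = np.mean((x - codeword) ** 2)
    if dist < best_dist:
        best_dist, best_choice = dist, choice

print(f"rate = {R:.3f} nats/sample, achieved per-sample distortion = {best_dist:.3f}")

Note that the rate here is R = (L log M)/n, so the codebook of M**L codewords is described by a design matrix with only n·LM entries, which is the compact-representation property mentioned in the abstract.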
Pages: 3254 - 3264
Number of pages: 11
Related papers
50 items in total
  • [41] MINIMUM DISTANCE METHODS BASED ON QUADRATIC DISTANCES FOR TRANSFORMS IN SIMPLE LINEAR-REGRESSION MODELS
    Luong, A.
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-METHODOLOGICAL, 1991, 53 (02): : 465 - 471
  • [42] Semi-Supervised Feature Selection via Sparse Rescaled Linear Square Regression
    Chen, Xiaojun
    Yuan, Guowen
    Nie, Feiping
    Ming, Zhong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, 32 (01) : 165 - 176
  • [43] Estimating Multiple Frequency-Hopping Signal Parameters via Sparse Linear Regression
    Angelosante, Daniele
    Giannakis, Georgios B.
    Sidiropoulos, Nicholas D.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2010, 58 (10) : 5044 - 5056
  • [44] Gaussian Rate-Distortion via Sparse Linear Regression over Compact Dictionaries
    Venkataramanan, Ramji
    Joseph, Antony
    Tatikonda, Sekhar
    2012 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2012, : 368 - 372
  • [45] Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP
    Kale, Satyen
    Karnin, Zohar
    Liang, Tengyuan
    Pal, David
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [46] SPARSE MINIMUM AVERAGE VARIANCE ESTIMATION VIA QUANTILE REGRESSION GROUP VARIABLE SELECTION PENALTIES (GLQMAVE)
    Ahmed, Rihab
    Rodeen, Waleed
    INTERNATIONAL JOURNAL OF AGRICULTURAL AND STATISTICAL SCIENCES, 2021, 17 : 2395 - 2401
  • [47] Dimensionality reduction based on multi-local linear regression and global subspace projection distance minimum
    Haidong Huang
    Zhengming Ma
    Guokai Zhang
    Huibin Wu
    Pattern Analysis and Applications, 2021, 24 : 1713 - 1730
  • [48] Dimensionality reduction based on multi-local linear regression and global subspace projection distance minimum
    Huang, Haidong
    Ma, Zhengming
    Zhang, Guokai
    Wu, Huibin
    PATTERN ANALYSIS AND APPLICATIONS, 2021, 24 (04) : 1713 - 1730
  • [49] SMLR-Type Blind Deconvolution of Sparse Pulse Sequences Under a Minimum Temporal Distance Constraint
    Kail, Georg
    Hlawatsch, Franz
    Novak, Clemens
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2015, 63 (18) : 4838 - 4853
  • [50] Linear Growing Minimum Distance of Ultra-Sparse Non-Binary Cluster-LDPC Codes
    Savin, Valentin
    Declercq, David
    2011 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2011, : 523 - 527