Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems

Cited by: 29
Authors
Bourrier, Anthony [1 ]
Davies, Mike E. [2 ]
Peleg, Tomer [3 ]
Perez, Patrick [4 ]
Gribonval, Remi [5 ]
Affiliations
[1] Gipsa Lab, F-38400 St Martin d'Heres, France
[2] Univ Edinburgh, Edinburgh EH8 9YL, Midlothian, Scotland
[3] Technion Israel Inst Technol, IL-32000 Haifa, Israel
[4] Technicolor, F-35576 Cesson Sevigne, France
[5] Inria Rennes Bretagne Atlantique, F-35042 Rennes, France
Funding
European Research Council;
Keywords
Linear inverse problems; instance optimality; null space property; restricted isometry property; RANDOM PROJECTIONS; SIGNAL RECOVERY; RECONSTRUCTION; UNION; MODEL;
DOI
10.1109/TIT.2014.2364403
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The primary challenge in linear inverse problems is to design stable and robust decoders that reconstruct high-dimensional vectors from low-dimensional observations obtained through a linear operator. Sparsity, low rank, and related assumptions are typically exploited to design decoders, whose performance is then bounded in terms of some measure of deviation from the idealized model, typically a norm. This paper characterizes the fundamental performance limits that can be expected from an ideal decoder given a general model, i.e., a general subset of simple vectors of interest. First, we extend the notion of instance optimality of a decoder to settings where one only wishes to reconstruct part of the original high-dimensional vector from a low-dimensional observation. This covers practical settings such as medical imaging of a region of interest, or audio source separation when one is only interested in estimating the contribution of a specific instrument to a musical recording. We define instance optimality relative to a model far beyond the traditional framework of sparse recovery, and characterize the existence of an instance optimal decoder in terms of joint properties of the model and the considered linear operator. Both noiseless and noise-robust settings are considered. We show, somewhat surprisingly, that the existence of noise-aware instance optimal decoders for all noise levels implies the existence of a noise-blind decoder. A consequence of our results is that, for models rich enough to contain an orthonormal basis, an ℓ2/ℓ2 instance optimal decoder can exist only when the linear operator is not substantially dimension-reducing. This covers well-known cases (sparse vectors, low-rank matrices) as well as a number of seemingly new situations (structured sparsity and sparse inverse covariance matrices, for instance).
We exhibit an operator-dependent norm which, under a model-specific generalization of the restricted isometry property, always yields a feasible instance optimality property. This norm can be upper bounded by an atomic norm relative to the considered model.
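The restricted isometry property mentioned in the abstract, in its classical sparse-vector form, requires that the linear operator approximately preserve the norm of every k-sparse vector. The following minimal numpy sketch (not part of the paper; the function name `empirical_rip_delta` and the dimensions are illustrative choices) estimates a lower bound on the restricted isometry constant of a random Gaussian operator by sampling random k-sparse unit vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_rip_delta(A, k, trials=2000, rng=rng):
    """Lower-bound the restricted isometry constant delta_k of A by
    measuring max |  ||A x||^2 - 1 | over random k-sparse unit vectors x.
    The true delta_k is a supremum over all k-sparse x, so this sampled
    value can only underestimate it."""
    m, n = A.shape
    worst = 0.0
    for _ in range(trials):
        support = rng.choice(n, size=k, replace=False)  # random support set
        x = np.zeros(n)
        x[support] = rng.standard_normal(k)
        x /= np.linalg.norm(x)                          # unit-norm k-sparse vector
        worst = max(worst, abs(np.linalg.norm(A @ x) ** 2 - 1.0))
    return worst

# i.i.d. Gaussian operator, scaled so E||A x||^2 = ||x||^2
n, m, k = 256, 128, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
delta = empirical_rip_delta(A, k)
print(f"empirical delta_{k} lower bound: {delta:.3f}")
```

With these proportions (m = n/2, small k), the empirical deviation stays well below 1, consistent with the regime in which sparse recovery guarantees hold; shrinking m or growing k pushes the estimate toward 1, where such guarantees break down.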
Pages: 7928-7946
Page count: 19
Related Papers
50 records
  • [21] GRADIENT SCAN GIBBS SAMPLER: AN EFFICIENT HIGH-DIMENSIONAL SAMPLER APPLICATION IN INVERSE PROBLEMS
    Orieux, F.
    Feron, O.
    Giovannelli, J.-F.
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 4085 - 4089
  • [22] Boosting for high-dimensional linear models
    Buhlmann, Peter
    ANNALS OF STATISTICS, 2006, 34 (02): : 559 - 583
  • [23] High-Dimensional Sparse Linear Bandits
    Hao, Botao
    Lattimore, Tor
    Wang, Mengdi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [24] Containment problems in high-dimensional spaces
    Ishigami, Y
    GRAPHS AND COMBINATORICS, 1995, 11 (04) : 327 - 335
  • [25] A TENSOR APPROXIMATION METHOD BASED ON IDEAL MINIMAL RESIDUAL FORMULATIONS FOR THE SOLUTION OF HIGH-DIMENSIONAL PROBLEMS
    Billaud-Friess, M.
    Nouy, A.
    Zahm, O.
    ESAIM-MATHEMATICAL MODELLING AND NUMERICAL ANALYSIS-MODELISATION MATHEMATIQUE ET ANALYSE NUMERIQUE, 2014, 48 (06): : 1777 - 1806
  • [26] Power in High-Dimensional Testing Problems
    Kock, Anders Bredahl
    Preinerstorfer, David
    ECONOMETRICA, 2019, 87 (03) : 1055 - 1069
  • [27] FUNDAMENTAL BARRIERS TO HIGH-DIMENSIONAL REGRESSION WITH CONVEX PENALTIES
    Celentano, Michael
    Montanari, Andrea
    ANNALS OF STATISTICS, 2022, 50 (01): : 170 - 196
  • [28] Randomized physics-informed machine learning for uncertainty quantification in high-dimensional inverse problems
    Zong, Yifei
    Barajas-Solano, David
    Tartakovsky, Alexandre M.
    JOURNAL OF COMPUTATIONAL PHYSICS, 2024, 519
  • [29] A Distributed Block-Split Gibbs Sampler with Hypergraph Structure for High-Dimensional Inverse Problems
    Thouvenin, Pierre-Antoine
    Repetti, Audrey
    Chainais, Pierre
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2024, 33 (03) : 814 - 832
  • [30] Hierarchical tensor-product approximation to the inverse and related operators for high-dimensional elliptic problems
    Gavrilyuk, IP
    Hackbusch, W
    Khoromskij, BN
    COMPUTING, 2005, 74 (02) : 131 - 157