Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format

Cited by: 6
Authors
Rohrbach, Paul B. [1 ]
Dolgov, Sergey [2 ]
Grasedyck, Lars [3 ]
Scheichl, Robert [4 ]
Affiliations
[1] Univ Cambridge, Dept Appl Math & Theoret Phys, Wilberforce Rd, Cambridge CB3 0WA, England
[2] Univ Bath, Dept Math Sci, Claverton Down, Bath BA2 7AY, Avon, England
[3] Rhein Westfal TH Aachen, Inst Geomet & Prakt Math, Templergraben 55, D-52056 Aachen, Germany
[4] Heidelberg Univ, Inst Appl Math, Neuenheimer Feld 205, D-69120 Heidelberg, Germany
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
tensor; Tensor-Train; high-dimensional; low rank; Gaussian probability distribution; DECOMPOSITION; ALGORITHM; TT;
DOI
10.1137/20M1314653
Chinese Library Classification
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
Low-rank tensor approximations have shown great potential for uncertainty quantification in high dimensions, for example, to build surrogate models that can be used to speed up large-scale inference problems [M. Eigel, M. Marschall, and R. Schneider, Inverse Problems, 34 (2018), 035010; S. Dolgov et al., Stat. Comput., 30 (2020), pp. 603-625]. The feasibility and efficiency of such approaches depend critically on the rank that is necessary to represent or approximate the underlying distribution. In this paper, a priori rank bounds for approximations in the functional Tensor-Train representation for the case of Gaussian models are developed. It is shown that under suitable conditions on the precision matrix, the Gaussian density can be approximated to high accuracy without suffering from an exponential growth of complexity as the dimension increases. These results provide a rigorous justification of the suitability and the limitations of low-rank tensor methods in a simple but important model case. Numerical experiments confirm that the rank bounds capture the qualitative behavior of the rank structure when varying the parameters of the precision matrix and the accuracy of the approximation. Finally, the practical relevance of the theoretical results is demonstrated in the context of a Bayesian filtering problem.
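The qualitative behavior described in the abstract can be checked numerically with a minimal TT-SVD sketch (not taken from the paper): discretize a Gaussian density whose precision matrix has weak off-diagonal coupling, then measure the TT ranks needed for a given accuracy. All concrete choices below (dimension d, grid size n, coupling strength 0.3, tolerance eps) are illustrative assumptions.

```python
import numpy as np

# Illustrative setup (not from the paper): d-dimensional Gaussian density
# p(x) ~ exp(-x^T A x / 2) with a tridiagonal precision matrix A,
# evaluated on a tensor grid with n points per axis.
d, n = 4, 15
x = np.linspace(-3.0, 3.0, n)
A = np.eye(d) + 0.3 * (np.eye(d, k=1) + np.eye(d, k=-1))

grid = np.meshgrid(*([x] * d), indexing="ij")
X = np.stack([g.ravel() for g in grid])                  # shape (d, n**d)
T = np.exp(-0.5 * np.einsum("ik,ij,jk->k", X, A, X)).reshape([n] * d)

def tt_ranks(T, eps=1e-6):
    """TT-SVD sweep: sequential truncated SVDs, returns ranks r_1..r_{d-1}."""
    d = T.ndim
    # per-unfolding truncation threshold for overall relative accuracy eps
    delta = eps / np.sqrt(d - 1) * np.linalg.norm(T)
    ranks = []
    C = T.reshape(T.shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        # tails[r] = Frobenius error of keeping only the first r singular values
        tails = np.concatenate([np.sqrt(np.cumsum(s[::-1] ** 2))[::-1], [0.0]])
        r = max(1, int(np.argmax(tails <= delta)))       # smallest admissible rank
        ranks.append(r)
        if k < d - 2:
            # carry the truncated factor S V^T into the next unfolding
            C = (s[:r, None] * Vt[:r]).reshape(r * T.shape[k + 1], -1)
    return ranks

print("TT ranks:", tt_ranks(T, eps=1e-6))
```

With weak coupling, the observed ranks stay far below the worst case and grow as the tolerance is tightened, consistent with the rank-structure behavior the paper analyzes. Note that forming the full tensor T (n**d entries) is only feasible for small d; it serves here purely as a diagnostic.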
Pages: 1191-1224 (34 pages)
Related Papers (50 total)
  • [21] Low tensor-train rank with total variation for magnetic resonance imaging reconstruction
    Chen, QiPeng; Cao, JianTing
    Science China Technological Sciences, 2021, (09): 1854-1862
  • [22] Tensor-Train decomposition for image recognition
    Brandoni, D.; Simoncini, V.
    Calcolo, 2020, 57 (01)
  • [23] Tensor-train ranks for matrices and their inverses
    Oseledets, Ivan; Tyrtyshnikov, Eugene; Zamarashkin, Nickolai
    Computational Methods in Applied Mathematics, 2011, 11 (03): 394-403
  • [24] Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis
    Sofuoglu, Seyyid Emre; Aviyente, Selin
    IEEE Signal Processing Letters, 2022, 29: 1152-1156
  • [26] A continuous analogue of the tensor-train decomposition
    Gorodetsky, Alex; Karaman, Sertac; Marzouk, Youssef
    Computer Methods in Applied Mechanics and Engineering, 2019, 347: 59-84
  • [27] Learning a Low Tensor-Train Rank Representation for Hyperspectral Image Super-Resolution
    Dian, Renwei; Li, Shutao; Fang, Leyuan
    IEEE Transactions on Neural Networks and Learning Systems, 2019, 30 (09): 2672-2683
  • [28] Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds
    Novikov, Alexander; Rakhuba, Maxim; Oseledets, Ivan
    SIAM Journal on Scientific Computing, 2022, 44 (02): A843-A869
  • [29] On Tensor-Train Ranks of Tensorized Polynomials
    Vysotsky, Lev
    Large-Scale Scientific Computing (LSSC 2019), 2020, 11958: 189-196
  • [30] Survey of the hierarchical equations of motion in tensor-train format for non-Markovian quantum dynamics
    Mangaud, Etienne; Jaouadi, Amine; Chin, Alex; Desouter-Lecomte, Michèle
    The European Physical Journal Special Topics, 2023, 232: 1847-1869