Counting Tensor Rank Decompositions

Cited by: 4
Authors
Obster, Dennis [1 ]
Sasakura, Naoki [1 ]
Affiliations
[1] Kyoto Univ, Yukawa Inst Theoret Phys, Kitashirakawa, Sakyo Ku, Kyoto 6068502, Japan
Keywords
canonical tensor model; tensor rank decomposition; quantum gravity; numerical methods;
DOI
10.3390/universe7080302
Chinese Library Classification
P1 [Astronomy];
Discipline Code
0704
Abstract
Tensor rank decomposition is a useful tool for the geometric interpretation of the tensors in the canonical tensor model (CTM) of quantum gravity. In order to understand the stability of this interpretation, it is important to be able to estimate how many tensor rank decompositions can approximate a given tensor. More precisely, finding an approximate symmetric tensor rank decomposition of a symmetric tensor $Q$ with an error allowance $\Delta$ means finding vectors $\phi_i$ satisfying $\| Q - \sum_{i=1}^{R} \phi_i \otimes \phi_i \otimes \cdots \otimes \phi_i \|^2 \leq \Delta$. The volume of all such possible $\phi_i$ is an interesting quantity which measures the number of possible decompositions of a tensor $Q$ within the allowance. While it would be difficult to evaluate this quantity for each $Q$, we find an explicit formula for a similar quantity by integrating over all $Q$ of unit norm. The expression, as a function of $\Delta$, is given by the product of a hypergeometric function and a power function. By combining new numerical analysis and previous results, we conjecture a formula for the critical rank, which yields an estimate for the spacetime degrees of freedom of the CTM. We also extend the formula to generic decompositions of non-symmetric tensors in order to make our results more broadly applicable. Interestingly, the derivation depends on the existence (convergence) of the partition function of a matrix model which previously appeared in the context of the CTM.
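To make the error criterion concrete, the following is a minimal NumPy sketch (not from the paper; function and variable names are illustrative) that evaluates the squared residual $\| Q - \sum_{i=1}^{R} \phi_i^{\otimes n} \|^2$ for a candidate symmetric rank-$R$ decomposition, so a candidate is accepted whenever the residual is at most $\Delta$.

```python
import numpy as np

def symmetric_rank_r_residual(Q, phis):
    """Squared Frobenius-norm error ||Q - sum_i phi_i^{otimes n}||^2
    for a candidate symmetric rank-R decomposition of an order-n tensor Q.
    (Illustrative helper, not code from the paper.)"""
    n = Q.ndim  # tensor order
    approx = np.zeros_like(Q, dtype=float)
    for phi in phis:
        term = phi
        for _ in range(n - 1):
            term = np.multiply.outer(term, phi)  # build phi^{otimes n}
        approx += term
    return float(np.sum((Q - approx) ** 2))

# Example: an order-3 symmetric tensor built from two random vectors,
# checked against its own exact decomposition.
rng = np.random.default_rng(0)
phis = [rng.standard_normal(4) for _ in range(2)]
Q = sum(np.multiply.outer(np.multiply.outer(p, p), p) for p in phis)
delta = symmetric_rank_r_residual(Q, phis)  # zero up to floating-point rounding
```

The set of all `phis` for which this residual stays below a given allowance $\Delta$ is exactly the region whose volume the paper studies.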
Pages: 26
Related Papers (50 in total)
  • [1] Tensor decompositions and rank increment conjecture
    Tyrtyshnikov, Eugene E.
    RUSSIAN JOURNAL OF NUMERICAL ANALYSIS AND MATHEMATICAL MODELLING, 2020, 35 (04): 239-246
  • [2] Low Rank Tensor Decompositions and Approximations
    Nie, Jiawang; Wang, Li; Zheng, Zequn
    JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2024, 12 (04): 847-873
  • [3] On the average condition number of tensor rank decompositions
    Breiding, Paul; Vannieuwenhoven, Nick
    IMA JOURNAL OF NUMERICAL ANALYSIS, 2020, 40 (03): 1908-1936
  • [4] Rank Properties and Computational Methods for Orthogonal Tensor Decompositions
    Zeng, Chao
    JOURNAL OF SCIENTIFIC COMPUTING, 2023, 94 (01)
  • [5] Accelerated Low-rank Updates to Tensor Decompositions
    Baskaran, Muthu; Langston, M. Harper; Ramananandro, Tahina; Bruns-Smith, David; Henretty, Tom; Ezick, James; Lethin, Richard
    2016 IEEE HIGH PERFORMANCE EXTREME COMPUTING CONFERENCE (HPEC), 2016
  • [6] Complexity and Rank of Double Cones and Tensor Product Decompositions
    Panyushev, D. I.
    COMMENTARII MATHEMATICI HELVETICI, 1993, 68 (03): 455-468
  • [7] Tensor Regression Using Low-Rank and Sparse Tucker Decompositions
    Ahmed, Talal; Raja, Haroon; Bajwa, Waheed U.
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (04): 944-966