Provable Tensor-Train Format Tensor Completion by Riemannian Optimization

Cited by: 0
Authors
Cai, Jian-Feng [1 ]
Li, Jingyang [1 ]
Xia, Dong [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Math, Hong Kong, Peoples R China
Keywords
tensor completion; Riemannian gradient descent; tensor-train decomposition; tensor-train SVD; spectral initialization; NUCLEAR-NORM; MATRIX COMPLETION; UNCERTAINTY QUANTIFICATION; RANK; FACTORIZATION; ALGORITHMS; RECOVERY; MODELS;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline Code
0812;
Abstract
The tensor train (TT) format enjoys appealing advantages in handling structured high-order tensors. The past decade has witnessed wide applications of TT-format tensors across diverse disciplines, among which tensor completion has drawn considerable attention. Numerous fast algorithms, including the Riemannian gradient descent (RGrad), have been proposed for TT-format tensor completion. However, the theoretical guarantees of these algorithms are largely missing or sub-optimal, partly due to the complicated and recursive algebraic operations in TT-format decomposition. Moreover, existing results established for tensors of other formats, such as Tucker and CP, are inapplicable because the algorithms treating TT-format tensors are substantially different and more involved. In this paper, we provide, to the best of our knowledge, the first theoretical guarantees for the convergence of the RGrad algorithm for TT-format tensor completion, under a nearly optimal sample size condition. The RGrad algorithm converges linearly with a constant contraction rate that is free of the tensor condition number, without the need for re-conditioning. We also propose a novel approach, referred to as the sequential second-order moment method, to attain a warm initialization under a similar sample size requirement. As a byproduct, our results significantly refine prior analyses of the RGrad algorithm for matrix completion. Lastly, a statistically (near-)optimal rate is derived for the RGrad algorithm when the observed entries are corrupted by random sub-Gaussian noise. Numerical experiments confirm our theoretical findings and showcase the computational speedup gained by the TT-format decomposition.
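For readers unfamiliar with the TT format referenced throughout the abstract, the following minimal NumPy sketch illustrates the tensor-train decomposition via sequential SVDs (TT-SVD, in the sense of Oseledets, 2011) and its reconstruction on a small dense tensor. It is only a generic illustration of the TT format: it does not implement the RGrad algorithm, the spectral or sequential second-order moment initialization, or any method specific to this paper, and the function names and ranks below are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, ranks):
    # Factor a d-way array into TT cores G_k of shape (r_{k-1}, n_k, r_k)
    # via sequential truncated SVDs; boundary ranks are r_0 = r_d = 1.
    dims = tensor.shape
    d = len(dims)
    cores = []
    r_prev = 1
    unfolding = tensor.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r_k = min(ranks[k], len(s))                       # truncate to the target TT rank
        cores.append(U[:, :r_k].reshape(r_prev, dims[k], r_k))
        unfolding = (s[:r_k, None] * Vt[:r_k]).reshape(r_k * dims[k + 1], -1)
        r_prev = r_k
    cores.append(unfolding.reshape(r_prev, dims[-1], 1))  # last core absorbs the remainder
    return cores

def tt_to_full(cores):
    # Contract the chain of cores back into a full tensor.
    full = cores[0]                                       # shape (1, n_1, r_1)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.reshape(full.shape[1:-1])                 # drop the two boundary modes of size 1

# Tiny usage check: with untruncated internal ranks the reconstruction is exact
# up to floating-point error.
X = np.random.randn(4, 5, 6, 3)
cores = tt_svd(X, ranks=[4, 18, 3])
print([c.shape for c in cores])
print(np.allclose(tt_to_full(cores), X))
```

In a completion setting, such TT cores would only be fitted from partially observed entries, which is where the RGrad iterations and the initialization scheme analyzed in the paper come in.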
Pages: 1-77
Page count: 77
Related Papers
50 records in total
  • [1] Provable Tensor-Train Format Tensor Completion by Riemannian Optimization
    Cai, Jian-Feng
    Li, Jingyang
    Xia, Dong
    [J]. Journal of Machine Learning Research, 2022, 23
  • [2] RANDOMIZED ALGORITHMS FOR ROUNDING IN THE TENSOR-TRAIN FORMAT
    Al Daas, Hussam
    Ballard, Grey
    Cazeaux, Paul
    Hallman, Eric
    Miedlar, Agnieszka
    Pasha, Mirjeta
    Reid, Tim W.
    Saibaba, Arvind K.
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2023, 45 (01): : A74 - A95
  • [3] Robust Tensor Tracking With Missing Data Under Tensor-Train Format
    Le Trung Thanh
    Abed-Meraim, Karim
    Nguyen Linh Trung
    Hafiane, Adel
    [J]. 2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022, : 832 - 836
  • [4] HIGH-ORDER TENSOR COMPLETION FOR DATA RECOVERY VIA SPARSE TENSOR-TRAIN OPTIMIZATION
    Yuan, Longhao
    Zhao, Qibin
    Cao, Jianting
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018, : 1258 - 1262
  • [5] Gradient-based optimization for regression in the functional tensor-train format
    Gorodetsky, Alex A.
    Jakeman, John D.
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2018, 374 : 1219 - 1238
  • [6] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    [J]. 2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 3380 - 3384
  • [7] TENSOR-TRAIN DECOMPOSITION
    Oseledets, I. V.
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2011, 33 (05): : 2295 - 2317
  • [8] VARIANTS OF ALTERNATING LEAST SQUARES TENSOR COMPLETION IN THE TENSOR TRAIN FORMAT
    Grasedyck, Lars
    Kluge, Melanie
    Kraemer, Sebastian
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2015, 37 (05): : A2424 - A2450
  • [9] Tensor train completion: Local recovery guarantees via Riemannian optimization
    Budzinskiy, Stanislav
    Zamarashkin, Nikolai
    [J]. NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2023, 30 (06)
  • [10] AUTOMATIC DIFFERENTIATION FOR RIEMANNIAN OPTIMIZATION ON LOW-RANK MATRIX AND TENSOR-TRAIN MANIFOLDS
    Novikov, Alexander
    Rakhuba, Maxim
    Oseledets, Ivan
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44 (02): : A843 - A869