CROSS: EFFICIENT LOW-RANK TENSOR COMPLETION

Cited by: 39
Authors
Zhang, Anru [1 ]
Affiliation
[1] Univ Wisconsin, Dept Stat, Madison, WI 53706 USA
Source
ANNALS OF STATISTICS | 2019, Vol. 47, No. 02
Keywords
Cross tensor measurement; denoising; minimax rate-optimal; neuroimaging; tensor completion; MATRIX COMPLETION; OPTIMAL RATES; NORM; RECOVERY; RECONSTRUCTION; DIMENSIONALITY; DECOMPOSITIONS;
DOI
10.1214/18-AOS1694
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208; 070103; 0714;
Abstract
The completion of tensors, or high-order arrays, has attracted significant attention in recent research. The current literature on tensor completion primarily focuses on recovery from a set of uniformly randomly measured entries, and the number of measurements required to achieve recovery is not guaranteed to be optimal. In addition, the implementation of some previous methods is NP-hard. In this article, we propose a framework for low-rank tensor completion via a novel tensor measurement scheme that we name Cross. The proposed procedure is efficient and easy to implement. In particular, we show that a third-order tensor of Tucker rank (r1, r2, r3) in p1-by-p2-by-p3 dimensional space can be recovered from as few as r1·r2·r3 + r1(p1 − r1) + r2(p2 − r2) + r3(p3 − r3) noiseless measurements, which matches the sample-complexity lower bound. In the case of noisy measurements, we also develop a theoretical upper bound and the matching minimax lower bound on the recovery error over certain classes of low-rank tensors for the proposed procedure. The results can be further extended to fourth- and higher-order tensors. Simulation studies show that the method performs well under a variety of settings. Finally, the procedure is illustrated on a real neuroimaging dataset.
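The sample-complexity bound stated in the abstract matches the number of free parameters of a Tucker rank-(r1, r2, r3) tensor, and is easy to evaluate numerically. A minimal sketch (the function name `cross_sample_complexity` is illustrative, not from the paper) that computes the bound and contrasts it with the ambient number of entries:

```python
def cross_sample_complexity(p, r):
    """Number of noiseless Cross measurements sufficient to recover a
    third-order tensor of Tucker rank (r1, r2, r3) in p1-by-p2-by-p3
    space, per the bound in the abstract:
        r1*r2*r3 + r1*(p1 - r1) + r2*(p2 - r2) + r3*(p3 - r3)
    """
    r1, r2, r3 = r
    return r1 * r2 * r3 + sum(ri * (pi - ri) for pi, ri in zip(p, r))

# Example: a 100 x 100 x 100 tensor of Tucker rank (5, 5, 5)
n_meas = cross_sample_complexity((100, 100, 100), (5, 5, 5))
n_full = 100 * 100 * 100
print(n_meas)  # 1550 measurements
print(n_full)  # vs. 1,000,000 entries in the full tensor
```

The example illustrates the regime the paper targets: when the ranks are small relative to the dimensions, the required number of measurements grows roughly linearly in p1 + p2 + p3 rather than as the product p1·p2·p3.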
Pages: 936-964
Page count: 29
Related Papers
50 items
  • [41] Low-Rank Tensor Completion Pansharpening Based on Haze Correction
    Wang, Peng
    Su, Yiyang
    Huang, Bo
    Zhu, Daiyin
    Liu, Wenjian
    Nedzved, Alexander
    Krasnoproshin, Viktor V.
    Leung, Henry
    [J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 20
  • [42] Low-rank tensor completion with sparse regularization in a transformed domain
    Wang, Ping-Ping
    Li, Liang
    Cheng, Guang-Hui
    [J]. NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2021, 28 (06)
  • [44] Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion
    Tong, Tian
    Ma, Cong
    Prater-Bennette, Ashley
    Tripp, Erin
    Chi, Yuejie
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [45] A generalizable framework for low-rank tensor completion with numerical priors
    Yuan, Shiran
    Huang, Kaizhu
    [J]. PATTERN RECOGNITION, 2024, 155
  • [46] On Polynomial Time Methods for Exact Low-Rank Tensor Completion
    Xia, Dong
    Yuan, Ming
    [J]. FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2019, 19 (06) : 1265 - 1313
  • [47] Low-rank tensor completion via smooth matrix factorization
    Zheng, Yu-Bang
    Huang, Ting-Zhu
    Ji, Teng-Yu
    Zhao, Xi-Le
    Jiang, Tai-Xiang
    Ma, Tian-Hui
    [J]. APPLIED MATHEMATICAL MODELLING, 2019, 70 : 677 - 695
  • [48] Riemannian conjugate gradient method for low-rank tensor completion
    Duan, Shan-Qi
    Duan, Xue-Feng
    Li, Chun-Mei
    Li, Jiao-Fen
    [J]. ADVANCES IN COMPUTATIONAL MATHEMATICS, 2023, 49 (03)
  • [49] Low-Rank Tensor Completion with Spatio-Temporal Consistency
    Wang, Hua
    Nie, Feiping
    Huang, Heng
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2014, : 2846 - 2852
  • [50] Mixed norm regularized models for low-rank tensor completion
    Bu, Yuanyang
    Zhao, Yongqiang
    Chan, Jonathan Cheung-Wai
    [J]. INFORMATION SCIENCES, 2024, 670