CROSS: EFFICIENT LOW-RANK TENSOR COMPLETION

Cited by: 39
|
Author
Zhang, Anru [1 ]
Affiliation
[1] Univ Wisconsin, Dept Stat, Madison, WI 53706 USA
Source
ANNALS OF STATISTICS | 2019, Vol. 47, No. 2
Keywords
Cross tensor measurement; denoising; minimax rate-optimal; neuroimaging; tensor completion; MATRIX COMPLETION; OPTIMAL RATES; NORM; RECOVERY; RECONSTRUCTION; DIMENSIONALITY; DECOMPOSITIONS;
DOI
10.1214/18-AOS1694
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
The completion of tensors, or high-order arrays, has attracted significant attention in recent research. The current literature on tensor completion focuses primarily on recovery from a set of uniformly randomly measured entries, and the number of measurements required to achieve recovery is not guaranteed to be optimal. In addition, the implementation of some previous methods is NP-hard. In this article, we propose a framework for low-rank tensor completion via a novel tensor measurement scheme that we name Cross. The proposed procedure is efficient and easy to implement. In particular, we show that a third-order tensor of Tucker rank $(r_1, r_2, r_3)$ in $p_1$-by-$p_2$-by-$p_3$ dimensional space can be recovered from as few as $r_1 r_2 r_3 + r_1(p_1 - r_1) + r_2(p_2 - r_2) + r_3(p_3 - r_3)$ noiseless measurements, which matches the sample complexity lower bound. In the case of noisy measurements, we also develop a theoretical upper bound and a matching minimax lower bound on the recovery error over certain classes of low-rank tensors for the proposed procedure. The results can be further extended to fourth- and higher-order tensors. Simulation studies show that the method performs well under a variety of settings. Finally, the procedure is illustrated through a real dataset in neuroimaging.
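As a quick illustration of the sample complexity quoted in the abstract, the following sketch (not from the paper; the dimensions p = (100, 100, 100) and Tucker ranks r = (5, 5, 5) are hypothetical choices) computes the measurement count $r_1 r_2 r_3 + \sum_k r_k(p_k - r_k)$ and compares it with the $p_1 p_2 p_3$ entries of the full tensor.

```python
# Illustrative sketch only (not the paper's implementation): evaluate the
# noiseless sample-complexity count stated in the abstract for a third-order
# tensor of Tucker rank (r1, r2, r3) in p1-by-p2-by-p3 dimensional space.

def cross_sample_complexity(p, r):
    """Measurement count from the abstract: r1*r2*r3 + sum_k r_k*(p_k - r_k)."""
    p1, p2, p3 = p
    r1, r2, r3 = r
    return r1 * r2 * r3 + r1 * (p1 - r1) + r2 * (p2 - r2) + r3 * (p3 - r3)

if __name__ == "__main__":
    p = (100, 100, 100)   # hypothetical tensor dimensions
    r = (5, 5, 5)         # hypothetical Tucker ranks
    m = cross_sample_complexity(p, r)
    total = p[0] * p[1] * p[2]
    print(f"measurements needed: {m}")        # 1550
    print(f"total entries:       {total}")    # 1000000
    print(f"fraction sampled:    {m / total:.4%}")  # 0.1550%
```

The count equals the number of free parameters of a Tucker rank-$(r_1, r_2, r_3)$ tensor, which is why the abstract describes it as matching the sample complexity lower bound.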
Pages: 936 - 964
Number of pages: 29
Related Papers
50 records in total
  • [1] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    [J]. PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [2] Tensor Factorization for Low-Rank Tensor Completion
    Zhou, Pan
    Lu, Canyi
    Lin, Zhouchen
    Zhang, Chao
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (03) : 1152 - 1163
  • [3] Low-Rank Tensor Completion by Approximating the Tensor Average Rank
    Wang, Zhanliang
    Dong, Junyu
    Liu, Xinguo
    Zeng, Xueying
    [J]. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 4592 - 4600
  • [4] Iterative tensor eigen rank minimization for low-rank tensor completion
    Su, Liyu
    Liu, Jing
    Tian, Xiaoqing
    Huang, Kaiyu
    Tan, Shuncheng
    [J]. INFORMATION SCIENCES, 2022, 616 : 303 - 329
  • [5] Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data
    Ji, Teng-Yu
    Zhao, Xi-Le
    Sun, Dong-Lin
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1162 - 1166
  • [6] Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train
    Bengua, Johann A.
    Phien, Ho N.
    Hoang Duong Tuan
    Do, Minh N.
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26 (05) : 2466 - 2479
  • [7] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    [J]. BIT NUMERICAL MATHEMATICS, 2014, 54 (02) : 447 - 468
  • [8] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [9] A dual framework for low-rank tensor completion
    Nimishakavi, Madhav
    Jawanpuria, Pratik
    Mishra, Bamdev
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31