Nonconvex Low-Rank Tensor Completion from Noisy Data

Cited by: 25
Authors
Cai, Changxiao [1 ]
Li, Gen [2 ]
Poor, H. Vincent [1 ]
Chen, Yuxin [1 ]
Affiliations
[1] Princeton Univ, Dept Elect Engn, Princeton, NJ 08540 USA
[2] Tsinghua Univ, Dept Elect Engn, Beijing 100084, Peoples R China
Funding
U.S. National Science Foundation
Keywords
tensor completion; nonconvex optimization; gradient descent; spectral methods; entrywise statistical guarantees; minimaxity; MATRIX COMPLETION; BIG DATA; FACTORIZATION; OPTIMIZATION; RECOVERY; MODELS; DECOMPOSITIONS; ALGORITHMS; DESCENT; SPARSE;
DOI
10.1287/opre.2021.2106
Chinese Library Classification
C93 [Management Science]
Discipline codes
12; 1201; 1202; 120202
Abstract
We study a noisy tensor completion problem of broad practical interest, namely, the reconstruction of a low-rank tensor from highly incomplete and randomly corrupted observations of its entries. Although a variety of prior work has been dedicated to this problem, existing algorithms are either computationally too expensive for large-scale applications or equipped with suboptimal statistical guarantees. Focusing on "incoherent" and well-conditioned tensors of constant canonical polyadic (CP) rank, we propose a two-stage nonconvex algorithm, (vanilla) gradient descent following a rough initialization, that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves all individual tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal estimation accuracy). The estimation errors are evenly spread out across all entries, thus achieving optimal ℓ∞ statistical accuracy. We also discuss how to extend our approach to accommodate asymmetric tensors. The insight conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
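The two-stage recipe described in the abstract, a rough spectral-style initialization followed by vanilla gradient descent on a nonconvex least-squares objective over the observed entries, can be sketched in a few lines. The snippet below is a minimal illustration only, assuming a symmetric order-3 tensor with Bernoulli-sampled noisy entries; the function names, the particular unfolding-based initialization, and the step size are illustrative choices rather than the paper's exact procedure or guarantees.

# Minimal sketch (not the paper's exact procedure): two-stage nonconvex
# completion of a symmetric order-3 tensor T* = sum_k u_k (x) u_k (x) u_k
# from noisy entries observed on a Bernoulli(p)-sampled set Omega.
# Stage 1: rough initialization from an unfolding of the rescaled data.
# Stage 2: vanilla gradient descent on a nonconvex least-squares loss.
import numpy as np


def spectral_init(T_obs, p, r):
    """Rough initialization: eigenvectors of a Gram matrix of the mode-1
    unfolding estimate the factor directions; a trilinear contraction
    estimates each signed amplitude."""
    d = T_obs.shape[0]
    T_scaled = T_obs / p                        # unbiased (in expectation) surrogate of T*
    M = T_scaled.reshape(d, d * d)              # mode-1 unfolding
    G = M @ M.T                                 # roughly sum_k lambda_k^2 u_k u_k^T
    eigvals, eigvecs = np.linalg.eigh(G)
    top = np.argsort(eigvals)[::-1][:r]
    U0 = np.empty((d, r))
    for j, k in enumerate(top):
        u = eigvecs[:, k]
        lam = np.einsum('ijl,i,j,l->', T_scaled, u, u, u)   # signed amplitude estimate
        U0[:, j] = np.sign(lam) * np.abs(lam) ** (1.0 / 3.0) * u
    return U0


def gradient(U, T_obs, mask, p):
    """Gradient of f(U) = (1/(2p)) * || P_Omega(sum_k u_k (x) u_k (x) u_k - T) ||_F^2."""
    T_hat = np.einsum('ik,jk,lk->ijl', U, U, U)
    R = mask * (T_hat - T_obs)                  # residual on observed entries only
    g = (np.einsum('ajl,jk,lk->ak', R, U, U)
         + np.einsum('ial,ik,lk->ak', R, U, U)
         + np.einsum('ija,ik,jk->ak', R, U, U))
    return g / p


def complete(T_obs, mask, r, p, eta=0.02, iters=400):
    """Two-stage procedure: rough initialization, then vanilla gradient descent."""
    U = spectral_init(T_obs, p, r)
    for _ in range(iters):
        U = U - eta * gradient(U, T_obs, mask, p)
    return np.einsum('ik,jk,lk->ijl', U, U, U), U


if __name__ == "__main__":
    # Toy instance: d = 30, CP rank 2, sampling rate p = 0.3, small Gaussian noise.
    rng = np.random.default_rng(0)
    d, r, p = 30, 2, 0.3
    U_star = rng.standard_normal((d, r)) / np.sqrt(d)
    T_star = np.einsum('ik,jk,lk->ijl', U_star, U_star, U_star)
    mask = (rng.random((d, d, d)) < p).astype(float)
    T_obs = mask * (T_star + 1e-3 * rng.standard_normal((d, d, d)))
    T_hat, U_hat = complete(T_obs, mask, r, p)
    err = np.linalg.norm(T_hat - T_star) / np.linalg.norm(T_star)
    print(f"relative Frobenius error: {err:.3e}")

In this toy run the printed relative error reflects both the noise level and the finite iteration budget; the step size and iteration count are heuristic and would need tuning for other problem sizes.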
Pages: 1219-1237
Page count: 20
Related papers (50 records)
  • [1] Nonconvex Low-Rank Symmetric Tensor Completion from Noisy Data
    Cai, Changxiao
    Li, Gen
    Poor, H. Vincent
    Chen, Yuxin
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [2] Low-Rank tensor completion based on nonconvex regularization
    Su, Xinhua
    Ge, Huanmin
    Liu, Zeting
    Shen, Yanfei
    [J]. SIGNAL PROCESSING, 2023, 212
  • [4] Noisy Tensor Completion via Low-Rank Tensor Ring
    Qiu, Yuning
    Zhou, Guoxu
    Zhao, Qibin
    Xie, Shengli
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (01) : 1127 - 1141
  • [5] A nonconvex low-rank tensor completion model for spatiotemporal traffic data imputation
    Chen, Xinyu
    Yang, Jinming
    Sun, Lijun
    [J]. TRANSPORTATION RESEARCH PART C-EMERGING TECHNOLOGIES, 2020, 117
  • [7] Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion
    Tong, Tian
    Ma, Cong
    Prater-Bennette, Ashley
    Tripp, Erin
    Chi, Yuejie
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [8] Laplace function based nonconvex surrogate for low-rank tensor completion
    Xu, Wen-Hao
    Zhao, Xi-Le
    Ji, Teng-Yu
    Miao, Jia-Qing
    Ma, Tian-Hui
    Wang, Si
    Huang, Ting-Zhu
    [J]. SIGNAL PROCESSING-IMAGE COMMUNICATION, 2019, 73 : 62 - 69
  • [9] Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data
    Ji, Teng-Yu
    Zhao, Xi-Le
    Sun, Dong-Lin
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1162 - 1166
  • [10] Enhanced Nonconvex Low-Rank Approximation of Tensor Multi-Modes for Tensor Completion
    Zeng, Haijin
    Chen, Yongyong
    Xie, Xiaozhen
    Ning, Jifeng
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2021, 7 : 164 - 177