PROVABLE MODELS FOR ROBUST LOW-RANK TENSOR COMPLETION

Cited by: 0
Authors
Huang, Bo [1 ]
Mu, Cun [1 ]
Goldfarb, Donald [1 ]
Wright, John [2 ]
Affiliations
[1] Columbia Univ, Dept Ind Engn & Operat Res, New York, NY 10027 USA
[2] Columbia Univ, Dept Elect Engn, New York, NY 10027 USA
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2015, Vol. 11, No. 2
Keywords
robust low-rank tensor completion; tensor robust principal component analysis; Tucker decomposition; strongly convex programming; incoherence conditions; sum of nuclear norms minimization; MATRIX COMPLETION; DECOMPOSITIONS;
DOI
Not available
Chinese Library Classification
C93 (Management); O22 (Operations Research)
Discipline Codes
070105; 12; 1201; 1202; 120202
Abstract
In this paper, we rigorously study tractable models for provably recovering low-rank tensors. Unlike their matrix-based predecessors, current convex approaches for recovering low-rank tensors from incomplete (tensor completion) and/or grossly corrupted (tensor robust principal component analysis) observations still lack theoretical guarantees, although they have been used in various recent applications and have exhibited promising empirical performance. In this work, we attempt to fill this gap. Specifically, we propose a class of convex recovery models (including strongly convex programs) that provably guarantee exact recovery under a set of new tensor incoherence conditions requiring only the existence of one low-rank mode, and we characterize the problems on which our models tend to perform well.
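The "sum of nuclear norms minimization" named in the keywords can be illustrated with a short sketch. The ADMM scheme below (in the style of HaLRTC-type solvers for this convex model) completes a partially observed tensor by minimizing the sum of nuclear norms of its mode unfoldings subject to agreement with the observed entries; all function names and parameters (`snn_complete`, `rho`, `n_iter`) are illustrative assumptions, not from the paper itself.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of `unfold` for a tensor of the given shape."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def svt(M, tau):
    """Singular value thresholding: the prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def snn_complete(T_obs, mask, rho=1.0, n_iter=200):
    """ADMM sketch for: min sum_k ||X_(k)||_*  s.t.  X[mask] == T_obs[mask]."""
    shape, K = T_obs.shape, T_obs.ndim
    X = T_obs * mask
    Y = [np.zeros(shape) for _ in range(K)]  # scaled dual variables
    for _ in range(n_iter):
        # M-step: singular value thresholding on each mode unfolding.
        M = [fold(svt(unfold(X + Y[k], k), 1.0 / rho), k, shape)
             for k in range(K)]
        # X-step: average the mode estimates, re-impose observed entries.
        X = sum(M[k] - Y[k] for k in range(K)) / K
        X[mask] = T_obs[mask]
        # Dual update.
        for k in range(K):
            Y[k] = Y[k] + X - M[k]
    return X
```

A typical use: generate a rank-1 tensor, hide a fraction of its entries via a Boolean `mask`, and call `snn_complete(T * mask, mask)`; for a well-sampled low-rank tensor the recovered entries closely match the ground truth, consistent with the exact-recovery behavior the paper analyzes.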
Pages: 339-364 (26 pages)
Related Papers (50 total)
  • [1] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    [J]. PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [2] Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion
    Tong, Tian
    Ma, Cong
    Prater-Bennette, Ashley
    Tripp, Erin
    Chi, Yuejie
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [3] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [4] Robust approximations of low-rank minimization for tensor completion
    Gao, Shangqi
    Zhuang, Xiahai
    [J]. NEUROCOMPUTING, 2020, 379 : 319 - 333
  • [5] Robust Low-Rank Tensor Completion Based on Tensor Ring Rank via ℓp,ε-Norm
    Li, Xiao Peng
    So, Hing Cheung
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 3685 - 3698
  • [6] Tensor Factorization for Low-Rank Tensor Completion
    Zhou, Pan
    Lu, Canyi
    Lin, Zhouchen
    Zhang, Chao
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (03) : 1152 - 1163
  • [7] Robust to Rank Selection: Low-Rank Sparse Tensor-Ring Completion
    Yu, Jinshi
    Zhou, Guoxu
    Sun, Weijun
    Xie, Shengli
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (05) : 2451 - 2465
  • [8] Low-Rank Tensor Completion by Approximating the Tensor Average Rank
    Wang, Zhanliang
    Dong, Junyu
    Liu, Xinguo
    Zeng, Xueying
    [J]. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 4592 - 4600
  • [9] Mixed norm regularized models for low-rank tensor completion
    Bu, Yuanyang
    Zhao, Yongqiang
    Chan, Jonathan Cheung-Wai
    [J]. INFORMATION SCIENCES, 2024, 670
  • [10] ROBUST LOW-RANK TENSOR RECOVERY: MODELS AND ALGORITHMS
    Goldfarb, Donald
    Qin, Zhiwei
    [J]. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2014, 35 (01) : 225 - 253