Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization

Citations: 9

Authors
Cai, Jian-Feng [1 ]
Li, Jingyang [1 ]
Xia, Dong [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Math, Kowloon, Hong Kong, Peoples R China
Keywords
Binary tensor; Generalized linear model; Heavy-tailed noise; Tensor robust PCA; Tucker decomposition; COMMUNITY DETECTION; REGRESSION; DECOMPOSITIONS; FACTORIZATION; COMPLETION;
DOI
10.1080/01621459.2022.2063131
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
We investigate a generalized framework to estimate a latent low-rank plus sparse tensor, where the low-rank tensor often captures the multi-way principal components and the sparse tensor accounts for potential model mis-specifications or heterogeneous signals that are unexplainable by the low-rank part. The framework flexibly covers both linear and generalized linear models, and can easily handle continuous or categorical variables. We propose a fast algorithm by integrating Riemannian gradient descent with a novel gradient pruning procedure. Under suitable conditions, the algorithm converges linearly and can simultaneously estimate both the low-rank and sparse tensors. Statistical error bounds for the final estimates are established in terms of the gradient of the loss function. The error bounds are generally sharp under specific statistical models, for example, sub-Gaussian robust PCA and the Bernoulli tensor model. Moreover, our method achieves nontrivial error bounds for heavy-tailed tensor PCA whenever the noise has a finite 2 + epsilon moment. We apply our method to analyze the international trade flow dataset and the statistician hypergraph coauthorship network, both yielding new and interesting findings. Supplementary materials for this article are available online.
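The abstract describes an algorithm that alternates a Riemannian gradient step on the Tucker low-rank part with a gradient pruning step for the sparse part. The paper's actual procedure is more involved; the following is a minimal NumPy sketch of the underlying low-rank-plus-sparse alternation for the squared loss, using truncated HOSVD as a stand-in for the Riemannian update and plain hard thresholding as a stand-in for gradient pruning. All function names are illustrative, not from the paper.

```python
import numpy as np

def mode_product(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    Tm = np.moveaxis(T, mode, 0)
    out = (M @ Tm.reshape(Tm.shape[0], -1)).reshape((M.shape[0],) + Tm.shape[1:])
    return np.moveaxis(out, 0, mode)

def tucker_project(T, ranks):
    """Truncated HOSVD: quasi-optimal projection onto multilinear rank <= ranks."""
    factors = []
    for m, r in enumerate(ranks):
        Tm = np.moveaxis(T, m, 0).reshape(T.shape[m], -1)
        U, _, _ = np.linalg.svd(Tm, full_matrices=False)
        factors.append(U[:, :r])
    out = T
    for m, U in enumerate(factors):
        out = mode_product(out, U @ U.T, m)  # project mode-m fibers onto span(U)
    return out

def keep_top_s(T, s):
    """Keep the s largest-magnitude entries of T, zeroing the rest
    (a crude surrogate for the paper's gradient pruning; ties may keep extras)."""
    flat = np.abs(T).ravel()
    if s >= flat.size:
        return T.copy()
    thr = np.partition(flat, -s)[-s]
    return np.where(np.abs(T) >= thr, T, 0.0)

def low_rank_plus_sparse(Y, ranks, s, n_iter=30):
    """Estimate Y ~ L + S by alternating sparse thresholding and low-rank projection."""
    L = np.zeros_like(Y)
    for _ in range(n_iter):
        S = keep_top_s(Y - L, s)          # sparse step on the residual
        L = tucker_project(Y - S, ranks)  # low-rank step on the de-spiked data
    return L, S
```

On a noiseless synthetic tensor with a few large outliers, this alternation typically separates the two components; the paper's Riemannian version replaces the full HOSVD with cheaper tangent-space updates and comes with the linear convergence guarantees stated above.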
Pages: 2588 - 2604 (17 pages)
Related papers
50 records in total
  • [1] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    BIT Numerical Mathematics, 2014, 54 (02): 447 - 468
  • [2] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    Proceedings of the 33rd Chinese Control and Decision Conference (CCDC 2021), 2021: 7138 - 7143
  • [3] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 Chinese Automation Congress (CAC 2019), 2019: 3380 - 3384
  • [4] Sparse and Low-Rank Tensor Estimation via Cubic Sketchings
    Hao, Botao
    Zhang, Anru
    Cheng, Guang
    International Conference on Artificial Intelligence and Statistics (AISTATS), 2020, 108: 1319 - 1329
  • [5] Sparse and Low-Rank Tensor Estimation via Cubic Sketchings
    Hao, Botao
    Zhang, Anru
    Cheng, Guang
    IEEE Transactions on Information Theory, 2020, 66 (09): 5927 - 5964
  • [6] Sparse and Low-Rank Tensor Decomposition
    Shah, Parikshit
    Rao, Nikhil
    Tang, Gongguo
    Advances in Neural Information Processing Systems 28 (NIPS 2015), 2015
  • [7] Preconditioned Low-Rank Riemannian Optimization for Linear Systems with Tensor Product Structure
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    SIAM Journal on Scientific Computing, 2016, 38 (04): A2018 - A2044
  • [8] Fast randomized tensor singular value thresholding for low-rank tensor optimization
    Che, Maolin
    Wang, Xuezhong
    Wei, Yimin
    Zhao, Xile
    Numerical Linear Algebra with Applications, 2022, 29 (06)
  • [9] "Sparse plus Low-Rank" tensor completion approach for recovering images and videos
    Pan, Chenjian
    Ling, Chen
    He, Hongjin
    Qi, Liqun
    Xu, Yanwei
    Signal Processing: Image Communication, 2024, 127