Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization

Cited by: 9
Authors
Cai, Jian-Feng [1 ]
Li, Jingyang [1 ]
Xia, Dong [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Math, Kowloon, Hong Kong, Peoples R China
Keywords
Binary tensor; Generalized linear model; Heavy-tailed noise; Tensor robust PCA; Tucker decomposition; COMMUNITY DETECTION; REGRESSION; DECOMPOSITIONS; FACTORIZATION; COMPLETION;
DOI
10.1080/01621459.2022.2063131
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208 ; 070103 ; 0714 ;
Abstract
We investigate a generalized framework to estimate a latent low-rank plus sparse tensor, where the low-rank tensor often captures the multi-way principal components and the sparse tensor accounts for potential model misspecifications or heterogeneous signals that are unexplainable by the low-rank part. The framework flexibly covers both linear and generalized linear models, and can easily handle continuous or categorical variables. We propose a fast algorithm by integrating Riemannian gradient descent with a novel gradient pruning procedure. Under suitable conditions, the algorithm converges linearly and can simultaneously estimate both the low-rank and sparse tensors. The statistical error bounds of the final estimates are established in terms of the gradient of the loss function. The error bounds are generally sharp under specific statistical models, for example, the sub-Gaussian robust PCA and Bernoulli tensor models. Moreover, our method achieves nontrivial error bounds for heavy-tailed tensor PCA whenever the noise has a finite (2 + epsilon)-th moment. We apply our method to analyze the international trade flow dataset and the statistician hypergraph coauthorship network, both yielding new and interesting findings. Supplementary materials for this article are available online.
Pages: 2588-2604 (17 pages)