Robust principal component analysis based on tensor train rank and Schatten p-norm

Cited: 2
Authors
Zhang, Pengcheng [1 ]
Geng, Juan [1 ]
Liu, Yapeng [1 ]
Yang, Shouxin [1 ]
Affiliations
[1] Hebei Univ Econ & Business, Sch Math & Stat, Shijiazhuang 050000, Hebei, Peoples R China
Source
VISUAL COMPUTER | 2023, Vol. 39, Issue 11
Keywords
Tensor robust principal component analysis; Tensor train rank; Schatten p-norm; High-dimensional data; NUCLEAR NORM; MATRIX; SPARSE; COMPLETION; IMAGE; REGULARIZATION;
DOI
10.1007/s00371-022-02699-5
CLC Classification
TP31 [Computer Software];
Subject Classification
081202; 0835;
Abstract
Given data, robust principal component analysis (RPCA) aims to exactly recover its low-rank and sparse components. To date, a number of tensor nuclear norms have been defined as convex relaxations of the tensor rank and used to approximate it because of their convexity. However, these relaxations may cause the solution to deviate seriously from the original solution in real-world data recovery. In this paper, we define a tensor Schatten p-norm based on the tensor train (TT) rank and propose a new model for tensor robust principal component analysis (named TTSp). We solve the proposed model iteratively using the ADMM algorithm. In addition, a tensor augmentation tool called ket augmentation is introduced to convert lower-order tensors into higher-order ones so as to exploit the low-TT-rank structure. Numerical experiments on image recovery problems report higher PSNR and SSIM values, demonstrating the superiority of our method, and further experiments on real data also illustrate its effectiveness.
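For a matrix, the Schatten p-norm in the abstract reduces to the p-th root of the sum of the p-th powers of the singular values; for 0 < p < 1 it is a nonconvex surrogate of the rank, tighter than the nuclear norm (the p = 1 case). A minimal NumPy sketch of this matrix-level quantity (illustrative only; the function name is ours, and this is not the paper's tensor-train formulation):

```python
import numpy as np

def schatten_p_norm(X, p=0.5):
    """Schatten p-norm of a matrix: (sum_i sigma_i^p)^(1/p),
    where sigma_i are the singular values of X.
    p = 1 gives the nuclear norm; 0 < p < 1 gives a nonconvex
    rank surrogate tighter than the nuclear norm."""
    sigma = np.linalg.svd(X, compute_uv=False)
    return float(np.sum(sigma ** p) ** (1.0 / p))

# A rank-1 matrix has one nonzero singular value, so its Schatten
# p-norm is (approximately) that value for any p > 0.
X = np.outer([1.0, 2.0], [3.0, 4.0])  # rank 1, singular value 5*sqrt(5)
```

The TTSp model applies this idea to the unfoldings that define the tensor train rank, so that small p penalizes large singular values less aggressively than the nuclear norm does.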
Pages: 5849-5867
Page count: 19
Related Papers
(50 total)
  • [1] Robust principal component analysis based on tensor train rank and Schatten p-norm
    Pengcheng Zhang
    Juan Geng
    Yapeng Liu
    Shouxin Yang
    [J]. The Visual Computer, 2023, 39 : 5849 - 5867
  • [2] Schatten p-norm based principal component analysis
    Chang, Heyou
    Luo, Lei
    Yang, Jian
    Yang, Meng
    [J]. NEUROCOMPUTING, 2016, 207 : 754 - 762
  • [3] Low-rank tensor train for tensor robust principal component analysis
    Yang, Jing-Hua
    Zhao, Xi-Le
    Ji, Teng-Yu
    Ma, Tian-Hui
    Huang, Ting-Zhu
    [J]. APPLIED MATHEMATICS AND COMPUTATION, 2020, 367
  • [4] Two-dimensional principal component analysis based on Schatten p-norm for image feature extraction
    Du, Haishun
    Hu, Qingpu
    Jiang, Manman
    Zhang, Fan
    [J]. JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2015, 32 : 55 - 62
  • [5] Robust Low-Rank Kernel Subspace Clustering based on the Schatten p-norm and Correntropy
    Zhang, Xiaoqian
    Chen, Beijia
    Sun, Huaijiang
    Liu, Zhigui
    Ren, Zhenwen
    Li, Yanmeng
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, 32 (12) : 2426 - 2437
  • [6] Robust Regression with Nonconvex Schatten p-Norm Minimization
    Zeng, Deyu
    Yin, Ming
    Xie, Shengli
    Wu, Zongze
    [J]. NEURAL INFORMATION PROCESSING (ICONIP 2018), PT II, 2018, 11302 : 498 - 508
  • [7] THE SCHATTEN p-NORM ON Rn
    Huang, Chien-Hao
    Chen, Jein-Shan
    Hu, Chu-Chin
    [J]. JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2020, 21 (01) : 21 - 29
  • [8] Tensor Robust Principal Component Analysis with a New Tensor Nuclear Norm
    Lu, Canyi
    Feng, Jiashi
    Chen, Yudong
    Liu, Wei
    Lin, Zhouchen
    Yan, Shuicheng
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2020, 42 (04) : 925 - 938
  • [9] Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis
    Sofuoglu, Seyyid Emre
    Aviyente, Selin
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1152 - 1156
  • [10] Exact minimum rank approximation via Schatten p-norm minimization
    Liu, Lu
    Huang, Wei
    Chen, Di-Rong
    [J]. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2014, 267 : 218 - 227