Low-rank tensor completion based on nonconvex regularization

Cited by: 5
Authors
Su, Xinhua [1 ]
Ge, Huanmin [1 ]
Liu, Zeting [1 ]
Shen, Yanfei [1 ]
Affiliations
[1] Beijing Sport Univ, Sports Engn Coll, Beijing 100084, Peoples R China
Keywords
Tensor completion; Nonconvex tensor nuclear norm; Tensor singular value decomposition; Low-rank; NUCLEAR NORM; MATRIX; DECOMPOSITIONS; APPROXIMATION; FACTORIZATION; RECOVERY; SPARSE;
DOI
10.1016/j.sigpro.2023.109157
CLC classification
TM [Electrical Technology]; TN [Electronic Technology; Communication Technology];
Subject classification
0808; 0809;
Abstract
In this paper, we consider low-rank tensor completion, which aims to exactly recover incomplete high-dimensional visual data. Existing studies widely utilize tensor nuclear norm minimization (TNNM), a convex relaxation of tensor-rank minimization (TRM), to solve tensor completion tasks. Nevertheless, TNNM ignores the differences among the tensor singular values induced by the tensor singular value decomposition (t-SVD), so the obtained solution may be suboptimal. In this paper, we propose a nonconvex minimization approach that solves the tensor completion problem more effectively by adopting a nonconvex regularization to better approximate the tensor rank. Moreover, the alternating direction method of multipliers (ADMM) and the iteratively reweighted nuclear norm (IRNN) method are adopted to solve the constructed nonconvex models efficiently, and their convergence can also be guaranteed. Finally, we show that the proposed nonconvex optimization methods are suitable for solving other TRM problems induced by any invertible linear transform, such as subspace clustering based on low-rank representation. Extensive experiments on real images and videos validate the superiority of our approach over state-of-the-art algorithms. © 2023 Elsevier B.V. All rights reserved.
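The core operation behind this family of methods can be illustrated with a minimal sketch: a t-SVD-based weighted singular-value shrinkage step (slice-wise SVD in the Fourier domain), alternated with projection onto the observed entries. This is an illustrative assumption about the general IRNN-style scheme, not the paper's actual algorithm; the function names, the `1/(sigma + eps)` reweighting (which penalizes small singular values more, mimicking a nonconvex surrogate), and the parameter values are all hypothetical.

```python
import numpy as np

def t_svd_weighted_threshold(X, tau, eps=1e-3):
    """One weighted singular-value thresholding step on a 3-way tensor.

    Computes the FFT along the third mode, applies an IRNN-style weighted
    soft-threshold to the singular values of each frontal slice, and
    transforms back. Weights w_i = 1 / (s_i + eps) shrink small singular
    values more aggressively than the convex tensor nuclear norm would.
    (Illustrative sketch; not the paper's exact update.)
    """
    n1, n2, n3 = X.shape
    Xf = np.fft.fft(X, axis=2)            # tubes -> Fourier domain
    Yf = np.zeros_like(Xf)                # complex, same shape
    for k in range(n3):                   # SVD of each frontal slice
        U, s, Vt = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        w = 1.0 / (s + eps)               # larger weight for smaller s_i
        s_shr = np.maximum(s - tau * w, 0.0)
        Yf[:, :, k] = (U * s_shr) @ Vt    # rebuild slice with shrunk spectrum
    return np.real(np.fft.ifft(Yf, axis=2))

def complete(M, mask, tau=0.1, iters=50):
    """Naive completion loop: shrink, then re-impose observed entries."""
    X = np.where(mask, M, 0.0)
    for _ in range(iters):
        X = t_svd_weighted_threshold(X, tau)
        X[mask] = M[mask]                 # data-consistency projection
    return X
```

Because each shrunken singular value satisfies `s_shr <= s` and the FFT is unitary up to scaling, one thresholding step never increases the Frobenius norm, which is the intuition behind using it as a proximal-style update inside ADMM.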
Pages: 13
Related papers
50 records in total
  • [1] A Nonconvex Relaxation Approach to Low-Rank Tensor Completion
    Zhang, Xiongjun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (06) : 1659 - 1671
  • [2] Laplace function based nonconvex surrogate for low-rank tensor completion
    Xu, Wen-Hao
    Zhao, Xi-Le
    Ji, Teng-Yu
    Miao, Jia-Qing
    Ma, Tian-Hui
    Wang, Si
    Huang, Ting-Zhu
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2019, 73 : 62 - 69
  • [3] Tensor Completion via Nonlocal Low-Rank Regularization
    Xie, Ting
    Li, Shutao
    Fang, Leyuan
    Liu, Licheng
    IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (06) : 2344 - 2354
  • [4] Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion
    Tong, Tian
    Ma, Cong
    Prater-Bennette, Ashley
    Tripp, Erin
    Chi, Yuejie
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [5] Nonconvex Low-Rank Tensor Completion from Noisy Data
    Cai, Changxiao
    Li, Gen
    Poor, H. Vincent
    Chen, Yuxin
    OPERATIONS RESEARCH, 2022, 70 (02) : 1219 - 1237
  • [6] Low-Rank Tensor Completion by Truncated Nuclear Norm Regularization
    Xue, Shengke
    Qiu, Wenyuan
    Liu, Fan
    Jin, Xinyu
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 2600 - 2605
  • [7] Low-rank tensor completion with sparse regularization in a transformed domain
    Wang, Ping-Ping
    Li, Liang
    Cheng, Guang-Hui
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2021, 28 (06)
  • [8] Low-rank Tensor Learning with Nonconvex Overlapped Nuclear Norm Regularization
    Yao, Quanming
    Wang, Yaqing
    Han, Bo
    Kwok, James T.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23