Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion

Cited by: 0
Authors
Shi, Yuqing [1]
Du, Shiqiang [2]
Wang, Weilan [3]
Affiliations
[1] Northwest Minzu Univ, Coll Elect Engn, Lanzhou 730030, Peoples R China
[2] Northwest Minzu Univ, Coll Math & Comp Sci, Lanzhou 730030, Peoples R China
[3] Northwest Minzu Univ, Chinese Natl Informat Technol Res Inst, Lanzhou 730030, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Image Inpainting; Tensor Completion; Tensor Decomposition; Low-Rank Tensor; Tensor Nuclear Norm; Matrix Factorization
DOI
10.1109/CCDC52312.2021.9601608
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Low-rank tensor completion (LRTC) is an active research direction in computer vision and machine learning because it can effectively recover the missing entries of a tensor. However, most existing LRTC methods not only require repeated computation of time-consuming singular value decompositions (SVDs), but also model only a single noise distribution. To overcome these shortcomings, we propose a new LRTC method based on the tensor-tensor product (t-product): the robust low-rank and sparse tensor decomposition model (RLRST) for tensor completion. First, to estimate the unknown entries of the tensor more accurately, RLRST accounts for two kinds of noise simultaneously: sparse noise and Gaussian noise. Second, the low-rank recovery tensor is equivalently decomposed into the t-product of two smaller tensors, which substantially reduces the algorithm's running time. Then, based on the alternating direction method of multipliers (ADMM), an efficient iterative update algorithm is derived for the RLRST optimization problem. Finally, numerical experiments on image inpainting tasks demonstrate the effectiveness of our method compared with related state-of-the-art tensor completion methods.
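The abstract's two building blocks, the t-product and the factorization of the recovery tensor into two smaller t-product factors with an explicit sparse-noise term, can be illustrated with a short NumPy sketch. Everything below is an assumption for illustration, not the authors' implementation: the function names, the tubal rank r, the sparsity weight lam, the step size lr, and the plain gradient steps used in place of the paper's ADMM updates.

```python
import numpy as np

def t_product(A, B):
    # t-product of A (n1 x n2 x n3) with B (n2 x n4 x n3) -> (n1 x n4 x n3).
    # An FFT along the third mode turns the block-circulant structure into
    # independent frontal-slice matrix products.
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.real(np.fft.ifft(Cf, axis=2))

def t_transpose(A):
    # Tensor transpose: transpose each frontal slice, then reverse slices 2..n3.
    At = np.transpose(A, (1, 0, 2))
    return np.concatenate([At[:, :, :1], At[:, :, :0:-1]], axis=2)

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||.||_1,
    # which isolates the sparse-noise component.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rlrst_sketch(M, mask, r, lam=0.1, lr=1e-2, n_iter=200):
    # Illustrative completion loop: fit M ~ U * V + S on the observed entries
    # (mask == 1). U (n1 x r x n3) and V (r x n2 x n3) are the two smaller
    # t-product factors of the low-tubal-rank recovery tensor; S absorbs
    # sparse noise, while the squared loss handles Gaussian noise.
    # Plain gradient/prox steps stand in for the paper's ADMM updates.
    n1, n2, n3 = M.shape
    rng = np.random.default_rng(0)
    U = 0.1 * rng.standard_normal((n1, r, n3))
    V = 0.1 * rng.standard_normal((r, n2, n3))
    S = np.zeros_like(M)
    for _ in range(n_iter):
        R = mask * (t_product(U, V) + S - M)       # residual on observed entries
        U = U - lr * t_product(R, t_transpose(V))  # gradient step in U
        V = V - lr * t_product(t_transpose(U), R)  # gradient step in V
        S = soft_threshold(mask * (M - t_product(U, V)), lam)  # sparse-noise prox
    return t_product(U, V), S
```

As a usage example, completing a 50 x 50 x 3 image tensor with 40% of its entries observed would amount to calling rlrst_sketch(M, mask, r=10); tuning lam trades off how much of the residual is attributed to sparse corruption versus Gaussian noise.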
Pages: 7138-7143
Page count: 6
Related Papers
50 records in total
  • [21] A dual framework for low-rank tensor completion
    Nimishakavi, Madhav
    Jawanpuria, Pratik
    Mishra, Bamdev
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [22] On the equivalence between low-rank matrix completion and tensor rank
    Derksen, Harm
LINEAR & MULTILINEAR ALGEBRA, 2018, 66 (4): 645-667
  • [23] A Weighted Tensor Factorization Method for Low-rank Tensor Completion
    Cheng, Miaomiao
    Jing, Liping
    Ng, Michael K.
2019 IEEE FIFTH INTERNATIONAL CONFERENCE ON MULTIMEDIA BIG DATA (BIGMM 2019), 2019: 30-38
  • [24] Noisy Tensor Completion via Low-Rank Tensor Ring
    Qiu, Yuning
    Zhou, Guoxu
    Zhao, Qibin
    Xie, Shengli
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (1): 1127-1141
  • [25] Boosted Sparse and Low-Rank Tensor Regression
    He, Lifang
    Chen, Kun
    Xu, Wanwan
    Zhou, Jiayu
    Wang, Fei
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [26] Tensor low-rank sparse representation for tensor subspace learning
    Du, Shiqiang
    Shi, Yuqing
    Shan, Guangrong
    Wang, Weilan
    Ma, Yide
NEUROCOMPUTING, 2021, 440: 351-364
  • [27] Statistical mechanics of low-rank tensor decomposition
    Kadmon, Jonathan
    Ganguli, Surya
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [28] Statistical mechanics of low-rank tensor decomposition
    Kadmon, Jonathan
    Ganguli, Surya
JOURNAL OF STATISTICAL MECHANICS: THEORY AND EXPERIMENT, 2019, 2019 (12)
  • [29] Higher-dimension Tensor Completion via Low-rank Tensor Ring Decomposition
    Yuan, Longhao
    Cao, Jianting
    Zhao, Xuyang
    Wu, Qiang
    Zhao, Qibin
2018 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2018: 1071-1076
  • [30] Online Robust Low-Rank Tensor Learning
    Li, Ping
    Feng, Jiashi
    Jin, Xiaojie
    Zhang, Luming
    Xu, Xianghua
    Yan, Shuicheng
PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2180-2186