Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion

Citations: 0
Authors
Shi, Yuqing [1 ]
Du, Shiqiang [2 ]
Wang, Weilan [3 ]
Affiliations
[1] Northwest Minzu Univ, Coll Elect Engn, Lanzhou 730030, Peoples R China
[2] Northwest Minzu Univ, Coll Math & Comp Sci, Lanzhou 730030, Peoples R China
[3] Northwest Minzu Univ, Chinese Natl Informat Technol Res Inst, Lanzhou 730030, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image Inpainting; Tensor Completion; Tensor Decomposition; Low-Rank Tensor; Tensor Nuclear Norm; Matrix Factorization;
DOI
10.1109/CCDC52312.2021.9601608
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Low-rank tensor completion (LRTC) is an active research direction in computer vision and machine learning because it can effectively recover the missing entries of a tensor. However, most existing LRTC methods not only require repeated computation of the time-consuming singular value decomposition (SVD), but also account for only a single noise distribution in the model. To overcome these shortcomings, we propose a new LRTC method based on the tensor-tensor product (t-product): the robust low-rank and sparse tensor decomposition model (RLRST) for tensor completion. First, to estimate the unknown entries of the tensor data more accurately, RLRST models two kinds of noise simultaneously: sparse noise and Gaussian noise. Second, the low-rank recovery tensor is equivalently decomposed into the t-product of two smaller tensors, which effectively reduces the running time of the algorithm. Then, an efficient iterative update algorithm based on the alternating direction method of multipliers (ADMM) is presented for the RLRST optimization. Finally, numerical experiments on image inpainting tasks demonstrate the effectiveness of our method compared with other related state-of-the-art tensor completion methods.
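The key ingredient of the abstract, the t-product and the factorization of the low-rank tensor into two smaller tensor factors, can be sketched as follows. This is a minimal NumPy illustration under our own assumptions (function name `t_product`, the example shapes, and the target tubal rank `r` are not taken from the paper's code); the t-product itself is the standard definition computed slice-wise in the Fourier domain.

```python
import numpy as np

def t_product(A, B):
    """t-product of A (n1 x n2 x n3) and B (n2 x n4 x n3):
    FFT along the third mode, frontal-slice matrix products,
    then inverse FFT back to the original domain."""
    n1, n2, n3 = A.shape
    assert B.shape[0] == n2 and B.shape[2] == n3
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    # For each frequency slice k: Cf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.fft.ifft(Cf, axis=2).real

# Factorization idea from the abstract: represent the low-rank recovery
# tensor L as the t-product of two smaller tensors U (n1 x r x n3) and
# V (r x n2 x n3) with r << min(n1, n2), so the ADMM updates work on the
# small factors instead of repeatedly computing a full tensor SVD.
rng = np.random.default_rng(0)
U = rng.standard_normal((30, 5, 8))   # r = 5 is an assumed target rank
V = rng.standard_normal((5, 40, 8))
L = t_product(U, V)                   # low-tubal-rank tensor, shape (30, 40, 8)
```

Because the FFT diagonalizes the circular structure of the t-product, each iteration costs only slice-wise products with the small factors, which is the source of the running-time savings claimed in the abstract.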
Pages: 7138-7143 (6 pages)