Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion

Cited by: 0
Authors
Shi, Yuqing [1 ]
Du, Shiqiang [2 ]
Wang, Weilan [3 ]
Affiliations
[1] Northwest Minzu Univ, Coll Elect Engn, Lanzhou 730030, Peoples R China
[2] Northwest Minzu Univ, Coll Math & Comp Sci, Lanzhou 730030, Peoples R China
[3] Northwest Minzu Univ, Chinese Natl Informat Technol Res Inst, Lanzhou 730030, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Image Inpainting; Tensor Completion; Tensor Decomposition; Low-Rank Tensor; Tensor Nuclear Norm; Matrix Factorization
DOI
10.1109/CCDC52312.2021.9601608
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Low-rank tensor completion (LRTC) is an active research direction in computer vision and machine learning because it can effectively recover the missing entries of a tensor. However, most existing LRTC methods not only require repeated computation of the time-consuming singular value decomposition (SVD), but also account for only a single noise distribution in the model. To overcome these shortcomings, we propose a new LRTC method based on the tensor-tensor product (t-product): the robust low-rank and sparse tensor decomposition (RLRST) model for tensor completion. Firstly, to estimate the unknown entries of the tensor data more accurately, RLRST considers two kinds of noise simultaneously: sparse noise and Gaussian noise. Secondly, the low-rank recovery tensor is equivalently decomposed into the t-product of two smaller tensors, which effectively reduces the running time of the algorithm. Then, an efficient iterative update algorithm based on the alternating direction method of multipliers (ADMM) is presented to solve the RLRST optimization problem. Finally, numerical experiments on image inpainting tasks demonstrate the effectiveness of our method compared with related state-of-the-art tensor completion methods.
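The t-product that the abstract builds on can be sketched in a few lines: it multiplies two 3-way tensors slice-by-slice in the Fourier domain along the third mode, and a low-tubal-rank tensor can then be represented as the t-product of two smaller factor tensors, avoiding repeated full tensor SVDs. This is a minimal illustrative sketch of the standard t-product definition, not the paper's implementation; the function name `t_product` and the tensor sizes are assumptions chosen for the example.

```python
import numpy as np

def t_product(A, B):
    """Tensor-tensor product (t-product) of 3-way tensors.

    A: (n1, n2, n3), B: (n2, n4, n3) -> C: (n1, n4, n3).
    Computed as frontal-slice matrix products in the Fourier
    domain along the third mode, then transformed back.
    """
    n3 = A.shape[2]
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        Cf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]
    return np.real(np.fft.ifft(Cf, axis=2))

# A factorization X = U * V (t-product) with small factors U and V
# keeps the recovery tensor low-tubal-rank without computing SVDs
# of the full tensor, as the abstract describes.
rng = np.random.default_rng(0)
U = rng.standard_normal((30, 5, 8))   # (n1, r, n3), r = 5
V = rng.standard_normal((5, 20, 8))   # (r, n4, n3)
X = t_product(U, V)                   # (30, 20, 8) low-tubal-rank tensor
```

A quick sanity check of the definition: the identity tensor (identity matrix in the first frontal slice, zeros elsewhere) acts as the multiplicative identity under the t-product.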
Pages: 7138-7143 (6 pages)