A low-rank and sparse enhanced Tucker decomposition approach for tensor completion

Cited by: 5
Authors
Pan, Chenjian [1 ,2 ]
Ling, Chen [2 ]
He, Hongjin [1 ]
Qi, Liqun [3 ]
Xu, Yanwei [4 ]
Affiliations
[1] Ningbo Univ, Sch Math & Stat, Ningbo 315211, Peoples R China
[2] Hangzhou Dianzi Univ, Sch Sci, Hangzhou 310018, Peoples R China
[3] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hong Kong, Peoples R China
[4] 2012 Labs Huawei Tech Investment Co Ltd, Future Network Theory Lab, Shatin, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Tensor completion; Tucker decomposition; Nuclear norm; Internet traffic data; Image inpainting; Thresholding algorithm; Matrix factorization; Recovery
DOI
10.1016/j.amc.2023.128432
CLC Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
In this paper, we introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion. Our model includes a sparse regularization term that promotes a sparse core in the Tucker decomposition, which is beneficial for tensor data compression. Moreover, we impose low-rank regularization terms on the factor matrices of the Tucker decomposition to induce low-rankness of the tensor at low computational cost. Numerically, we propose a customized splitting method with easy subproblems to solve the underlying model. Remarkably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties that appear in tensors. A series of computational experiments on real-world data sets, including internet traffic data and color images, demonstrates that our model achieves higher recovery accuracy than many existing state-of-the-art matricization and tensorization approaches.
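The abstract combines two ingredients: Tucker factor matrices kept low-rank, and a core tensor pushed toward sparsity. The sketch below is not the paper's algorithm (which is a splitting method for a regularized completion model); it only illustrates those ingredients on a fully observed tensor, using a plain HOSVD for the factors and a hypothetical quantile hard-threshold to sparsify the core. The function name, `sparsity` parameter, and thresholding rule are my own illustrative choices.

```python
import numpy as np

def tucker_sparse_core_sketch(T, ranks, sparsity=0.1):
    """Illustrative Tucker decomposition with a sparsified core.

    HOSVD-style factors + quantile hard-thresholding of the core;
    a toy sketch of the model's ingredients, not the authors' method.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # mode-n unfolding: bring axis `mode` to the front, flatten the rest
        unfolded = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(U[:, :r])  # low-rank factor matrix for this mode

    # core G = T x_1 U1^T x_2 U2^T x_3 U3^T (mode-n products)
    G = T
    for mode, U in enumerate(factors):
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, mode, 0), axes=1), 0, mode)

    # promote a sparse core: keep only the largest `sparsity` fraction of entries
    thr = np.quantile(np.abs(G), 1.0 - sparsity)
    G = np.where(np.abs(G) >= thr, G, 0.0)

    # reconstruct R = G x_1 U1 x_2 U2 x_3 U3
    R = G
    for mode, U in enumerate(factors):
        R = np.moveaxis(np.tensordot(U, np.moveaxis(R, mode, 0), axes=1), 0, mode)
    return R, G, factors
```

In a completion setting, a scheme like this would only be one building block: the paper's model additionally restricts the fit to the observed entries and solves a regularized problem by a customized splitting method.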
Pages: 15
Related Papers
50 records in total
  • [1] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    [J]. PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [2] Nonnegative Tensor Completion via Low-Rank Tucker Decomposition: Model and Algorithm
    Chen, Bilian
    Sun, Ting
    Zhou, Zhehao
    Zeng, Yifeng
    Cao, Langcai
    [J]. IEEE ACCESS, 2019, 7 : 95903 - 95914
  • [3] Sparse and Low-Rank Tensor Decomposition
    Shah, Parikshit
    Rao, Nikhil
    Tang, Gongguo
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [4] Tensor Regression Using Low-Rank and Sparse Tucker Decompositions
    Ahmed, Talal
    Raja, Haroon
    Bajwa, Waheed U.
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (04): : 944 - 966
  • [5] Low-rank tensor completion based on non-convex logDet function and Tucker decomposition
    Shi, Chengfei
    Huang, Zhengdong
    Wan, Li
    Xiong, Tifan
    [J]. SIGNAL IMAGE AND VIDEO PROCESSING, 2021, 15 (06) : 1169 - 1177
  • [6] "Sparse plus Low-Rank" tensor completion approach for recovering images and videos
    Pan, Chenjian
    Ling, Chen
    He, Hongjin
    Qi, Liqun
    Xu, Yanwei
    [J]. SIGNAL PROCESSING-IMAGE COMMUNICATION, 2024, 127
  • [7] Constructing low-rank Tucker tensor approximations using generalized completion
    Petrov, Sergey
    [J]. RUSSIAN JOURNAL OF NUMERICAL ANALYSIS AND MATHEMATICAL MODELLING, 2024, 39 (02) : 113 - 119
  • [8] Tensor Completion Via Collaborative Sparse and Low-Rank Transforms
    Li, Ben-Zheng
    Zhao, Xi-Le
    Wang, Jian-Li
    Chen, Yong
    Jiang, Tai-Xiang
    Liu, Jun
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2021, 7 : 1289 - 1303
  • [9] Low-rank tensor completion with sparse regularization in a transformed domain
    Wang, Ping-Ping
    Li, Liang
    Cheng, Guang-Hui
    [J]. NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2021, 28 (06)