Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks

Cited by: 9
Authors
He, Jingfei [1 ]
Zheng, Xunan [1 ]
Gao, Peng [1 ]
Zhou, Yatong [1 ]
Affiliations
[1] Hebei Univ Technol, Sch Elect & Informat Engn, Tianjin Key Lab Elect Mat & Devices, 5340 Xiping Rd, Tianjin 300401, Peoples R China
Keywords
Tensor train rank; Low-rank tensor completion; Partially overlapped sub-block; Tensor augmentation technique;
DOI
10.1016/j.sigpro.2021.108339
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Recently, low-rank tensor-based methods using the tensor train (TT) rank have achieved promising performance in multidimensional signal processing. In particular, by exploiting a tensor augmentation technique called ket augmentation (KA), TT-rank-based methods can more efficiently capture the correlations of the generated higher-order tensor, but this introduces serious block artifacts. In this paper, a tensor completion method using parallel matrix factorization based on the TT rank with partially overlapped sub-blocks is proposed. By combining the partially overlapped sub-block scheme with the KA technique, an improved tensor augmentation technique is proposed that further increases the order of the generated tensor, enhances the low-rankness, and alleviates block artifacts. To reduce the computational time, parallel matrix factorization is utilized to minimize the TT rank. In addition, a fixed weighting function is developed that further reduces blockiness according to the shortest distance between a pixel and the boundaries of its sub-block. Numerical experiments demonstrate the superiority of the proposed method over existing state-of-the-art methods both qualitatively and quantitatively. (c) 2021 Elsevier B.V. All rights reserved.
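The two building blocks named in the abstract can be sketched in NumPy. Below, `ket_augmentation` casts a 2^n x 2^n image into an order-n tensor with mode size 4 (the standard KA idea; the paper's improved variant with partially overlapped sub-blocks is not reproduced), and `boundary_distance_weight` is a hypothetical illustration of a fixed weight based on a pixel's shortest distance to its sub-block boundary. Function names and the exact bit ordering are assumptions for illustration, not the authors' implementation.

```python
import numpy as np


def ket_augmentation(img):
    """Cast a 2^n x 2^n image into an order-n tensor with mode size 4.

    Each mode pairs one row bit with one column bit, so each level of a
    2x2 block hierarchy becomes one tensor mode (a sketch of KA only).
    """
    n = int(np.log2(img.shape[0]))
    assert img.shape == (2 ** n, 2 ** n), "KA sketch needs a 2^n x 2^n image"
    # split each spatial axis into n binary levels
    t = img.reshape((2,) * n + (2,) * n)
    # interleave row bit i with column bit i so they become adjacent axes
    order = [ax for i in range(n) for ax in (i, n + i)]
    t = t.transpose(order)
    # merge each (row-bit, col-bit) pair into a single mode of size 4
    return t.reshape((4,) * n)


def boundary_distance_weight(h, w):
    """Hypothetical fixed weight for an h x w sub-block: each pixel's
    weight is its shortest (1-indexed) distance to the block boundary,
    so pixels near the seams of overlapped sub-blocks contribute less
    when the blocks are recombined."""
    ys = np.minimum(np.arange(h), np.arange(h)[::-1]) + 1
    xs = np.minimum(np.arange(w), np.arange(w)[::-1]) + 1
    return np.minimum.outer(ys, xs).astype(float)
```

In a completion pipeline of this kind, each overlapped sub-block would be KA-augmented, completed under a TT-rank model, and the overlapping reconstructions blended with the (normalized) boundary-distance weights.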
Pages: 8
Related Papers
50 records
  • [21] Low-rank tensor completion via combined Tucker and Tensor Train for color image recovery
    Zhang, Tianheng
    Zhao, Jianli
    Sun, Qiuxia
    Zhang, Bin
    Chen, Jianjian
    Gong, Maoguo
    APPLIED INTELLIGENCE, 2022, 52 (07) : 7761 - 7776
  • [22] Tensor Train Factorization with Spatio-temporal Smoothness for Streaming Low-rank Tensor Completion
    Yu, Gaohang
    Wan, Shaochun
    Ling, Chen
    Qi, Liqun
    Xu, Yanwei
    FRONTIERS OF MATHEMATICS, 2024, 19 (05): : 933 - 959
  • [24] Auto-weighted robust low-rank tensor completion via tensor-train
    Chen, Chuan
    Wu, Zhe-Bin
    Chen, Zi-Tai
    Zheng, Zi-Bin
    Zhang, Xiong-Jun
    INFORMATION SCIENCES, 2021, 567 : 100 - 115
  • [25] Low-rank tensor completion by Riemannian optimization
    Daniel Kressner
    Michael Steinlechner
    Bart Vandereycken
    BIT Numerical Mathematics, 2014, 54 : 447 - 468
  • [26] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [27] CROSS: EFFICIENT LOW-RANK TENSOR COMPLETION
    Zhang, Anru
    ANNALS OF STATISTICS, 2019, 47 (02): : 936 - 964
  • [29] Optimal Low-Rank Tensor Tree Completion
    Li, Zihan
    Zhu, Ce
    Long, Zhen
    Liu, Yipeng
    2023 IEEE 25TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, MMSP, 2023,
  • [30] A dual framework for low-rank tensor completion
    Nimishakavi, Madhav
    Jawanpuria, Pratik
    Mishra, Bamdev
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31