Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks

Cited by: 9
Authors
He, Jingfei [1 ]
Zheng, Xunan [1 ]
Gao, Peng [1 ]
Zhou, Yatong [1 ]
Affiliations
[1] Hebei Univ Technol, Sch Elect & Informat Engn, Tianjin Key Lab Elect Mat & Devices, 5340 Xiping Rd, Tianjin 300401, Peoples R China
Keywords
Tensor train rank; Low-rank tensor completion; Partially overlapped sub-block; Tensor augmentation technique;
DOI
10.1016/j.sigpro.2021.108339
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
Recently, low-rank tensor based methods using the tensor train (TT) rank have achieved promising performance in multidimensional signal processing. In particular, by taking advantage of a tensor augmentation technique called ket augmentation (KA), TT-rank-based methods can capture the correlations of the generated higher-order tensor more efficiently, but they introduce serious block artifacts. In this paper, a tensor completion method using parallel matrix factorization based on the TT rank with partially overlapped sub-blocks is proposed. By combining the partially overlapped sub-block scheme with the KA technique, an improved tensor augmentation technique is proposed to further increase the order of the generated tensor, enhance the low-rankness, and alleviate block artifacts. To reduce the computational time, parallel matrix factorization is utilized to minimize the TT rank. In addition, a fixed weighting function is developed to reduce the blockiness effect according to the shortest distance between each pixel and the corresponding sub-block boundaries. Numerical experiments demonstrate the superiority of the proposed method over existing state-of-the-art methods, both qualitatively and quantitatively. (c) 2021 Elsevier B.V. All rights reserved.
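The abstract's idea of weighting each pixel by its shortest distance to the sub-block boundary, then blending the partially overlapped sub-blocks, can be sketched as follows. This is a minimal illustration under our own assumptions: the function names (`boundary_weight`, `blend`) and the exact weight formula are ours, not the authors'; the paper's fixed weighting function may differ.

```python
import numpy as np

def boundary_weight(block_shape):
    """Hypothetical weight map: each pixel's weight is its shortest
    distance (in pixels, counted from 1) to the sub-block boundary,
    so pixels near block edges contribute less when blended."""
    h, w = block_shape
    rows = np.arange(h)
    cols = np.arange(w)
    dr = np.minimum(rows + 1, h - rows)   # distance to top/bottom edge
    dc = np.minimum(cols + 1, w - cols)   # distance to left/right edge
    return np.minimum.outer(dr, dc).astype(float)

def blend(blocks, positions, image_shape):
    """Accumulate weighted, partially overlapped sub-blocks and
    normalize by the total weight at each pixel."""
    acc = np.zeros(image_shape)
    wsum = np.zeros(image_shape)
    for blk, (r, c) in zip(blocks, positions):
        w = boundary_weight(blk.shape)
        acc[r:r + blk.shape[0], c:c + blk.shape[1]] += w * blk
        wsum[r:r + blk.shape[0], c:c + blk.shape[1]] += w
    return acc / np.maximum(wsum, 1e-12)
```

In the overlap region, pixels take a weighted average of the two sub-blocks, with the weight shifting smoothly toward whichever block's interior the pixel is closer to; this is what suppresses visible seams at sub-block boundaries.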
Pages: 8
Related papers
50 items total
  • [31] Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data
    Ji, Teng-Yu
    Zhao, Xi-Le
    Sun, Dong-Lin
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1162 - 1166
  • [32] Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion
    Xue, Jize
    Zhao, Yongqiang
    Huang, Shaoguang
    Liao, Wenzhi
    Chan, Jonathan Cheung-Wai
    Kong, Seong G.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (11) : 6916 - 6930
  • [33] Two Heuristics Solving Low Tensor Train Rank Tensor Completion
    Tang, Yunfei
    Yang, Qingzhi
    JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2025, 21 (02) : 925 - 954
  • [34] Low-Rank Tensor Completion Pansharpening Based on Haze Correction
    Wang, Peng
    Su, Yiyang
    Huang, Bo
    Zhu, Daiyin
    Liu, Wenjian
    Nedzved, Alexander
    Krasnoproshin, Viktor V.
    Leung, Henry
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 20
  • [35] LRTCFPan: Low-Rank Tensor Completion Based Framework for Pansharpening
    Wu, Zhong-Cheng
    Huang, Ting-Zhu
    Deng, Liang-Jian
    Huang, Jie
    Chanussot, Jocelyn
    Vivone, Gemine
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 1640 - 1655
  • [36] Low-Rank Tensor Completion by Sum of Tensor Nuclear Norm Minimization
    Su, Yaru
    Wu, Xiaohui
    Liu, Wenxi
    IEEE ACCESS, 2019, 7 : 134943 - 134953
  • [37] Color Image Denoising Based on Low-rank Tensor Train
    Zhang, Yang
    Han, Zhi
    Tang, Yandong
    TENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING (ICGIP 2018), 2019, 11069
  • [38] Low-rank tensor train for tensor robust principal component analysis
    Yang, Jing-Hua
    Zhao, Xi-Le
    Ji, Teng-Yu
    Ma, Tian-Hui
    Huang, Ting-Zhu
    APPLIED MATHEMATICS AND COMPUTATION, 2020, 367
  • [39] Unifying tensor factorization and tensor nuclear norm approaches for low-rank tensor completion
    Du, Shiqiang
    Xiao, Qingjiang
    Shi, Yuqing
    Cucchiara, Rita
    Ma, Yide
    NEUROCOMPUTING, 2021, 458 : 204 - 218
  • [40] Parallel Matrix Factorization for Low-Rank Tensor Completion
    Xu, Yangyang
    Hao, Ruru
    Yin, Wotao
    Su, Zhixun
    INVERSE PROBLEMS AND IMAGING, 2015, 9 (02) : 601 - 624