Low-rank tensor completion via combined non-local self-similarity and low-rank regularization

Cited by: 35
Authors
Li, Xiao-Tong [1 ]
Zhao, Xi-Le [1 ]
Jiang, Tai-Xiang [1 ]
Zheng, Yu-Bang [1 ]
Ji, Teng-Yu [2 ]
Huang, Ting-Zhu [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Res Ctr Image & Vis Comp, Sch Math Sci, Chengdu 611731, Sichuan, Peoples R China
[2] Northwestern Polytech Univ, Sch Sci, Xian 710072, Shaanxi, Peoples R China
Keywords
Low-rank tensor completion; Parallel low-rank matrix factorization; Non-local self-similarity; Plug and Play; Block successive upper-bound minimization; THRESHOLDING ALGORITHM; MATRIX FACTORIZATION; MODEL; PENALIZATION; SPARSITY; IMAGES;
DOI
10.1016/j.neucom.2019.07.092
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Global low-rank methods have achieved great success in tensor completion. However, these methods neglect the abundant non-local self-similarities that exist in a wide range of multi-dimensional imaging data. To integrate the global low-rankness and the non-local self-similarity of the underlying tensor, we propose a novel low-rank tensor completion model via combined non-local self-similarity and low-rank regularization, named NLS-LR. We adopt parallel low-rank matrix factorization to guarantee the global low-rankness, while plugging in non-local denoisers to promote non-local self-similarity instead of tailoring explicit regularizers. To tackle the proposed model, we develop an efficient block successive upper-bound minimization (BSUM) based algorithm. Numerical experiments demonstrate that the proposed method outperforms many state-of-the-art tensor completion methods in terms of quality metrics and visual effects. (C) 2019 Elsevier B.V. All rights reserved.
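The alternation the abstract describes, a low-rank fitting step interleaved with a plug-and-play denoising step and data consistency on the observed entries, can be sketched in a simplified 2-D (matrix) setting. This is an illustrative sketch only, not the paper's NLS-LR algorithm: a truncated SVD stands in for the parallel low-rank matrix factorization, a crude box filter stands in for the non-local self-similarity denoiser, and all function names and parameters here are hypothetical.

```python
import numpy as np

def box_denoise(X):
    """Crude 4-neighbour averaging; a stand-in for a non-local denoiser."""
    return (np.roll(X, 1, axis=0) + np.roll(X, -1, axis=0) +
            np.roll(X, 1, axis=1) + np.roll(X, -1, axis=1)) / 4.0

def complete_lowrank_pnp(M, mask, rank=5, n_iter=100, lam=0.1):
    """Alternate a truncated-SVD low-rank projection with a plugged-in
    denoising step, re-imposing the observed entries each iteration."""
    X = np.where(mask, M, 0.0)  # zero-fill the missing entries
    for _ in range(n_iter):
        # Low-rank step (stand-in for parallel low-rank factorization).
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        L = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]
        # Plug-and-play step: blend in the denoised estimate.
        X = (1.0 - lam) * L + lam * box_denoise(L)
        # Data consistency: keep observed entries fixed.
        X[mask] = M[mask]
    return X
```

In the paper the low-rank step operates on mode unfoldings of a tensor and the denoiser exploits non-local patch similarity; this sketch only conveys the overall BSUM-style alternation between the two priors and the observed data.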
Pages: 1-12
Related papers (50 total)
  • [21] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    [J]. BIT NUMERICAL MATHEMATICS, 2014, 54 (02) : 447 - 468
  • [22] On the equivalence between low-rank matrix completion and tensor rank
    Derksen, Harm
    [J]. LINEAR &amp; MULTILINEAR ALGEBRA, 2018, 66 (04) : 645 - 667
  • [24] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [25] A dual framework for low-rank tensor completion
    Nimishakavi, Madhav
    Jawanpuria, Pratik
    Mishra, Bamdev
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [26] Optimal Low-Rank Tensor Tree Completion
    Li, Zihan
    Zhu, Ce
    Long, Zhen
    Liu, Yipeng
    [J]. 2023 IEEE 25TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, MMSP, 2023,
  • [27] Compressive sensing via nonlocal low-rank tensor regularization
    Feng, Lei
    Sun, Huaijiang
    Sun, Quansen
    Xia, Guiyu
    [J]. NEUROCOMPUTING, 2016, 216 : 45 - 60
  • [28] Accurate Tensor Completion via Adaptive Low-Rank Representation
    Zhang, Lei
    Wei, Wei
    Shi, Qinfeng
    Shen, Chunhua
    van den Hengel, Anton
    Zhang, Yanning
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (10) : 4170 - 4184
  • [29] Tensor Completion Via Collaborative Sparse and Low-Rank Transforms
    Li, Ben-Zheng
    Zhao, Xi-Le
    Wang, Jian-Li
    Chen, Yong
    Jiang, Tai-Xiang
    Liu, Jun
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2021, 7 : 1289 - 1303
  • [30] Low-rank tensor completion via smooth matrix factorization
    Zheng, Yu-Bang
    Huang, Ting-Zhu
    Ji, Teng-Yu
    Zhao, Xi-Le
    Jiang, Tai-Xiang
    Ma, Tian-Hui
    [J]. APPLIED MATHEMATICAL MODELLING, 2019, 70 : 677 - 695