Tensor Completion via Nonlocal Low-Rank Regularization

Cited by: 47
Authors
Xie, Ting [1 ]
Li, Shutao [1 ]
Fang, Leyuan [1 ]
Liu, Licheng [1 ]
Affiliations
[1] Hunan Univ, Coll Elect & Informat Engn, Changsha 410082, Hunan, Peoples R China
Keywords
Hyperspectral image (HSI); low-rank approximation; nonlocal strategy; tensor completion (TC); MATRIX; FACTORIZATION; ALGORITHMS;
DOI
10.1109/TCYB.2018.2825598
Chinese Library Classification
TP [automation technology; computer technology];
Discipline Classification Code
0812
Abstract
Tensor completion (TC), which aims to recover original high-order data from its degraded observations, has recently drawn much attention in the hyperspectral image (HSI) domain. Generally, the widely used TC methods formulate the rank minimization problem with a convex trace norm penalty, which shrinks all singular values equally and may therefore produce a severely biased solution. Moreover, these TC methods assume that the whole high-order data is low-rank, which may fail to recover the detailed information in high-order data with diverse and complex structures. In this paper, a novel nonlocal low-rank regularization-based TC (NLRR-TC) method is proposed for HSIs, which consists of two main steps. In the first step, an initial completion result is generated by the proposed low-rank regularization-based TC (LRR-TC) model, which combines the logarithm of the determinant with the tensor trace norm. This model approximates the tensor rank more effectively, since the logarithm function values can be adaptively tuned for each input. In the second step, the nonlocal spatial-spectral similarity is integrated into the LRR-TC model to obtain the final completion result. Specifically, the initial completion result is first divided into groups of nonlocal similar cubes (each group forms a 3-D tensor), and then the LRR-TC model is applied to each group. Since the similar cubes within each group contain similar structures, each 3-D tensor should have the low-rank property, which further improves the completion result. Experimental results demonstrate that the proposed NLRR-TC method outperforms state-of-the-art HSI completion techniques.
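The abstract gives only a high-level view of the LRR-TC model. Below is a minimal, illustrative Python sketch of the underlying idea: a reweighted (log-det-style) singular-value shrinkage applied to the mode unfoldings of a 3-D tensor, with observed entries re-imposed after each pass. It is not the authors' NLRR-TC algorithm; the function names (logdet_shrink, lrr_tc_step) and parameters (tau, eps) are assumptions made for illustration only.

```python
import numpy as np

def logdet_shrink(mat, tau, eps=1e-3):
    """Reweighted singular-value shrinkage approximating a log-det penalty.

    Each singular value s is shrunk by tau / (s + eps), so large singular
    values are penalized less than small ones (unlike the plain trace norm,
    which shrinks them all equally).
    """
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    s_shrunk = np.maximum(s - tau / (s + eps), 0.0)
    return (u * s_shrunk) @ vt

def lrr_tc_step(tensor, mask, tau=0.1, eps=1e-3):
    """One illustrative completion pass over the three mode unfoldings of a
    3-D tensor, averaging the low-rank estimates and re-imposing the observed
    entries (mask == True where entries are known)."""
    estimates = []
    for mode in range(3):
        unfolded = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        low_rank = logdet_shrink(unfolded, tau, eps)
        refolded = np.moveaxis(
            low_rank.reshape(np.moveaxis(tensor, mode, 0).shape), 0, mode)
        estimates.append(refolded)
    est = np.mean(estimates, axis=0)
    est[mask] = tensor[mask]          # keep observed entries fixed
    return est

# Toy usage: a random rank-1 cube with roughly 40% of entries missing.
rng = np.random.default_rng(0)
cube = np.einsum('i,j,k->ijk', rng.normal(size=20),
                 rng.normal(size=20), rng.normal(size=10))
mask = rng.random(cube.shape) > 0.4
est = np.where(mask, cube, 0.0)
for _ in range(50):
    est = lrr_tc_step(est, mask)
print('relative error:', np.linalg.norm(est - cube) / np.linalg.norm(cube))
```

Unlike plain soft-thresholding of the trace norm, the shrinkage amount tau / (s + eps) is smaller for large singular values, which mirrors the adaptivity that the log-determinant surrogate is said to provide in the abstract.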
Pages: 2344-2354
Number of pages: 11
Related Papers
50 records in total
  • [1] Compressive sensing via nonlocal low-rank tensor regularization
    Feng, Lei
    Sun, Huaijiang
    Sun, Quansen
    Xia, Guiyu
    [J]. NEUROCOMPUTING, 2016, 216 : 45 - 60
  • [2] Tensor Completion via Smooth Rank Function Low-Rank Approximate Regularization
    Yu, Shicheng
    Miao, Jiaqing
    Li, Guibing
    Jin, Weidong
    Li, Gaoping
    Liu, Xiaoguang
    [J]. REMOTE SENSING, 2023, 15 (15)
  • [3] Low-Rank Tensor Completion via Tensor Nuclear Norm With Hybrid Smooth Regularization
    Zhao, Xi-Le
    Nie, Xin
    Zheng, Yu-Bang
    Ji, Teng-Yu
    Huang, Ting-Zhu
    [J]. IEEE ACCESS, 2019, 7 : 131888 - 131901
  • [4] Low-Rank tensor completion based on nonconvex regularization
    Su, Xinhua
    Ge, Huanmin
    Liu, Zeting
    Shen, Yanfei
    [J]. SIGNAL PROCESSING, 2023, 212
  • [5] Nonlocal Low-Rank Tensor Completion for Visual Data
    Zhang, Lefei
    Song, Liangchen
    Du, Bo
    Zhang, Yipeng
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2021, 51 (02) : 673 - 685
  • [6] Low-rank tensor completion via combined non-local self-similarity and low-rank regularization
    Li, Xiao-Tong
    Zhao, Xi-Le
    Jiang, Tai-Xiang
    Zheng, Yu-Bang
    Ji, Teng-Yu
    Huang, Ting-Zhu
    [J]. NEUROCOMPUTING, 2019, 367 : 1 - 12
  • [7] Low-Rank Tensor Completion by Truncated Nuclear Norm Regularization
    Xue, Shengke
    Qiu, Wenyuan
    Liu, Fan
    Jin, Xinyu
    [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 2600 - 2605
  • [8] Low-rank tensor completion with sparse regularization in a transformed domain
    Wang, Ping-Ping
    Li, Liang
    Cheng, Guang-Hui
    [J]. NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2021, 28 (06)
  • [9] Low-rank tensor completion via nonlocal self-similarity regularization and orthogonal transformed tensor Schatten-p norm
    Liu, Jiahui
    Zhu, Yulian
    Tian, Jialue
    [J]. PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (03)
  • [10] Noisy Tensor Completion via Low-Rank Tensor Ring
    Qiu, Yuning
    Zhou, Guoxu
    Zhao, Qibin
    Xie, Shengli
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (01) : 1127 - 1141