Fast Tensor Nuclear Norm for Structured Low-Rank Visual Inpainting

Cited by: 38
Authors
Xu, Honghui [1 ]
Zheng, Jianwei [1 ]
Yao, Xiaomin [1 ]
Feng, Yuchao [1 ]
Chen, Shengyong [2 ]
Affiliations
[1] Zhejiang Univ Technol, Coll Comp Sci & Technol, Hangzhou 310023, Peoples R China
[2] Tianjin Univ Technol, Coll Comp Sci & Engn, Tianjin 300384, Peoples R China
Keywords
Tensors; Visualization; Correlation; Optimization; Learning systems; Three-dimensional displays; Singular value decomposition; Visual inpainting; low-rank tensor completion; tensor nuclear norm (TNN); alternating direction method of multipliers
Keywords Plus: MATRIX; COMPLETION; DECOMPOSITION; SPARSE; SPACE; TRAIN
DOI
10.1109/TCSVT.2021.3067022
CLC Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Low-rank modeling has achieved great success in visual data completion. However, the low-rank assumption may hold only approximately for the original visual data, which leads to suboptimal recovery of the underlying details, especially when the missing rate is extremely high. In this paper, we go further by providing a detailed analysis of the rank distributions in Hankel-structured and clustered cases, and show that both non-local similarity and patch-based structuralization play a positive role. This motivates us to develop a new Hankel low-rank tensor recovery method that faithfully captures the underlying details at the cost of a slightly higher computational burden. First, benefiting from the correlation of different spectral bands and the smoothness of local spatial neighborhoods, we divide the visual data into overlapping 3D patches and group similar ones into individual clusters, exploiting non-local similarity. Second, each 3D patch is mapped to a structured Hankel tensor to better reveal the low-rank property of the image. Finally, we solve the tensor completion model via the well-known alternating direction method of multipliers (ADMM) optimization algorithm. Since Hankelization inevitably expands the data size, we further propose a fast randomized skinny tensor singular value decomposition (rst-SVD) to accelerate the per-iteration running efficiency. Extensive experimental results on real-world datasets verify the superiority of our method over state-of-the-art visual inpainting approaches.
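The abstract's rst-SVD operates on tensors under the t-SVD framework and is not reproduced here; the underlying acceleration idea, replacing a full SVD with a randomized range sketch inside each singular value thresholding step, can be illustrated with a generic matrix-case sketch. The function name, parameters, and random-matrix choice below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def randomized_svt(M, rank, tau, oversample=5, seed=0):
    """Randomized singular value thresholding (generic sketch, not the
    paper's rst-SVD): approximate the top singular triplets of M via a
    Gaussian range sketch, then soft-threshold the singular values by tau."""
    m, n = M.shape
    k = min(rank + oversample, min(m, n))
    rng = np.random.default_rng(seed)
    # Sketch the column space of M with a random Gaussian test matrix.
    Omega = rng.standard_normal((n, k))
    Q, _ = np.linalg.qr(M @ Omega)               # orthonormal basis for range(M)
    # SVD of the small projected matrix Q^T M (k x n instead of m x n).
    U_small, s, Vt = np.linalg.svd(Q.T @ M, full_matrices=False)
    U = Q @ U_small
    s_thresh = np.maximum(s - tau, 0.0)          # soft-threshold singular values
    keep = s_thresh > 0
    return (U[:, keep] * s_thresh[keep]) @ Vt[keep]

# Exactly low-rank input: the sketch captures the range, so the only
# deviation comes from shaving tau off each nonzero singular value.
rng = np.random.default_rng(1)
L = rng.standard_normal((60, 8)) @ rng.standard_normal((8, 40))  # rank 8
X = randomized_svt(L, rank=8, tau=0.1)
rel_err = np.linalg.norm(X - L) / np.linalg.norm(L)
print(rel_err)  # small relative error
```

The payoff is the same as in the paper's setting: each iteration factors a k x n matrix instead of the full (Hankel-expanded, hence much larger) m x n one, which is what makes per-iteration thresholding affordable after size expansion.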
Pages: 538-552 (15 pages)
Related Papers (50 in total; items [31]-[40] shown)
  • [31] Low-Rank Constraints for Fast Inference in Structured Models
    Chiu, Justin T.
    Deng, Yuntian
    Rush, Alexander M.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [32] Fast Algorithms for Displacement and Low-Rank Structured Matrices
    Chandrasekaran, Shivkumar
    Govindarajan, Nithin
    Rajagopal, Abhejit
    ISSAC'18: PROCEEDINGS OF THE 2018 ACM INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND ALGEBRAIC COMPUTATION, 2018, : 17 - 22
  • [33] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [34] Logarithmic Norm Regularized Low-Rank Factorization for Matrix and Tensor Completion
    Chen, Lin
    Jiang, Xue
    Liu, Xingzhao
    Zhou, Zhixin
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 3434 - 3449
  • [35] REMOTE SENSING IMAGES INPAINTING BASED ON STRUCTURED LOW-RANK MATRIX APPROXIMATION
    Hu, Yue
    Wei, Zidi
    Zhao, Kuangshi
    IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2020, : 1341 - 1344
  • [36] t-Schatten-p Norm for Low-Rank Tensor Recovery
    Kong, Hao
    Xie, Xingyu
    Lin, Zhouchen
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2018, 12 (06) : 1405 - 1419
  • [37] Nuclear norm regularization with a low-rank constraint for matrix completion
    Zhang, Hui
    Cheng, Lizhi
    Zhu, Wei
    INVERSE PROBLEMS, 2010, 26 (11)
  • [38] Fast randomized tensor singular value thresholding for low-rank tensor optimization
    Che, Maolin
    Wang, Xuezhong
    Wei, Yimin
    Zhao, Xile
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2022, 29 (06)
  • [39] Fast and Accurate Randomized Algorithms for Low-rank Tensor Decompositions
    Ma, Linjian
    Solomonik, Edgar
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [40] Low-rank high-order tensor recovery via joint transformed tensor nuclear norm and total variation regularization
    Luo, Xiaohu
    Ma, Weijun
    Wang, Wendong
    Zheng, Yuanshi
    Wang, Jianjun
    NEUROCOMPUTING, 2025, 624