Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data

Cited by: 3
Authors
Ji, Teng-Yu [1 ]
Zhao, Xi-Le [2 ]
Sun, Dong-Lin [3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Math & Stat, Xian 710072, Shaanxi, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Math Sci, Chengdu 611731, Sichuan, Peoples R China
[3] Changan Univ, Sch Sci, Xian 710072, Shaanxi, Peoples R China
Keywords
Tensors; Kernel; Correlation; Color; Periodic structures; Matrix decomposition; Image reconstruction; Kernel methods; tensor completion; tensor decomposition; RECOVERY; IMAGE;
DOI
10.1109/LSP.2022.3169044
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
Existing low-rank tensor completion methods have developed various tensor decompositions and corresponding tensor ranks to reconstruct missing information by exploiting the inherent low-rank structure, under the assumption that the data is low-rank with respect to one of these decompositions. However, this assumption is easily violated for real-world data, e.g., color images and multispectral images, whose low-rank structure is not significant. To better exploit the global correlation, we propose a kernel low-rank tensor completion model, in which the original data is mapped into a feature space by a kernel mapping. Although the original data is high-rank, it is low-rank in the feature space owing to the kernel mapping. The proposed model can therefore exploit this implicitly low-rank structure in the feature space and estimate the missing entries well. Since it is difficult to kernelize a tensor explicitly, we reformulate the model in inner-product form and introduce the kernel trick to solve the resulting model efficiently. Extensive experiments on color images and multispectral images show that the proposed method outperforms state-of-the-art low-rank tensor completion methods.
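The abstract builds on the classical low-rank completion idea: fill missing entries so that the completed array has low rank. The paper's kernel model and its tensor formulation are not reproduced here; as a minimal illustrative sketch of the baseline being generalized, the following implements matrix completion by alternating a rank-r SVD projection with re-imposing the observed entries (the Hard-Impute scheme; the function name and toy data are for illustration only, not from the paper).

```python
import numpy as np

def hard_impute(M, mask, rank, n_iters=200):
    """Complete M on the unobserved entries (mask == False) by
    alternating a rank-`rank` SVD truncation with re-imposing
    the observed entries."""
    X = np.where(mask, M, 0.0)          # initialize missing entries with zeros
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s[rank:] = 0.0                  # keep only the top-`rank` singular values
        X_low = (U * s) @ Vt            # best rank-`rank` approximation of X
        X = np.where(mask, M, X_low)    # observed entries stay fixed
    return X

# Toy example: a rank-2 matrix with roughly 30% of entries missing.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(A.shape) > 0.3
A_hat = hard_impute(A, mask, rank=2)
err = np.linalg.norm(A_hat - A) / np.linalg.norm(A)
```

When the data is only low-rank after a nonlinear feature mapping, as the abstract argues for color and multispectral images, this explicit truncation is no longer adequate; the paper's contribution is to perform the analogous low-rank estimation implicitly in the kernel-induced feature space via the kernel trick.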
Pages: 1162-1166 (5 pages)
Related Papers (50 records)
  • [31] Tensor Completion via Nonlocal Low-Rank Regularization
    Xie, Ting
    Li, Shutao
    Fang, Leyuan
    Liu, Licheng
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (06) : 2344 - 2354
  • [33] Robust approximations of low-rank minimization for tensor completion
    Gao, Shangqi
    Zhuang, Xiahai
    [J]. NEUROCOMPUTING, 2020, 379 : 319 - 333
  • [34] Low-Rank Matrix Completion
    Chi, Yuejie
    [J]. IEEE SIGNAL PROCESSING MAGAZINE, 2018, 35 (05) : 178 - 181
  • [35] Nonconvex Low-Rank Symmetric Tensor Completion from Noisy Data
    Cai, Changxiao
    Li, Gen
    Poor, H. Vincent
    Chen, Yuxin
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [36] Low-Rank Autoregressive Tensor Completion for Spatiotemporal Traffic Data Imputation
    Chen, Xinyu
    Lei, Mengying
    Saunier, Nicolas
    Sun, Lijun
    [J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (08) : 12301 - 12310
  • [37] A Riemannian trust-region method for low-rank tensor completion
    Heidel, Gennadij
    Schulz, Volker
    [J]. NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2018, 25 (06)
  • [38] A Nonconvex Method to Low-Rank Matrix Completion
    He, Haizhen
    Cui, Angang
    Yang, Hong
    Wen, Meng
    [J]. IEEE ACCESS, 2022, 10 : 55226 - 55234
  • [39] Robust Low-Rank Tensor Completion Based on Tensor Ring Rank via ℓp,ε-Norm
    Li, Xiao Peng
    So, Hing Cheung
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 3685 - 3698
  • [40] Adaptive Rank Estimation Based Tensor Factorization Algorithm for Low-Rank Tensor Completion
    Liu, Han
    Liu, Jing
    Su, Liyu
    [J]. PROCEEDINGS OF THE 38TH CHINESE CONTROL CONFERENCE (CCC), 2019, : 3444 - 3449