Low Tensor-Ring Rank Completion by Parallel Matrix Factorization

Cited by: 38
|
Authors
Yu, Jinshi [1 ,2 ]
Zhou, Guoxu [1 ,3 ]
Li, Chao [4 ]
Zhao, Qibin [1 ,4 ,5 ]
Xie, Shengli [1 ,6 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
[2] Guangdong Univ Technol, Guangdong Key Lab IoT Informat Technol, Guangzhou 510006, Peoples R China
[3] Guangdong Univ Technol, Minist Educ, Key Lab Intelligent Detect & Internet Things Mfg, Guangzhou 510006, Peoples R China
[4] RIKEN, Ctr Adv Intelligence Project AIP, Tokyo 1030027, Japan
[5] Guangdong Univ Technol, Minist Educ, Joint Int Res Lab Intelligent Informat Proc & Sys, Guangzhou 510006, Peoples R China
[6] Guangdong Univ Technol, Guangdong Hong Kong Macao Joint Lab Smart Discret, Guangzhou 510006, Peoples R China
Funding
Japan Society for the Promotion of Science;
Keywords
Tensile stress; Computational efficiency; Computational modeling; Matrix decomposition; Automation; Complexity theory; Learning systems; Image/video inpainting; tensor completion; tensor-ring (TR) rank; TR decomposition; CANONICAL POLYADIC DECOMPOSITION; UNIQUENESS CONDITIONS; IMAGE;
DOI
10.1109/TNNLS.2020.3009210
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Tensor-ring (TR) decomposition has recently attracted considerable attention for solving the low-rank tensor completion (LRTC) problem. However, due to the unbalanced unfolding scheme used when updating core tensors, conventional TR-based completion methods usually require a large TR rank to achieve optimal performance, which leads to high computational cost in practical applications. To overcome this drawback, we propose a new method in this article to exploit the low TR-rank structure. Specifically, we first introduce a balanced unfolding operation called tensor circular unfolding, through which the relationship between the TR rank and the ranks of tensor unfoldings is established theoretically. Using this new unfolding operation, we then propose an algorithm that exploits the low TR-rank structure by performing parallel low-rank matrix factorizations on all circularly unfolded matrices. To handle nonuniform missing patterns, we apply a row-weighting trick to each circularly unfolded matrix, which significantly improves the method's adaptivity to various types of missing patterns. Extensive experiments demonstrate that the proposed algorithm achieves outstanding performance with a much smaller TR rank than conventional TR-based completion algorithms, while substantially reducing computational cost.
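The tensor circular unfolding described in the abstract can be illustrated with a short sketch. This is a hypothetical NumPy rendering based only on the abstract's description, not the authors' code; the function name `circular_unfold`, the 0-based mode indexing, and the parameter names `k` and `d` are our assumptions. The idea is to cyclically permute d consecutive modes to the rows and the rest to the columns, so that every unfolding is "balanced" and, for a tensor in TR format, has matrix rank bounded by a product of two TR ranks:

```python
import numpy as np

def circular_unfold(tensor, k, d):
    """Illustrative tensor circular unfolding (our sketch, not the authors' code).

    Cyclically permutes the tensor so that the d consecutive modes ending at
    mode k (0-based, indices taken mod N) index the rows and the remaining
    N - d modes index the columns, then reshapes the result into a matrix.
    """
    N = tensor.ndim
    row_modes = [(k - d + 1 + i) % N for i in range(d)]   # modes k-d+1, ..., k
    col_modes = [(k + 1 + i) % N for i in range(N - d)]   # the remaining modes
    permuted = np.transpose(tensor, row_modes + col_modes)
    n_rows = int(np.prod([tensor.shape[m] for m in row_modes]))
    return permuted.reshape(n_rows, -1)

# Build a 4th-order tensor in TR format (all TR ranks equal to 2) by
# contracting four cores in a ring, then check that every circular
# unfolding with d = 2 has matrix rank at most 2 * 2 = 4 -- far below
# the generic rank allowed by its matrix dimensions.
rng = np.random.default_rng(0)
cores = [rng.random((2, n, 2)) for n in (3, 4, 5, 6)]
full = np.einsum('aib,bjc,ckd,dla->ijkl', *cores)
ranks = [np.linalg.matrix_rank(circular_unfold(full, k, 2)) for k in range(4)]
print(ranks)  # each entry is at most 4
```

A completion scheme along the lines of the abstract would then run one low-rank matrix factorization per circular unfolding in parallel and fold the factors back into the tensor; the row-weighting trick, as we read the abstract, reweights the rows of each unfolded matrix to compensate for nonuniformly missing entries.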
Pages: 3020-3033 (14 pages)
Related Papers (50 records in total)
  • [1] Low tensor-ring rank completion: parallel matrix factorization with smoothness on latent space
    Yu, Jinshi; Zou, Tao; Zhou, Guoxu
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (09): 7003-7016
  • [2] Parallel matrix factorization for low-rank tensor completion
    Xu, Yangyang; Hao, Ruru; Yin, Wotao; Su, Zhixun
    INVERSE PROBLEMS AND IMAGING, 2015, 9 (02): 601-624
  • [3] Robust to Rank Selection: Low-Rank Sparse Tensor-Ring Completion
    Yu, Jinshi; Zhou, Guoxu; Sun, Weijun; Xie, Shengli
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (05): 2451-2465
  • [4] Low-rank tensor completion via smooth matrix factorization
    Zheng, Yu-Bang; Huang, Ting-Zhu; Ji, Teng-Yu; Zhao, Xi-Le; Jiang, Tai-Xiang; Ma, Tian-Hui
    APPLIED MATHEMATICAL MODELLING, 2019, 70: 677-695
  • [5] Tensor Factorization for Low-Rank Tensor Completion
    Zhou, Pan; Lu, Canyi; Lin, Zhouchen; Zhang, Chao
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (03): 1152-1163
  • [6] Logarithmic Norm Regularized Low-Rank Factorization for Matrix and Tensor Completion
    Chen, Lin; Jiang, Xue; Liu, Xingzhao; Zhou, Zhixin
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30: 3434-3449
  • [7] Matrix factorization for low-rank tensor completion using framelet prior
    Jiang, Tai-Xiang; Huang, Ting-Zhu; Zhao, Xi-Le; Ji, Teng-Yu; Deng, Liang-Jian
    INFORMATION SCIENCES, 2018, 436: 403-417
  • [8] Imbalanced low-rank tensor completion via latent matrix factorization
    Qiu, Yuning; Zhou, Guoxu; Zeng, Junhua; Zhao, Qibin; Xie, Shengli
    NEURAL NETWORKS, 2022, 155: 369-382
  • [9] Tensor completion using total variation and low-rank matrix factorization
    Ji, Teng-Yu; Huang, Ting-Zhu; Zhao, Xi-Le; Ma, Tian-Hui; Liu, Gang
    INFORMATION SCIENCES, 2016, 326: 243-257