Imbalanced low-rank tensor completion via latent matrix factorization

Cited: 6
Authors
Qiu, Yuning [1 ,2 ]
Zhou, Guoxu [1 ,3 ]
Zeng, Junhua [1 ,2 ]
Zhao, Qibin [1 ,4 ]
Xie, Shengli [1 ,2 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
[2] Sch Automat, Guangdong Hong Kong Macao Joint Lab Smart Discret, Hong Kong 510006, Guangdong, Peoples R China
[3] Minist Educ, Key Lab Intelligent Detect & Internet Things Mfg, Guangzhou 510006, Peoples R China
[4] RIKEN, Tensor Learning Team, Ctr Adv Intelligence Project AIP, Saitama, Japan
Keywords
Tensor analysis; Tensor completion; Tensor ring decomposition; Low-rank tensor recovery; Image/video inpainting; RECOVERY; IMAGE; DECOMPOSITIONS
DOI
10.1016/j.neunet.2022.08.023
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Tensor completion has been widely used in computer vision and machine learning. Most existing tensor completion methods empirically assume that the intrinsic tensor is simultaneously low-rank along all of its modes. However, tensor data recorded in real-world applications may conflict with this assumption; e.g., face images taken from different subjects often lie in a union of low-rank subspaces, which can produce a very high-rank or even full-rank structure in the sample mode. To this end, this paper proposes an imbalanced low-rank tensor completion method, which flexibly estimates the low-rank incomplete tensor by decomposing it into a mixture of multiple latent tensor ring (TR) rank components. Specifically, each latent component is approximated using a low-rank matrix factorization of its TR unfolding matrix. In addition, an efficient proximal alternating minimization algorithm is developed and theoretically proven to have the global convergence property: the whole sequence of iterates converges to a critical point. Extensive experiments on both synthetic and real-world tensor data demonstrate that the proposed method achieves more favorable completion results with lower computational cost than state-of-the-art tensor completion methods. (c) 2022 Elsevier Ltd. All rights reserved.
Pages: 369-382
Page count: 14