Tensor Completion via Fully-Connected Tensor Network Decomposition with Regularized Factors

Cited: 31
Authors
Zheng, Yu-Bang [1 ,2 ]
Huang, Ting-Zhu [1 ]
Zhao, Xi-Le [1 ]
Zhao, Qibin [2 ,3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Math Sci, Chengdu, Peoples R China
[2] RIKEN Ctr Adv Intelligence Project AIP, Tensor Learning Team, Tokyo, Japan
[3] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Tensor decomposition; Tensor completion; Low-rankness; Image processing; Proximal alternating minimization; RANK APPROXIMATION; MATRIX FACTORIZATION; NONCONVEX; RECOVERY; MINIMIZATION;
DOI
10.1007/s10915-022-01841-8
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
The recently proposed fully-connected tensor network (FCTN) decomposition has a powerful ability to capture the low-rankness of tensors and has achieved great success in tensor completion. However, FCTN decomposition-based methods are highly sensitive to the choice of the FCTN-rank and cannot satisfactorily recover local details. In this paper, we propose a novel tensor completion model by introducing a factor-based regularization into the FCTN decomposition framework. The regularization makes the model robust to the choice of the FCTN-rank and simultaneously enforces the global low-rankness and the local continuity of the target tensor. More specifically, by showing that the unfolding matrices of the FCTN factors can reasonably be assumed to be low-rank in the gradient domain and imposing a low-rank matrix factorization (LRMF) on them, the proposed model gains robustness to the choice of the FCTN-rank. By applying Tikhonov regularization to the LRMF factors, the model promotes local continuity and preserves local details of the target tensor. To solve the resulting optimization problem, we develop an efficient proximal alternating minimization (PAM)-based algorithm and theoretically establish its convergence. To reduce the running time of the algorithm, we design an automatic rank-increasing strategy. Numerical experiments demonstrate that the proposed method outperforms its competitors.
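The abstract above combines two ingredients: a low-rank matrix factorization of the factor unfoldings, regularized in the Tikhonov sense, and a proximal alternating minimization (PAM) scheme whose subproblems add a proximal anchor to the previous iterate. As a rough illustration of that optimization pattern (not the authors' actual FCTN model, which works on gradient-domain unfoldings of tensor factors), the following sketch solves a simplified Tikhonov-regularized LRMF, min over U, V of 0.5·||M − UV||² + 0.5·λ(||U||² + ||V||²), with each alternating subproblem augmented by a proximal term (ρ/2)·||U − U_prev||². All names and parameter values here are illustrative assumptions:

```python
import numpy as np

def pam_lrmf(M, r, lam=1e-2, rho=1e-3, iters=100, seed=0):
    """Simplified sketch of PAM for Tikhonov-regularized low-rank matrix
    factorization. Each subproblem is a ridge regression with an added
    proximal term anchoring the iterate at its previous value, so every
    update has a closed-form solution.
    """
    m, n = M.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((r, n))
    I = np.eye(r)
    for _ in range(iters):
        # U-update: argmin_U 0.5||M - U V||^2 + 0.5 lam ||U||^2 + 0.5 rho ||U - U_prev||^2
        U = (M @ V.T + rho * U) @ np.linalg.inv(V @ V.T + (lam + rho) * I)
        # V-update: the symmetric subproblem in V
        V = np.linalg.inv(U.T @ U + (lam + rho) * I) @ (U.T @ M + rho * V)
    return U, V
```

The proximal terms are what give PAM its convergence guarantees for this kind of nonconvex, multi-block objective; the paper's actual subproblems additionally involve difference (gradient) operators and tensor contractions, which this sketch omits.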
Pages: 35
Related Papers
50 records
  • [21] Riemannian preconditioned algorithms for tensor completion via tensor ring decomposition
    Gao, Bin
    Peng, Renfeng
    Yuan, Ya-xiang
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 88 (02) : 443 - 468
  • [23] Infrared Dim and Small Target Detection Based on Superpixel Segmentation and Spatiotemporal Cluster 4D Fully-Connected Tensor Network Decomposition
    Wei, Wenyan
    Ma, Tao
    Li, Meihui
    Zuo, Haorui
    REMOTE SENSING, 2024, 16 (01)
  • [24] Bayesian robust tensor completion via CP decomposition
    Wang, Xiaohang
    Yu, Philip L. H.
    Yang, Weidong
    Su, Jun
    PATTERN RECOGNITION LETTERS, 2022, 163 : 121 - 128
  • [25] Compressing Fully Connected Layers using Kronecker Tensor Decomposition
    Chen, Shaowu
    Sun, Weize
    Huang, Lei
    Yang, Xin
    Huang, Junhao
    PROCEEDINGS OF 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT 2019), 2019, : 308 - 312
  • [26] Tensor Wheel Decomposition and Its Tensor Completion Application
    Wu, Zhong-Cheng
    Huang, Ting-Zhu
    Deng, Liang-Jian
    Dou, Hong-Xia
    Meng, Deyu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [27] One-bit tensor completion via transformed tensor singular value decomposition
    Hou, Jingyao
    Zhang, Feng
    Wang, Jianjun
    APPLIED MATHEMATICAL MODELLING, 2021, 95 : 760 - 782
  • [29] Fast and Accurate Tensor Completion With Total Variation Regularized Tensor Trains
    Ko, Ching-Yun
    Batselier, Kim
    Daniel, Luca
    Yu, Wenjian
    Wong, Ngai
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2020, 29 : 6918 - 6931
  • [30] Non-Local and Fully Connected Tensor Network Decomposition for Remote Sensing Image Denoising
    Tu, Zhihui
    Chen, Shunda
    Lu, Jian
    Li, Lin
    Jiang, Qingtang
    NUMERICAL MATHEMATICS-THEORY METHODS AND APPLICATIONS, 2024, 17 (02): : 379 - 403