Low-Rank Tensor Completion Based on Self-Adaptive Learnable Transforms

Cited by: 10
Authors
Wu, Tongle [1]
Gao, Bin [2]
Fan, Jicong [3,4]
Xue, Jize [5]
Woo, W. L. [6]
Affiliations
[1] Univ Elect Sci & Technol China, Chengdu 611731, Sichuan, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Automat Engn, Chengdu 611731, Sichuan, Peoples R China
[3] Chinese Univ Hong Kong Shenzhen, Shenzhen 518172, Peoples R China
[4] Shenzhen Res Inst Big Data, Shenzhen 518172, Peoples R China
[5] Northwestern Polytech Univ, Sch Automat Engn, Xian 710072, Peoples R China
[6] Northumbria Univ, Dept Comp & Informat Sci, Newcastle Upon Tyne NE1 8ST, England
Funding
National Natural Science Foundation of China;
Keywords
Tensors; Discrete Fourier transforms; Transforms; Frequency-domain analysis; Optimization; Matrix decomposition; Learning systems; Learnable transform; low-rank; self-adaptive; tensor completion; NUCLEAR NORM; FACTORIZATION; MINIMIZATION; IMAGE; REPRESENTATION; RECOVERY; MATRIX;
DOI
10.1109/TNNLS.2022.3215974
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The tensor nuclear norm (TNN), defined as the sum of the nuclear norms of the frontal slices of a tensor in a frequency domain, has proven useful for solving low-rank tensor recovery problems. Existing TNN-based methods use either fixed or data-independent transformations, which may not be optimal for the given tensors; as a consequence, they cannot adaptively exploit the potential low-rank structure of tensor data. In this article, we propose a framework called self-adaptive learnable transform (SALT) to learn a transformation matrix from the given tensor. Specifically, SALT aims to learn a lossless transformation that induces a tensor of lower average rank, where the Schatten-$p$ quasi-norm is used as the rank proxy. Then, to reduce sensitivity to the tensor's orientation, we generalize SALT to the other tensor dimensions (SALTS), namely, learning three self-adaptive transformation matrices simultaneously from the given tensor. SALTS is thus able to adaptively exploit the potential low-rank structures in all directions. We provide a unified optimization framework based on the alternating direction method of multipliers (ADMM) for the SALTS model and theoretically prove the weak convergence property of the proposed algorithm. Experimental results on hyperspectral image (HSI), color video, magnetic resonance imaging (MRI), and COIL-20 datasets show that SALTS is much more accurate in tensor completion than existing methods. The demo code can be found at https://faculty.uestc.edu.cn/gaobin/zh_CN/lwcg/153392/list/index.htm.
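For concreteness, below is a minimal NumPy sketch of the transform-based TNN the abstract describes: an invertible transform is applied along the third mode, and a Schatten-$p$ surrogate is summed over the frontal slices of the transformed tensor. The function names, the `p` parameter, and the random orthogonal matrix standing in for the learned transform are illustrative assumptions, not the authors' released implementation (see the demo-code URL above for that).

```python
# Sketch only: illustrates the transform-based TNN objective, not the
# authors' SALT/SALTS code.
import numpy as np

def transformed_slices(X, L):
    """Mode-3 transform: X_hat[:, :, k] = sum_j L[k, j] * X[:, :, j]."""
    return np.einsum('kj,abj->abk', L, X)

def transform_tnn(X, L, p=1.0):
    """Sum over frontal slices of the Schatten-p quasi-norm (to the p-th
    power); p = 1 recovers the slice-wise nuclear norm, i.e., the TNN."""
    X_hat = transformed_slices(X, L)
    total = 0.0
    for k in range(X_hat.shape[2]):
        s = np.linalg.svd(X_hat[:, :, k], compute_uv=False)
        total += float(np.sum(s ** p))
    return total

# Toy usage: compare the fixed unitary DFT (the classical TNN choice)
# against a random orthogonal -- i.e., lossless -- transform.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 20, 8))
F = np.fft.fft(np.eye(8), norm='ortho')           # unitary DFT matrix
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # random orthogonal matrix
print(transform_tnn(X, F), transform_tnn(X, Q))
```

With `p < 1` the slice-wise sum becomes a nonconvex rank proxy, which is what SALT minimizes jointly over the transform matrix; the DFT baseline corresponds to the classical TNN with a fixed, data-independent transform.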
Pages: 8826-8838
Number of pages: 13
Related Papers
50 items in total; items [21]-[30] shown
  • [21] Li, Zihan; Zhu, Ce; Long, Zhen; Liu, Yipeng. Optimal Low-Rank Tensor Tree Completion. 2023 IEEE 25th International Workshop on Multimedia Signal Processing (MMSP), 2023.
  • [22] Nimishakavi, Madhav; Jawanpuria, Pratik; Mishra, Bamdev. A dual framework for low-rank tensor completion. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31.
  • [23] Tripathi, Ruchi; Mohan, Boda; Rajawat, Ketan. Adaptive Low-Rank Matrix Completion. IEEE Transactions on Signal Processing, 2017, 65 (14): 3603-3616.
  • [24] Xue, Jize; Zhao, Yongqiang; Huang, Shaoguang; Liao, Wenzhi; Chan, Jonathan Cheung-Wai; Kong, Seong G. Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion. IEEE Transactions on Neural Networks and Learning Systems, 2022, 33 (11): 6916-6930.
  • [25] Derksen, Harm. On the equivalence between low-rank matrix completion and tensor rank. Linear & Multilinear Algebra, 2018, 66 (04): 645-667.
  • [26] Cheng, Miaomiao; Jing, Liping; Ng, Michael K. A Weighted Tensor Factorization Method for Low-rank Tensor Completion. 2019 IEEE Fifth International Conference on Multimedia Big Data (BigMM 2019), 2019: 30-38.
  • [27] Qiu, Yuning; Zhou, Guoxu; Zhao, Qibin; Xie, Shengli. Noisy Tensor Completion via Low-Rank Tensor Ring. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35 (01): 1127-1141.
  • [28] Li, Xiao-Tong; Zhao, Xi-Le; Jiang, Tai-Xiang; Zheng, Yu-Bang; Ji, Teng-Yu; Huang, Ting-Zhu. Low-rank tensor completion via combined non-local self-similarity and low-rank regularization. Neurocomputing, 2019, 367: 1-12.
  • [29] He, Jingfei; Zheng, Xunan; Gao, Peng; Zhou, Yatong. Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks. Signal Processing, 2022, 190.
  • [30] Ding, Meng; Huang, Ting-Zhu; Ji, Teng-Yu; Zhao, Xi-Le; Yang, Jing-Hua. Low-Rank Tensor Completion Using Matrix Factorization Based on Tensor Train Rank and Total Variation. Journal of Scientific Computing, 2019, 81 (02): 941-964.