Dictionary Learning With Low-Rank Coding Coefficients for Tensor Completion

Cited by: 29
Authors
Jiang, Tai-Xiang [1 ]
Zhao, Xi-Le [2 ]
Zhang, Hao [2 ]
Ng, Michael K. [3 ]
Affiliations
[1] Southwestern Univ Finance & Econ, Sch Econ Informat Engn, FinTech Innovat Ctr, Chengdu 611130, Sichuan, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Math Sci, Res Ctr Image & Vis Comp, Chengdu 611731, Peoples R China
[3] Univ Hong Kong, Dept Math, Pokfulam, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Tensors; Encoding; Transforms; Dictionaries; Discrete Fourier transforms; Machine learning; Electron tubes; Dictionary learning; low-rank coding; tensor completion; tensor singular value decomposition (t-SVD); IMAGE; MINIMIZATION; FACTORIZATION; NONCONVEX; ALGORITHM; RECOVERY; VIDEO;
DOI
10.1109/TNNLS.2021.3104837
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this article, we propose a novel tensor learning and coding model for third-order data completion. The aim of our model is to learn a data-adaptive dictionary from the given observations and to determine the coding coefficients of third-order tensor tubes. In the completion process, we minimize the low-rankness of each tensor slice containing the coding coefficients. Compared with traditional predefined transform bases, the advantages of the proposed model are that: 1) the dictionary is learned from the given data observations, so the basis can be constructed more adaptively and accurately, and 2) the low-rankness of the coding coefficients allows the dictionary features to be combined linearly more effectively. We also develop a multiblock proximal alternating minimization algorithm for solving the resulting tensor learning and coding model and show that the sequence generated by the algorithm converges globally to a critical point. Extensive experimental results on real-world datasets such as videos, hyperspectral images, and traffic data are reported to demonstrate these advantages and show that the proposed tensor learning and coding method significantly outperforms other tensor completion methods in terms of several evaluation metrics.
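Based only on the abstract, the completion model plausibly takes a form like the following; the symbols D (dictionary), C (coding-coefficient tensor), X (completed tensor), P_Omega (projection onto the observed entries Omega), and O (observed data) are my own notation and may differ from the paper's exact formulation:

\min_{\mathcal{D},\,\mathcal{C},\,\mathcal{X}} \; \sum_{j} \bigl\| \mathcal{C}^{(j)} \bigr\|_{*}
\quad \text{s.t.} \quad \mathcal{X} = \mathcal{D} \cdot \mathcal{C}, \qquad P_{\Omega}(\mathcal{X}) = P_{\Omega}(\mathcal{O}),

where \mathcal{C}^{(j)} denotes the j-th slice of the coding-coefficient tensor, \|\cdot\|_{*} the matrix nuclear norm (a convex surrogate for low rank), and \mathcal{D} \cdot \mathcal{C} the tube-wise application of the learned dictionary.

The NumPy sketch below only illustrates the general pattern described in the abstract, namely alternating updates of a dictionary and low-rank coding coefficients via singular value thresholding, followed by refilling the missing entries. It uses plain alternating least squares rather than the authors' multiblock proximal alternating minimization, and all names and parameters (svt, complete, D, C, r, lam) are illustrative assumptions, not taken from the paper.

import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * (nuclear norm).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def complete(Y, mask, r=20, lam=1e-2, n_iter=50, seed=0):
    # Y:    n1 x n2 x n3 data tensor (missing entries may hold any value)
    # mask: boolean tensor of the same shape, True where Y is observed
    rng = np.random.default_rng(seed)
    n1, n2, n3 = Y.shape
    D = rng.standard_normal((n1, r))             # dictionary applied to mode-1 tubes
    C = 0.01 * rng.standard_normal((r, n2, n3))  # coding coefficients
    X = np.where(mask, Y, 0.0)                   # current completion estimate
    ridge = 1e-6 * np.eye(r)
    for _ in range(n_iter):
        # Coding step: least squares per frontal slice, then SVT so that each
        # coefficient slice stays low rank.
        DtD = D.T @ D + ridge
        for k in range(n3):
            C[:, :, k] = svt(np.linalg.solve(DtD, D.T @ X[:, :, k]), lam)
        # Dictionary step: least squares over all slices at once.
        Cm = C.reshape(r, -1)                    # r x (n2*n3)
        Xm = X.reshape(n1, -1)                   # n1 x (n2*n3)
        D = Xm @ Cm.T @ np.linalg.inv(Cm @ Cm.T + ridge)
        # Completion step: keep observed entries, fill the rest from D and C.
        recon = np.einsum('ir,rjk->ijk', D, C)
        X = np.where(mask, Y, recon)
    return X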
Pages: 932-946 (15 pages)
Related Papers (50 in total)
  • [1] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    [J]. PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [2] Tensor Factorization for Low-Rank Tensor Completion
    Zhou, Pan
    Lu, Canyi
    Lin, Zhouchen
    Zhang, Chao
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (03) : 1152 - 1163
  • [3] Low-Rank Tensor Completion by Approximating the Tensor Average Rank
    Wang, Zhanliang
    Dong, Junyu
    Liu, Xinguo
    Zeng, Xueying
    [J]. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 4592 - 4600
  • [4] Tensor Convolutional Dictionary Learning With CP Low-Rank Activations
    Humbert, Pierre
    Oudre, Laurent
    Vayatis, Nicolas
    Audiffren, Julien
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 785 - 796
  • [5] Iterative tensor eigen rank minimization for low-rank tensor completion
    Su, Liyu
    Liu, Jing
    Tian, Xiaoqing
    Huang, Kaiyu
    Tan, Shuncheng
    [J]. INFORMATION SCIENCES, 2022, 616 : 303 - 329
  • [6] Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data
    Ji, Teng-Yu
    Zhao, Xi-Le
    Sun, Dong-Lin
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1162 - 1166
  • [7] CROSS: EFFICIENT LOW-RANK TENSOR COMPLETION
    Zhang, Anru
    [J]. ANNALS OF STATISTICS, 2019, 47 (02): : 936 - 964
  • [8] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    [J]. BIT NUMERICAL MATHEMATICS, 2014, 54 (02) : 447 - 468
  • [9] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126