Dictionary Learning With Low-Rank Coding Coefficients for Tensor Completion

Cited: 29
Authors
Jiang, Tai-Xiang [1 ]
Zhao, Xi-Le [2 ]
Zhang, Hao [2 ]
Ng, Michael K. [3 ]
Affiliations
[1] Southwestern Univ Finance & Econ, Sch Econ Informat Engn, FinTech Innovat Ctr, Chengdu 611130, Sichuan, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Math Sci, Res Ctr Image & Vis Comp, Chengdu 611731, Peoples R China
[3] Univ Hong Kong, Dept Math, Pokfulam, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Tensors; Encoding; Transforms; Dictionaries; Discrete Fourier transforms; Machine learning; Electron tubes; Dictionary learning; low-rank coding; tensor completion; tensor singular value decomposition (t-SVD); IMAGE; MINIMIZATION; FACTORIZATION; NONCONVEX; ALGORITHM; RECOVERY; VIDEO;
DOI
10.1109/TNNLS.2021.3104837
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this article, we propose a novel tensor learning and coding model for third-order data completion. The aim of our model is to learn a data-adaptive dictionary from the given observations and to determine the coding coefficients of third-order tensor tubes. In the completion process, we minimize the low-rankness of each tensor slice containing the coding coefficients. Compared with a traditional predefined transform basis, the proposed model has two advantages: 1) the dictionary is learned from the given data observations, so the basis can be constructed more adaptively and accurately, and 2) the low-rankness of the coding coefficients allows the dictionary features to be combined linearly in a more effective way. We also develop a multiblock proximal alternating minimization algorithm for solving this tensor learning and coding model and show that the sequence generated by the algorithm converges globally to a critical point. Extensive experimental results on real datasets, such as videos, hyperspectral images, and traffic data, are reported to demonstrate these advantages and show that the proposed tensor learning and coding method significantly outperforms other tensor completion methods in terms of several evaluation metrics.
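The keywords reference the tensor singular value decomposition (t-SVD), whose building block is the tensor-tensor product (t-product) computed face-wise in the Fourier domain along the third (tubal) mode. As a minimal sketch, not the authors' implementation, the t-product of two third-order tensors can be written as:

```python
import numpy as np

def t_product(A, B):
    """t-product underlying the t-SVD framework (illustrative sketch).

    A: (n1, n2, n3), B: (n2, n4, n3) -> C: (n1, n4, n3).
    Each tube is transformed by an FFT along the third mode; the frontal
    slices are then multiplied independently, and the result is brought
    back by the inverse FFT.
    """
    assert A.shape[1] == B.shape[0] and A.shape[2] == B.shape[2]
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    # Face-wise matrix products: C_f[:, :, k] = A_f[:, :, k] @ B_f[:, :, k]
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.real(np.fft.ifft(Cf, axis=2))
```

The paper replaces the fixed discrete Fourier transform in this construction with a dictionary learned from the observed data; the sketch above only shows the classical DFT-based t-product for orientation.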
Pages: 932-946 (15 pages)