Tensor Convolutional Dictionary Learning With CP Low-Rank Activations

Cited by: 1
Authors
Humbert, Pierre [1 ]
Oudre, Laurent [1 ]
Vayatis, Nicolas [1 ]
Audiffren, Julien [2 ]
Affiliations
[1] Univ Paris Saclay, ENS Paris Saclay, CNRS, Ctr Borelli, F-91190 Gif Sur Yvette, France
[2] Univ Fribourg, Cognit & Percept Lab, CH-1700 Fribourg, Switzerland
Keywords
Tensors; Convolution; Convolutional codes; Machine learning; Signal processing algorithms; Convergence; Mathematical models; Convolutional dictionary learning; convolutional sparse coding; tensor; canonical polyadic decomposition; LEAST-SQUARES ALGORITHM; THRESHOLDING ALGORITHM; DECOMPOSITION; FACTORIZATION; REGRESSION;
DOI
10.1109/TSP.2021.3135695
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
In this paper, we propose to extend the standard Convolutional Dictionary Learning (CDL) problem to a tensor representation in which the activations are constrained to be "low-rank" through a Canonical Polyadic decomposition. We show that this additional constraint increases the robustness of CDL with respect to noise and improves the interpretability of the final results. In addition, we discuss in detail the advantages of this representation and introduce two algorithms, based on ADMM and FISTA respectively, that efficiently solve this problem. We show that, by exploiting the low-rank property of the activations, they achieve a lower complexity than the main CDL algorithms. Finally, we evaluate our approach on a wide range of experiments, highlighting the modularity and the advantages of this tensorial low-rank formulation.
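For orientation only, a minimal sketch of the type of objective described in the abstract, under assumed notation (the data tensor \mathcal{X}, atoms \mathbf{d}_k, activation tensors \mathcal{Z}_k, CP rank R, and factor vectors \mathbf{a}_{k,r}, \mathbf{b}_{k,r}, \mathbf{c}_{k,r} are placeholders for this sketch, not the paper's exact notation): a convolutional data-fit term with a sparsity penalty, where each activation tensor is constrained to admit a rank-R Canonical Polyadic decomposition.

\min_{\{\mathbf{d}_k\},\,\{\mathcal{Z}_k\}} \;\; \frac{1}{2}\Big\| \mathcal{X} - \sum_{k=1}^{K} \mathbf{d}_k \ast \mathcal{Z}_k \Big\|_F^2 \;+\; \lambda \sum_{k=1}^{K} \big\| \mathcal{Z}_k \big\|_1
\quad\text{s.t.}\quad \|\mathbf{d}_k\|_2 \le 1, \qquad \mathcal{Z}_k = \sum_{r=1}^{R} \mathbf{a}_{k,r} \circ \mathbf{b}_{k,r} \circ \mathbf{c}_{k,r}, \qquad k = 1,\dots,K,

where \ast denotes convolution and \circ the outer product. In a formulation of this kind, the ADMM- or FISTA-based solvers can update the CP factors rather than the full activation tensors, which is where the lower complexity mentioned in the abstract would come from.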
Pages: 785-796
Page count: 12