Learn the Approximation Distribution of Sparse Coding with Mixture Sparsity Network

Cited: 0
Authors
Li, Li [1 ]
Long, Xiao [2 ,3 ]
Zhuang, Liansheng [1 ,2 ]
Wang, Shafei [4 ]
Affiliations
[1] USTC, Sch Data Sci, Hefei, Peoples R China
[2] USTC, Sch Informat Sci & Technol, Hefei, Peoples R China
[3] Peng Cheng Lab, Shenzhen, Peoples R China
[4] Northern Inst Elect Equipment, Beijing, Peoples R China
Keywords
Sparse coding; Learned ISTA; Mixture Sparsity Network
DOI
10.1007/978-3-030-88013-2_32
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Sparse coding is typically solved by iterative optimization techniques such as the ISTA algorithm. To accelerate estimation, neural networks have been proposed that produce the best possible approximation of the sparse codes by unfolding ISTA and learning its weights. However, owing to the uncertainty in the neural network, one can only obtain a possible approximation with fixed computation cost and tolerable error. Moreover, since sparse coding is an inverse problem, the optimal approximation is often not unique. Motivated by these insights, we propose a novel framework called Learned ISTA with Mixture Sparsity Network (LISTA-MSN) for sparse coding, which learns to predict the distribution of the best possible approximations conditioned on the input data. By sampling from the predicted distribution, LISTA-MSN obtains a more precise approximation of the sparse codes. Experiments on synthetic data and real image data demonstrate the effectiveness of the proposed method.
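The abstract's starting point is classical ISTA, which LISTA unfolds into a learned network. A minimal sketch of plain ISTA for the lasso objective min_x 0.5||Ax-b||^2 + lam*||x||_1 follows; the variable names, dimensions, and parameter values here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def soft_threshold(x, tau):
    # elementwise soft-thresholding: the proximal operator of the l1 norm
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    """ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    LISTA would replace the fixed matrices A.T @ A and A.T (and the
    threshold lam / L) with weights learned per unfolded iteration.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# synthetic example: recover a 2-sparse code from noiseless measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
A /= np.linalg.norm(A, axis=0)             # unit-norm dictionary atoms
x_true = np.zeros(50)
x_true[[3, 17]] = [1.5, -2.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
```

With a small regularization weight and enough iterations, the two largest entries of `x_hat` land on the true support; LISTA reaches comparable accuracy in a fixed, much smaller number of unfolded layers.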
Pages: 387-398 (12 pages)
Related Papers (50 total)
  • [41] Unsupervised Sparsity-based Unmixing of Hyperspectral Imaging Data Using an Online Sparse Coding Dictionary
    Elrewainy, Ahmed
    Sherif, Sherif S.
    IMAGE AND SIGNAL PROCESSING FOR REMOTE SENSING XXIV, 2018, 10789
  • [42] Mixture of von Mises-Fisher distribution with sparse prototypes
    Rossi, Fabrice
    Barbaro, Florian
    NEUROCOMPUTING, 2022, 501: 41-74
  • [43] Signatures for content distribution with network coding
    Zhao, Fang
    Kalker, Ton
    Medard, Muriel
    Han, Keesook J.
    2007 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-7, 2007: 556+
  • [44] Hierarchical fusion network for periocular and iris by neural network approximation and sparse autoencoder
    Algashaam, Faisal
    Kien Nguyen
    Banks, Jasmine
    Chandran, Vinod
    Tuan-Anh Do
    Alkanhal, Mohamed
    Machine Vision and Applications, 2021, 32
  • [45] Hierarchical fusion network for periocular and iris by neural network approximation and sparse autoencoder
    Algashaam, Faisal
    Kien Nguyen
    Banks, Jasmine
    Chandran, Vinod
    Tuan-Anh Do
    Alkanhal, Mohamed
    MACHINE VISION AND APPLICATIONS, 2020, 32 (01)
  • [46] Exploiting Sparse Coding: A Sliding Window Enhancement of a Random Linear Network Coding Scheme
    Garrido, Pablo
    Gomez, David
    Lanza, Jorge
    Aguero, Ramon
    2016 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2016: 757-762
  • [47] Learning the Sparsity for ReRAM: Mapping and Pruning Sparse Neural Network for ReRAM based Accelerator
    Lin, Jilan
    Zhu, Zhenhua
    Wang, Yu
    Xie, Yuan
    24TH ASIA AND SOUTH PACIFIC DESIGN AUTOMATION CONFERENCE (ASP-DAC 2019), 2019: 639-644
  • [48] Markov Chain Model for the Decoding Probability of Sparse Network Coding
    Garrido, Pablo
    Lucani, Daniel E.
    Aguero, Ramon
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2017, 65 (04): 1675-1685
  • [49] Sparse random linear network coding for low latency allcast
    Graham, Mark A.
    Ganesh, Ayalvadi
    Piechocki, Robert J.
    2019 57TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2019: 560-564
  • [50] Adaptive sparse coding based on memristive neural network with applications
    Ji, Xun
    Hu, Xiaofang
    Zhou, Yue
    Dong, Zhekang
    Duan, Shukai
    Cognitive Neurodynamics, 2019, 13: 475-488