A Decomposition Method for Large-Scale Sparse Coding in Representation Learning

Times Cited: 0
Authors
Li, Yifeng [1 ]
Caron, Richard J. [2 ]
Ngom, Alioune [3 ]
Affiliations
[1] Univ British Columbia, CMMT, Child & Family Res Inst, Vancouver, BC V5Z 1M9, Canada
[2] Univ Windsor, Math & Stat, Windsor, ON N9B 3P4, Canada
[3] Univ Windsor, Sch Comp Sci, Windsor, ON N9B 3P4, Canada
Keywords
CLASSIFICATION;
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In representation learning, sparse representation is a parsimonious principle stating that a sample can be approximated by a sparse superposition of dictionary atoms. Sparse coding is the core of this technique. Since the dictionary is often redundant, its size can be very large. Many optimization methods for sparse coding have been proposed in the literature. However, efficiently optimizing over a tremendous number of dictionary atoms remains a bottleneck. In this paper, we propose a decomposition method for large-scale sparse coding models. Our experimental results show that the method is very efficient.
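The abstract does not spell out the algorithmic details, so the following is only a minimal illustrative sketch of the general idea it describes: solving the l1-regularized least-squares sparse coding problem by a coordinate-wise decomposition (cyclic coordinate descent with soft-thresholding), updating one atom's coefficient at a time. The function names sparse_code_cd and soft_threshold, the parameter values, and the toy data are hypothetical and are not taken from the paper.

import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator, the proximal map of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code_cd(D, y, lam=0.1, n_iter=100, tol=1e-6):
    # Illustrative sketch (not the paper's method): solve
    #   min_x 0.5*||y - D x||_2^2 + lam*||x||_1
    # by cyclic coordinate descent, i.e. a decomposition over single
    # coordinates, updating one dictionary atom's coefficient at a time.
    n_atoms = D.shape[1]
    x = np.zeros(n_atoms)
    col_sq_norms = np.sum(D ** 2, axis=0)   # precompute ||d_j||^2
    residual = y - D @ x                    # equals y initially
    for _ in range(n_iter):
        max_change = 0.0
        for j in range(n_atoms):
            if col_sq_norms[j] == 0.0:
                continue
            old_xj = x[j]
            # Correlation of atom j with the partial residual that
            # excludes atom j's current contribution.
            rho = D[:, j] @ residual + col_sq_norms[j] * old_xj
            new_xj = soft_threshold(rho, lam) / col_sq_norms[j]
            if new_xj != old_xj:
                residual += D[:, j] * (old_xj - new_xj)
                x[j] = new_xj
                max_change = max(max_change, abs(new_xj - old_xj))
        if max_change < tol:    # stop when the sweep changed little
            break
    return x

# Toy usage: a redundant dictionary with many more atoms than dimensions.
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 500))
D /= np.linalg.norm(D, axis=0)              # unit-norm atoms
x_true = np.zeros(500)
x_true[rng.choice(500, size=5, replace=False)] = rng.standard_normal(5)
y = D @ x_true
x_hat = sparse_code_cd(D, y, lam=0.05)
print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))

Working-set variants of this idea update only a small active subset of atoms per pass, which is what makes decomposition-style solvers attractive when the dictionary holds a very large number of atoms.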
Pages: 3732-3738
Number of Pages: 7