Smooth sparse coding via marginal regression for learning sparse representations

Cited by: 5
Authors
Balasubramanian, Krishnakumar [1 ]
Yu, Kai [2 ]
Lebanon, Guy [3 ]
Affiliations
[1] Univ Wisconsin, Dept Stat, Madison, WI 53706 USA
[2] Horizon Robotics, Beijing, Peoples R China
[3] LinkedIn, Mountain View, CA USA
Keywords
Sparse coding; Dictionary learning; Vision;
DOI
10.1016/j.artint.2016.04.009
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose and analyze a novel framework for learning sparse representations based on two statistical techniques: kernel smoothing and marginal regression. The proposed approach provides a flexible framework for incorporating feature similarity or temporal information present in data sets via non-parametric kernel smoothing. We provide generalization bounds for dictionary learning using smooth sparse coding and show how the sample complexity depends on the L1 norm of the kernel function used. Furthermore, we propose using marginal regression for obtaining sparse codes, which significantly improves speed and allows one to scale easily to large dictionary sizes. We demonstrate the advantages of the proposed approach, in terms of both accuracy and speed, through extensive experiments on several real data sets. In addition, we demonstrate how the proposed approach can be used to improve semi-supervised sparse coding. (C) 2016 Elsevier B.V. All rights reserved.
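The marginal-regression step described in the abstract can be sketched as follows: correlate each signal with every dictionary atom, then keep only the largest-magnitude coefficients. This is a minimal NumPy illustration under the usual unit-norm-atom assumption, not the authors' implementation, and it omits the kernel-smoothing component; the function name and the sparsity parameter `s` are assumptions for the sketch.

```python
import numpy as np

def marginal_regression_codes(X, D, s):
    """Sparse codes via marginal regression with hard thresholding.

    X: (n_features, n_samples) data matrix.
    D: (n_features, n_atoms) dictionary with unit-norm columns.
    s: number of nonzero coefficients to keep per signal.
    """
    # Marginal regression: one inner product per (atom, signal) pair,
    # instead of solving a full Lasso problem per signal.
    A = D.T @ X
    # Indices of the (n_atoms - s) smallest-magnitude entries per column.
    idx = np.argsort(np.abs(A), axis=0)[:-s, :]
    # Zero everything except the top-s coefficients of each signal.
    np.put_along_axis(A, idx, 0.0, axis=0)
    return A

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
X = D[:, :3] @ rng.standard_normal((3, 5))     # signals spanned by 3 atoms
A = marginal_regression_codes(X, D, s=3)
print((A != 0).sum(axis=0))                    # at most 3 nonzeros per column
```

The speed advantage the abstract refers to comes from this structure: computing `D.T @ X` plus a per-column sort is far cheaper than iterative L1-regularized solvers, which is what makes very large dictionaries practical.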
Pages: 83-95
Page count: 13
Related Papers
50 records
  • [1] Learning Sparse Representations in Reinforcement Learning with Sparse Coding
    Le, Lei
    Kumaraswamy, Raksha
    White, Martha
    [J]. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2067 - 2073
  • [2] Learning Word Representations with Hierarchical Sparse Coding
    Yogatama, Dani
    Faruqui, Manaal
    Dyer, Chris
    Smith, Noah A.
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 87 - 96
  • [3] Learning Image Representations from the Pixel Level via Hierarchical Sparse Coding
    Yu, Kai
    Lin, Yuanqing
    Lafferty, John
    [J]. 2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011, : 1713 - 1720
  • [4] Learning Efficient Data Representations With Orthogonal Sparse Coding
    Schuetze, Henry
    Barth, Erhardt
    Martinetz, Thomas
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2016, 2 (03): : 177 - 189
  • [5] Visual Tracking via Temporally Smooth Sparse Coding
    Liu, Ting
    Wang, Gang
    Wang, Li
    Chan, Kap Luk
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2015, 22 (09) : 1452 - 1456
  • [6] Sparse smooth ridge regression method for supervised learning
    Ren, Weiya
    Li, Guohui
    [J]. Guofang Keji Daxue Xuebao/Journal of National University of Defense Technology, 2015, 37 (06): : 121 - 128
  • [7] Learning Ancestral Atom via Sparse Coding
    Aritake, Toshimitsu
    Hino, Hideitsu
    Murata, Noboru
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2013, 7 (04) : 586 - 594
  • [8] Sparse Bayesian Learning via Stepwise Regression
    Ament, Sebastian
    Gomes, Carla
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [9] Sparse Coding Neural Gas: Learning of overcomplete data representations
    Labusch, Kai
    Barth, Erhardt
    Martinetz, Thomas
    [J]. NEUROCOMPUTING, 2009, 72 (7-9) : 1547 - 1555
  • [10] Learning modular representations from global sparse coding networks
    Dyer, Eva L.
    Johnson, Don H.
    Baraniuk, Richard G.
    [J]. BMC Neuroscience, 11 (Suppl 1)