Learning Sparse Representations in Reinforcement Learning with Sparse Coding

Cited by: 0
Authors
Le, Lei [1]
Kumaraswamy, Raksha [1]
White, Martha [1]
Affiliation
[1] Indiana Univ, Dept Comp Sci, Bloomington, IN 47405 USA
Keywords: none listed
DOI: not available
Chinese Library Classification: TP18 (Artificial Intelligence Theory)
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
A variety of representation learning approaches have been investigated for reinforcement learning; much less attention, however, has been given to investigating the utility of sparse coding. Outside of reinforcement learning, sparse coding representations have been widely used, with non-convex objectives that result in discriminative representations. In this work, we develop a supervised sparse coding objective for policy evaluation. Despite the non-convexity of this objective, we prove that all local minima are global minima, making the approach amenable to simple optimization strategies. We empirically show that it is key to use a supervised objective, rather than the more straightforward unsupervised sparse coding approach. We compare the learned representations to a canonical fixed sparse representation, called tile-coding, demonstrating that the sparse coding representation outperforms a wide variety of tile-coding representations.
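The supervised sparse coding idea described in the abstract can be illustrated with a minimal sketch: learn a dictionary and sparse codes that jointly reconstruct the observations and predict value targets, so the representation is shaped by the prediction task rather than by reconstruction alone. The sketch below, in NumPy, is an assumption-laden stand-in, not the paper's algorithm: the variable names (`B`, `Phi`, `w`), the hyperparameters, the synthetic data, and the ISTA/least-squares alternation are all illustrative choices, and in actual policy evaluation the targets `y` would come from (bootstrapped) returns rather than a fixed linear function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n states with d-dimensional observations and scalar value targets.
# (Illustrative stand-in; in the paper the targets arise from policy evaluation.)
n, d, k = 200, 10, 30            # samples, input dim, dictionary size (overcomplete)
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d)       # hypothetical value targets

lam_sparse, lam_sup = 0.5, 1.0   # sparsity and supervised-loss weights (assumed)

B = rng.normal(size=(k, d))      # dictionary
w = np.zeros(k)                  # value-prediction weights
Phi = np.zeros((n, k))           # sparse codes: the learned representation


def soft_threshold(z, t):
    """Proximal operator of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


for _ in range(300):
    # (1) Update codes Phi by one ISTA step on the joint objective
    #     1/2 ||X - Phi B||_F^2 + lam_sup/2 ||y - Phi w||^2 + lam_sparse ||Phi||_1
    grad = (Phi @ B - X) @ B.T + lam_sup * np.outer(Phi @ w - y, w)
    L = np.linalg.norm(B, 2) ** 2 + lam_sup * (w @ w) + 1e-8  # Lipschitz bound
    Phi = soft_threshold(Phi - grad / L, lam_sparse / L)
    # (2) Update dictionary B and value weights w by least squares given Phi.
    B = np.linalg.lstsq(Phi, X, rcond=None)[0]
    w = np.linalg.lstsq(Phi, y, rcond=None)[0]

sparsity = np.mean(np.isclose(Phi, 0.0))          # fraction of zeroed code entries
recon_err = np.linalg.norm(X - Phi @ B) / np.linalg.norm(X)
```

Dropping the supervised term (`lam_sup = 0`) recovers plain unsupervised sparse coding; the abstract's empirical point is that coupling the codes to the prediction target is what makes the representation useful for policy evaluation.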
Pages: 2067-2073 (7 pages)