Size Adaptation of Separable Dictionary Learning with Information-Theoretic Criteria

Times cited: 0
Authors
Baltoiu, Andra [1]
Dumitrescu, Bogdan [2]
Affiliations
[1] Univ Bucharest, Res Inst Univ Bucharest ICUB, Bucharest 050107, Romania
[2] Univ Politehn Bucuresti, Dept Automat Control & Comp, Bucharest 060042, Romania
Keywords
dictionary learning; information theoretic criteria; Kronecker structure;
DOI
10.1109/CSCS.2019.00009
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
摘要
In sparse representation problems, the size of the dictionary is critical to the performance of the learning algorithm and, apart from loose guidelines concerning dictionary integrity, there is little indication of how to determine the optimal size. Information-theoretic criteria (ITC), generally used for model selection, have recently been employed for this task. This paper extends that work to the case of separable dictionaries by adapting the Extended Renormalized Maximum Likelihood criterion to the 2D model, and proposes an adaptation algorithm that relies almost entirely on the ITC score. The recovered dictionary sizes are, on average, within one atom of the true size, while representation errors are consistently below those obtained when applying dictionary learning with the known size.
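The selection principle described in the abstract can be illustrated with a toy sketch: try several candidate sizes for the factor dictionaries of a separable (Kronecker) model and keep the size that minimizes an information-theoretic score. Everything below is a hypothetical simplification, not the paper's algorithm: a BIC-style penalty stands in for the Extended Renormalized Maximum Likelihood criterion, and random factor dictionaries with a dense least-squares fit stand in for actual dictionary learning with sparse coding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2D patches Y_k = A_true @ X_k @ B_true.T with sparse coefficients X_k,
# i.e. the separable model vec(Y_k) = (B_true kron A_true) vec(X_k).
m = 8                      # patch side length
n_true = 12                # true number of atoms in each factor dictionary
N = 200                    # number of training patches
A_true = rng.standard_normal((m, n_true))
B_true = rng.standard_normal((m, n_true))
X = rng.standard_normal((n_true, n_true, N)) * (rng.random((n_true, n_true, N)) < 0.1)
Y = np.einsum('ij,jkn,lk->iln', A_true, X, B_true)   # Y[:, :, k] = A X_k B^T

def itc_score(residual_sq, n_params, n_data):
    """Simplified BIC-style ITC (hypothetical stand-in for the paper's
    Extended RNML criterion): data-fit term plus model-size penalty."""
    return n_data * np.log(residual_sq / n_data + 1e-12) + n_params * np.log(n_data)

def fit_error(n):
    """Crude separable fit for candidate size n: random n-atom factor
    dictionaries and a least-squares projection (no learning, no sparsity)."""
    A = rng.standard_normal((m, n))
    B = rng.standard_normal((m, n))
    D = np.kron(B, A)                          # separable dictionary, m^2 x n^2
    Yf = Y.reshape(m * m, N, order='F')        # column-major vec of each patch
    C, *_ = np.linalg.lstsq(D, Yf, rcond=None)
    return np.sum((Yf - D @ C) ** 2)

scores = {}
for n in range(4, 20, 2):                      # candidate per-factor sizes
    n_params = 2 * m * n                       # free entries of A and B
    scores[n] = itc_score(fit_error(n), n_params, m * m * N)

best = min(scores, key=scores.get)
print('selected per-factor size:', best)
```

Because the fit here uses random dictionaries rather than learned ones, the selected size reflects only the fit/complexity trade-off of the toy criterion; the paper's actual algorithm interleaves size adaptation with dictionary learning so that the ITC score is evaluated on properly trained dictionaries.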
Pages: 7-11
Number of pages: 5