Size Adaptation of Separable Dictionary Learning with Information-Theoretic Criteria

Cited by: 0
Authors
Baltoiu, Andra [1 ]
Dumitrescu, Bogdan [2 ]
Affiliations
[1] Univ Bucharest, Res Inst Univ Bucharest ICUB, Bucharest 050107, Romania
[2] Univ Politehn Bucuresti, Dept Automat Control & Comp, Bucharest 060042, Romania
Keywords
dictionary learning; information theoretic criteria; Kronecker structure;
DOI
10.1109/CSCS.2019.00009
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In sparse representation problems, the size of the dictionary is critical to the performance of the learning algorithm, and, apart from loose guidelines concerning dictionary integrity, there is little guidance on how to determine the optimal size. Information-theoretic criteria (ITC), generally used for model selection, have recently been employed for this task. This paper extends that work to the case of separable dictionaries by adapting the Extended Renormalized Maximum Likelihood criterion to the 2D model, and proposes an adaptation algorithm that relies almost entirely on the ITC score. The mean recovered dictionary sizes are within one atom of the true size, while representation errors are consistently below those obtained when running dictionary learning with the true size known.
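The abstract does not give the Extended RNML criterion in closed form, but the overall idea — score candidate sizes of the two factor dictionaries of a Kronecker (separable) model with an information-theoretic criterion and keep the best — can be sketched. The snippet below is a minimal illustration, not the authors' algorithm: it substitutes a generic BIC-style penalty for the paper's criterion, uses plain OMP for sparse coding, scores fixed random dictionaries instead of learned ones, and the sparsity level `s` and candidate grid are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def omp_residual(D, y, s):
    """Orthogonal Matching Pursuit: greedily select s atoms for signal y
    and return the final representation residual."""
    residual, support = y.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return residual

def itc_score(Y, D1, D2, s=3):
    """BIC-style score: residual fit plus a complexity penalty growing with
    dictionary size. An illustrative stand-in for the paper's ERNML score."""
    D = np.kron(D2, D1)                    # separable (Kronecker) dictionary
    D = D / np.linalg.norm(D, axis=0)      # normalize atoms
    rss = sum(np.sum(omp_residual(D, y, s) ** 2) for y in Y.T)
    N = Y.size
    k = D1.size + D2.size                  # free parameters of the 2D model
    return N * np.log(rss / N + 1e-12) + k * np.log(N)

# Synthetic 2D signals: 8x8 patches, vectorized into columns of Y.
m1 = m2 = 8
Y = rng.standard_normal((m1 * m2, 200))

# Score candidate factor-dictionary sizes; the adapted size minimizes the ITC.
candidates = [6, 8, 10, 12]
scores = {n: itc_score(Y, rng.standard_normal((m1, n)),
                          rng.standard_normal((m2, n)))
          for n in candidates}
best = min(scores, key=scores.get)
print("selected factor size:", best)
```

In the paper's actual procedure the scored dictionaries would be the output of a separable dictionary learning step at each candidate size, so that the criterion trades off genuine representation quality against model complexity rather than comparing random designs.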
Pages: 7 - 11 (5 pages)