Algorithms for Nonnegative Matrix Factorization with the Kullback–Leibler Divergence

Cited: 0
Authors: Le Thi Khanh Hien; Nicolas Gillis
Affiliation: [1] Université de Mons, Department of Mathematics and Operational Research, Faculté Polytechnique
Source: Journal of Scientific Computing, 2021, 87(3)
Keywords: Nonnegative matrix factorization; Kullback–Leibler divergence; Poisson distribution; Algorithms
DOI: not available
Abstract
Nonnegative matrix factorization (NMF) is a standard linear dimensionality reduction technique for nonnegative data sets. The Kullback–Leibler (KL) divergence is one of the most widely used objective functions for measuring the discrepancy between the input data and the low-rank approximation in NMF. It corresponds to the maximum likelihood estimator when the observed data follow a Poisson distribution, and KL NMF is particularly meaningful for count data sets, such as documents. In this paper, we first collect important properties of the KL objective function that are essential to study the convergence of KL NMF algorithms. Second, in addition to reviewing existing algorithms for solving KL NMF, we propose three new algorithms that guarantee the objective function is non-increasing. We also provide a global convergence guarantee for one of our proposed algorithms. Finally, we conduct extensive numerical experiments to provide a comprehensive picture of the performance of KL NMF algorithms.
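The best-known baseline among KL NMF algorithms is the multiplicative-update (MU) scheme of Lee and Seung, which also guarantees a non-increasing objective. A minimal NumPy sketch of these updates follows; the function names and the small `eps` safeguard against division by zero are illustrative choices, not part of the paper:

```python
import numpy as np

def kl_divergence(V, WH, eps=1e-10):
    # Generalized KL divergence D(V || WH) = sum(V log(V/WH) - V + WH)
    return np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)

def kl_nmf_mu(V, r, n_iter=100, seed=0, eps=1e-10):
    """Lee-Seung multiplicative updates for KL NMF: V (m x n) ~ W (m x r) H (r x n)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 1e-3
    H = rng.random((r, n)) + 1e-3
    losses = []
    for _ in range(n_iter):
        # H update: H <- H * (W^T (V / WH)) / (column sums of W)
        WH = W @ H
        H *= (W.T @ (V / (WH + eps))) / (W.sum(axis=0)[:, None] + eps)
        # W update: W <- W * ((V / WH) H^T) / (row sums of H)
        WH = W @ H
        W *= ((V / (WH + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
        losses.append(kl_divergence(V, W @ H))
    return W, H, losses
```

Each update multiplies the current iterate entrywise by a nonnegative ratio, so the factors stay nonnegative, and each full sweep does not increase the KL objective.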
Related Papers (50 total)
  • [1] Algorithms for Nonnegative Matrix Factorization with the Kullback–Leibler Divergence
    Hien, Le Thi Khanh; Gillis, Nicolas
    Journal of Scientific Computing, 2021, 87(3)
  • [2] Kullback–Leibler Divergence for Nonnegative Matrix Factorization
    Yang, Zhirong; Zhang, He; Yuan, Zhijian; Oja, Erkki
    Artificial Neural Networks and Machine Learning, ICANN 2011, Part I, 2011, 6791: 250–257
  • [3] Algorithms for Nonnegative Matrix Factorization with the β-Divergence
    Févotte, Cédric; Idier, Jérôme
    Neural Computation, 2011, 23(9): 2421–2456
  • [4] The Kullback–Leibler divergence and nonnegative matrices
    Boche, Holger; Stanczak, Slawomir
    IEEE Transactions on Information Theory, 2006, 52(12): 5539–5545
  • [5] Primal-Dual Algorithms for Non-negative Matrix Factorization with the Kullback–Leibler Divergence
    Yanez, Felipe; Bach, Francis
    2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017: 2257–2261
  • [6] Large Basic Cone and Sparse Subspace Constrained Nonnegative Matrix Factorization with Kullback–Leibler Divergence for Data Representation
    Viet-Hang Duong; Manh-Quan Bui; Li, Yung-Hui; Tai, Tzu-Chiang; Wang, Jia-Ching
    IEEE Intelligent Systems, 2019, 34(4): 39–47
  • [7] Convergent Projective Non-negative Matrix Factorization with Kullback–Leibler Divergence
    Hu, Lirui; Dai, Liang; Wu, Jianguo
    Pattern Recognition Letters, 2014, 36: 15–21
  • [8] Sparse Non-negative Matrix Factorization with Generalized Kullback–Leibler Divergence
    Chen, Jingwei; Feng, Yong; Liu, Yang; Tang, Bing; Wu, Wenyuan
    Intelligent Data Engineering and Automated Learning, IDEAL 2016, 2016, 9937: 353–360
  • [9] Feature Nonlinear Transformation Non-Negative Matrix Factorization with Kullback–Leibler Divergence
    Hu, Lirui; Wu, Ning; Li, Xiao
    Pattern Recognition, 2022, 132
  • [10] Online Algorithms for Nonnegative Matrix Factorization with the Itakura–Saito Divergence
    Lefevre, Augustin; Bach, Francis; Févotte, Cédric
    2011 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA), 2011: 313–316