COMPLEX NMF WITH THE GENERALIZED KULLBACK-LEIBLER DIVERGENCE

Cited by: 0
Authors
Kameoka, Hirokazu [1]
Kagami, Hideaki [2]
Yukawa, Masahiro [2]
Affiliations
[1] NTT Corp, NTT Commun Sci Labs, Tokyo, Japan
[2] Keio Univ, Dept Elect & Elect Engn, Tokyo, Japan
Keywords
Audio source separation; non-negative matrix factorization (NMF); Complex NMF; generalized Kullback-Leibler (KL) divergence
DOI
Not available
Chinese Library Classification (CLC)
O42 [Acoustics]
Subject classification codes
070206; 082403
Abstract
We previously introduced a phase-aware variant of the non-negative matrix factorization (NMF) approach for audio source separation, which we call "Complex NMF (CNMF)." This approach makes it possible to realize NMF-like signal decompositions in the complex time-frequency domain. One limitation of the CNMF framework is that its divergence measure is limited to the Euclidean distance. Previous studies have revealed that, for source separation tasks with NMF, the generalized Kullback-Leibler (KL) divergence tends to yield higher accuracy than other divergence measures. This motivated us to expect that CNMF could achieve even greater source separation accuracy if we could derive an algorithm for a KL divergence counterpart of CNMF. In this paper, we start by defining the notion of the "dual" form of the CNMF formulation, derived from the original Euclidean CNMF, and show that a KL divergence counterpart of CNMF can be developed based on this dual formulation. We call this "KL-CNMF." We further derive a convergence-guaranteed iterative algorithm for KL-CNMF based on a majorization-minimization scheme. Source separation experiments revealed that the proposed KL-CNMF yielded higher accuracy than the Euclidean CNMF and than NMF with various divergences.
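For context, the generalized KL divergence between nonnegative arrays Y and X is D_KL(Y | X) = sum_{f,t} ( y_ft log(y_ft / x_ft) - y_ft + x_ft ), which is nonnegative and vanishes only when Y = X. The sketch below shows the classical majorization-minimization (MM) multiplicative updates for magnitude-domain KL-NMF (Lee and Seung), the precursor of the MM scheme the abstract mentions. It is background only, not the paper's KL-CNMF algorithm (whose updates act on complex-valued spectra), and the function and variable names are illustrative assumptions.

    # Background sketch: classical KL-NMF multiplicative updates (Lee & Seung),
    # the magnitude-domain precursor of the paper's KL-CNMF. Names are
    # illustrative, not taken from the paper.
    import numpy as np

    def kl_nmf(V, K, n_iter=200, eps=1e-12, seed=0):
        """Factorize a nonnegative matrix V (F x T) as W @ H (F x K times
        K x T) by minimizing the generalized KL divergence D_KL(V || WH)."""
        rng = np.random.default_rng(seed)
        F, T = V.shape
        W = rng.random((F, K)) + eps
        H = rng.random((K, T)) + eps
        ones = np.ones_like(V)
        for _ in range(n_iter):
            # MM update for H: H <- H * (W^T (V / WH)) / (W^T 1)
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
            # MM update for W: W <- W * ((V / WH) H^T) / (1 H^T)
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / (ones @ H.T + eps)
        return W, H

    def gen_kl(Y, X, eps=1e-12):
        # Generalized KL divergence: sum(y*log(y/x) - y + x); zero iff Y == X.
        return float(np.sum(Y * np.log((Y + eps) / (X + eps)) - Y + X))

Running kl_nmf on a magnitude spectrogram and tracking gen_kl(V, W @ H) across iterations should show a monotonically non-increasing divergence, since each multiplicative step minimizes a majorizing upper bound of the objective; the paper establishes an analogous convergence guarantee for its complex-valued KL-CNMF updates.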
Pages: 56-60
Page count: 5