COMPLEX NMF WITH THE GENERALIZED KULLBACK-LEIBLER DIVERGENCE

Cited by: 0
Authors
Kameoka, Hirokazu [1 ]
Kagami, Hideaki [2 ]
Yukawa, Masahiro [2 ]
Affiliations
[1] NTT Corp, NTT Commun Sci Labs, Tokyo, Japan
[2] Keio Univ, Dept Elect & Elect Engn, Tokyo, Japan
Keywords
Audio source separation; non-negative matrix factorization (NMF); Complex NMF; generalized Kullback-Leibler (KL) divergence
DOI
Not available
CLC number
O42 [Acoustics]
Subject classification codes
070206; 082403
Abstract
We previously introduced a phase-aware variant of the non-negative matrix factorization (NMF) approach for audio source separation, which we call "Complex NMF (CNMF)." This approach makes it possible to realize NMF-like signal decompositions in the complex time-frequency domain. One limitation of the CNMF framework is that the divergence measure is limited to the Euclidean distance. Previous studies have shown that, for source separation tasks with NMF, the generalized Kullback-Leibler (KL) divergence tends to yield higher accuracy than other divergence measures. This motivated us to believe that CNMF could achieve even greater source separation accuracy if we could derive an algorithm for a KL divergence counterpart of CNMF. In this paper, we start by defining a "dual" form of the CNMF formulation, derived from the original Euclidean CNMF, and show that a KL divergence counterpart of CNMF can be developed on the basis of this dual formulation. We call this "KL-CNMF." We further derive a convergence-guaranteed iterative algorithm for KL-CNMF based on a majorization-minimization scheme. Source separation experiments revealed that the proposed KL-CNMF yielded higher accuracy than the Euclidean CNMF and NMF with various divergences.
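For context, the generalized KL divergence between non-negative arrays V and Lambda is usually defined as

    D(V \| \Lambda) = \sum_{f,t} \left( V_{ft} \log \frac{V_{ft}}{\Lambda_{ft}} - V_{ft} + \Lambda_{ft} \right),

which reduces element-wise to v log(v/lambda) - v + lambda. The sketch below implements the standard real-valued KL-NMF multiplicative updates (Lee and Seung), which minimize D(V || WH) for a magnitude spectrogram V. It is a baseline illustration only, not the paper's KL-CNMF algorithm, whose updates operate on complex spectra and additionally estimate phase; those updates are not reproduced in this record. The function name kl_nmf and all parameter choices are illustrative assumptions.

import numpy as np

def kl_nmf(V, rank, n_iter=200, eps=1e-9, seed=0):
    # Minimal KL-NMF via multiplicative updates (Lee & Seung, 2001).
    # Minimizes the generalized KL divergence D(V || W @ H) for a
    # non-negative matrix V (e.g., a magnitude spectrogram). This is
    # the conventional real-valued baseline, NOT the KL-CNMF of the
    # paper above, which works in the complex time-frequency domain.
    rng = np.random.default_rng(seed)
    F, T = V.shape
    W = rng.random((F, rank)) + eps   # spectral bases, F x rank
    H = rng.random((rank, T)) + eps   # activations, rank x T
    ones = np.ones_like(V)
    for _ in range(n_iter):
        WH = W @ H + eps                            # current model
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)  # update bases
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)  # update activations
    return W, H

Under these multiplicative updates the generalized KL divergence is non-increasing, which is the same majorization-minimization principle the paper extends to obtain its convergence-guaranteed KL-CNMF algorithm. For audio, V would typically be the magnitude of an STFT; KL-CNMF instead factorizes the complex spectrogram directly.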
Pages: 56-60
Page count: 5