Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization

Cited by: 135
Authors
Cichocki, Andrzej [1 ,2 ]
Cruces, Sergio [3 ]
Amari, Shun-ichi [4 ]
Affiliations
[1] RIKEN, Brain Sci Inst, Lab Adv Brain Signal Proc, Wako, Saitama 3510198, Japan
[2] PAS, Intelligent Syst Lab, Syst Res Inst, PL-01447 Warsaw, Poland
[3] Univ Seville, Dpto Teoria Senal & Comunicac, Seville 41092, Spain
[4] RIKEN BSI, Lab Math Neurosci, Wako, Saitama 3510198, Japan
Keywords
nonnegative matrix factorization (NMF); robust multiplicative NMF algorithms; similarity measures; generalized divergences; Alpha-, Beta-, Gamma-divergences; extended Itakura-Saito like divergences; generalized Kullback-Leibler divergence; INFORMATION GEOMETRY; MAXIMUM-LIKELIHOOD; EM ALGORITHM; RECONSTRUCTION; MODELS; TENSOR; ISRA
DOI
10.3390/e13010134
Chinese Library Classification
O4 [Physics]
Subject classification code
0702
Abstract
We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF. Owing to the extra degrees of freedom gained by tuning the parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.
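To make the abstract's ideas concrete, here is a minimal NumPy sketch. It assumes the commonly cited closed form of the AB-divergence for alpha, beta, alpha+beta all nonzero (the remaining parameter cases are defined in the paper by continuity and are omitted here), and it implements the standard multiplicative updates for the beta-divergence (the alpha = 1 slice of the family, covering Lee-Seung, EMML, and Itakura-Saito as special cases), not the paper's full AB-NMF update. The helper names `ab_divergence` and `beta_nmf` are illustrative, not from the paper.

```python
import numpy as np

def ab_divergence(P, Q, alpha, beta, eps=1e-12):
    """AB-divergence D_AB^{(alpha,beta)}(P || Q) for alpha, beta,
    alpha+beta all nonzero; (alpha, beta) = (1, 1) reduces to half the
    squared Euclidean distance."""
    P = np.maximum(np.asarray(P, dtype=float), eps)
    Q = np.maximum(np.asarray(Q, dtype=float), eps)
    s = alpha + beta
    term = P**alpha * Q**beta - (alpha / s) * P**s - (beta / s) * Q**s
    return -term.sum() / (alpha * beta)

def beta_nmf(V, rank, beta=1.0, n_iter=200, seed=0, eps=1e-12):
    """Multiplicative updates for beta-divergence NMF:
    beta = 2 -> Euclidean (Lee-Seung), beta = 1 -> generalized KL (EMML),
    beta = 0 -> Itakura-Saito."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (WH**(beta - 2.0) * V)) / (W.T @ WH**(beta - 1.0) + eps)
        WH = W @ H + eps
        W *= ((WH**(beta - 2.0) * V) @ H.T) / (WH**(beta - 1.0) @ H.T + eps)
    return W, H
```

For example, `ab_divergence(np.array([1.0, 2.0]), np.array([2.0, 1.0]), 1.0, 1.0)` equals half the squared Euclidean distance, here 1.0; sliding (alpha, beta) away from (1, 1) changes how heavily large residuals (e.g. outliers) are weighted, which is the robustness knob the abstract refers to.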
Pages: 134 - 170
Page count: 37
Related papers
50 in total
  • [1] Deep Nonnegative Matrix Factorization With Beta Divergences
    Leplat, Valentin
    Hien, Le T. K.
    Onwunta, Akwum
    Gillis, Nicolas
    [J]. NEURAL COMPUTATION, 2024, 36 (11) : 2365 - 2402
  • [2] Symmetric Nonnegative Matrix Factorization With Beta-Divergences
    Shi, Min
    Yi, Qingming
    Lv, Jun
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2012, 19 (08) : 539 - 542
  • [3] Online Nonnegative Matrix Factorization with General Divergences
    Zhao, Renbo
    Tan, Vincent Y. F.
    Xu, Huan
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 37 - 45
  • [4] Robust Kernel Nonnegative Matrix Factorization
    Xia, Zhichen
    Ding, Chris
    Chow, Edmond
    [J]. 12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2012), 2012, : 522 - 529
  • [5] Robust Manifold Nonnegative Matrix Factorization
    Huang, Jin
    Nie, Feiping
    Huang, Heng
    Ding, Chris
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2014, 8 (03)
  • [6] Generalized Separable Nonnegative Matrix Factorization
    Pan, Junjun
    Gillis, Nicolas
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (05) : 1546 - 1561
  • [7] Robust nonnegative matrix factorization with structure regularization
    Huang, Qi
    Yin, Xuesong
    Chen, Songcan
    Wang, Yigang
    Chen, Bowen
    [J]. NEUROCOMPUTING, 2020, 412 : 72 - 90
  • [8] Robust Graph Regularized Nonnegative Matrix Factorization
    Huang, Qi
    Zhang, Guodao
    Yin, Xuesong
    Wang, Yigang
    [J]. IEEE ACCESS, 2022, 10 : 86962 - 86978
  • [9] A generalized divergence measure for nonnegative matrix factorization
    Kompass, Raul
    [J]. NEURAL COMPUTATION, 2007, 19 (03) : 780 - 791
  • [10] GENERALIZED INTERVAL VALUED NONNEGATIVE MATRIX FACTORIZATION
    Kohjima, Masahiro
    Matsubayashi, Tatsushi
    Toda, Hiroyuki
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3412 - 3416