On the Global Convergence of (Fast) Incremental Expectation Maximization Methods

Cited by: 0
Authors
Karimi, Belhal [1 ]
Wai, Hoi-To [2 ]
Moulines, Eric [1 ]
Lavielle, Marc [3 ]
Affiliations
[1] Ecole Polytech, CMAP, Palaiseau, France
[2] Chinese Univ Hong Kong, Shatin, Hong Kong, Peoples R China
[3] INRIA Saclay, Palaiseau, France
Keywords
EM ALGORITHM;
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The EM algorithm is one of the most popular algorithms for inference in latent data models. The original formulation of the EM algorithm does not scale to large data sets, because the whole data set is required at each iteration of the algorithm. To alleviate this problem, Neal and Hinton [1998] proposed an incremental version of EM (iEM) in which, at each iteration, the conditional expectation of the latent data (E-step) is updated only for a mini-batch of observations. Another approach was proposed by Cappé and Moulines [2009], in which the E-step is replaced by a stochastic approximation step closely related to stochastic gradient descent. In this paper, we analyze the incremental and stochastic versions of the EM algorithm, as well as the variance-reduced version of Chen et al. [2018], in a common unifying framework. We also introduce a new incremental version, inspired by the SAGA algorithm of Defazio et al. [2014]. We establish non-asymptotic bounds for the global convergence of these methods. Numerical applications are presented to illustrate our findings.
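To make the two E-step variants described in the abstract concrete, here is a minimal sketch (not the authors' code) contrasting the incremental update of iEM with the stochastic-approximation update of online EM, on a toy two-component Gaussian mixture with known unit variances and equal weights; the model, function names, batch size, and step-size schedule are all illustrative assumptions.

```python
import numpy as np

# Toy latent-variable model: mixture of two unit-variance Gaussians with
# unknown means and known equal weights. The E-step reduces to per-sample
# responsibilities; the M-step is a responsibility-weighted average.
rng = np.random.default_rng(0)
n = 1000
z = rng.integers(0, 2, size=n)
y = rng.normal(loc=np.where(z == 0, -2.0, 2.0), scale=1.0)

def suff_stat(y_i, mu):
    """Per-sample E-step: responsibility-weighted sufficient statistics."""
    log_w = -0.5 * (y_i[:, None] - mu[None, :]) ** 2
    w = np.exp(log_w - log_w.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Stack (weight, weighted observation) per component: shape (batch, 2, 2).
    return np.stack([w, w * y_i[:, None]], axis=-1)

def m_step(s_bar):
    """M-step: new component means from averaged sufficient statistics."""
    return s_bar[:, 1] / np.maximum(s_bar[:, 0], 1e-12)

batch = 50

# --- iEM (Neal & Hinton style): store one statistic per sample, refresh a
# --- mini-batch per iteration, and keep the running average exact.
mu = np.array([-0.5, 0.5])
s_i = suff_stat(y, mu)       # per-sample statistics kept in memory
s_bar = s_i.mean(axis=0)     # exact running average over all samples
for it in range(200):
    idx = rng.choice(n, size=batch, replace=False)
    new = suff_stat(y[idx], mu)
    s_bar += (new - s_i[idx]).sum(axis=0) / n
    s_i[idx] = new
    mu = m_step(s_bar)
print("iEM estimate:", mu)

# --- Online EM (Cappé & Moulines style): a Robbins-Monro stochastic
# --- approximation of the averaged statistic; no per-sample storage.
mu = np.array([-0.5, 0.5])
s_bar = suff_stat(y[:batch], mu).mean(axis=0)
for it in range(1, 201):
    gamma = 1.0 / (it + 1) ** 0.6    # decreasing step size, illustrative
    idx = rng.choice(n, size=batch, replace=False)
    s_bar += gamma * (suff_stat(y[idx], mu).mean(axis=0) - s_bar)
    mu = m_step(s_bar)
print("online EM estimate:", mu)
```

The variance-reduced and SAGA-inspired variants analyzed in the paper build on the same sufficient-statistic recursions, combining stored per-sample (or snapshot) statistics with a stochastic-approximation step; they are omitted from this sketch for brevity.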
Pages: 11
Related Papers (50 in total)
  • [1] Fast incremental expectation maximization for finite-sum optimization: nonasymptotic convergence
    Fort, G.
    Gach, P.
    Moulines, E.
    STATISTICS AND COMPUTING, 2021, 31 (04)
  • [2] A FAST CONVERGENCE PHASE ESTIMATION METHOD BASED ON EXPECTATION-MAXIMIZATION ALGORITHM
    Wang Ge
    Yu Hong-Yi
    4TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTER THEORY AND ENGINEERING (ICACTE 2011), 2011, : 309 - 312
  • [3] GLOBAL CONVERGENCE RATE OF PROXIMAL INCREMENTAL AGGREGATED GRADIENT METHODS
    Vanli, N. D.
    Gurbuzbalaban, M.
    Ozdaglar, A.
    SIAM JOURNAL ON OPTIMIZATION, 2018, 28 (02) : 1282 - 1300
  • [4] Global Convergence Rate of Incremental Aggregated Gradient Methods for Nonsmooth Problems
    Vanli, N. Denizcan
    Gurbuzbalaban, Mert
    Ozdaglar, Asuman
    2016 IEEE 55TH CONFERENCE ON DECISION AND CONTROL (CDC), 2016, : 173 - 178
  • [5] An investigation of convergence rates in expectation maximization (EM) iterative reconstruction
    Liu, Z
    Obi, T
    Yamaguchi, M
    Ohyama, N
    1999 IEEE NUCLEAR SCIENCE SYMPOSIUM - CONFERENCE RECORD, VOLS 1-3, 1999, : 1412 - 1417
  • [6] Convergence of the Monte Carlo expectation maximization for curved exponential families
    Fort, G
    Moulines, E
    ANNALS OF STATISTICS, 2003, 31 (04) : 1220 - 1259
  • [7] SPEAKER LOCALIZATION AND SEPARATION USING INCREMENTAL DISTRIBUTED EXPECTATION-MAXIMIZATION
    Dorfan, Yuval
    Cherkassky, Dani
    Gannot, Sharon
    2015 23RD EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2015, : 1256 - 1260
  • [8] Global Analysis of Expectation Maximization for Mixtures of Two Gaussians
    Xu, Ji
    Hsu, Daniel
    Maleki, Arian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29