An Expectation-Maximization Algorithm to Compute a Stochastic Factorization From Data

Cited by: 0
Authors
Barreto, Andre M. S. [1 ]
Beirigo, Rafael L. [1 ]
Pineau, Joelle [2 ]
Precup, Doina [2 ]
Affiliations
[1] Lab Nacl Comp Cient, Petropolis, RJ, Brazil
[2] McGill Univ, Montreal, PQ, Canada
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
HIDDEN MARKOV-MODELS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
When a transition probability matrix is represented as the product of two stochastic matrices, swapping the factors of the multiplication yields another transition matrix that retains some fundamental characteristics of the original. Since the new matrix can be much smaller than its precursor, replacing the original matrix with the smaller one can lead to significant savings in computational effort. This strategy, dubbed the "stochastic-factorization trick," can be used to compute the stationary distribution of a Markov chain, to determine the fundamental matrix of an absorbing chain, and to compute a decision policy via dynamic programming or reinforcement learning. In this paper we show that the stochastic-factorization trick can also provide benefits in terms of the number of samples needed to estimate a transition matrix. We introduce a probabilistic interpretation of a stochastic factorization and build on the resulting model to develop an algorithm to compute the factorization directly from data. If the transition matrix can be well approximated by a low-order stochastic factorization, estimating its factors instead of the original matrix significantly reduces the number of parameters to be estimated. Thus, when compared to estimating the transition matrix directly via maximum likelihood, the proposed method is able to compute approximations of roughly the same quality using less data. We illustrate the effectiveness of the proposed algorithm by using it to help a reinforcement learning agent learn how to play the game of blackjack.
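The "stochastic-factorization trick" described in the abstract can be illustrated concretely. The sketch below (a minimal illustration, not the authors' EM algorithm; the matrices are randomly generated for demonstration) factors an n x n transition matrix as P = DK with D of size n x m and K of size m x n, then works with the much smaller m x m matrix KD: if mu_bar is the stationary distribution of KD, then mu = mu_bar K is the stationary distribution of DK.

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomly generate the two stochastic factors: D is n x m, K is m x n.
# (Hypothetical sizes chosen for illustration; m << n is the interesting case.)
n, m = 8, 3
D = rng.random((n, m)); D /= D.sum(axis=1, keepdims=True)
K = rng.random((m, n)); K /= K.sum(axis=1, keepdims=True)

P = D @ K        # original n x n transition matrix
P_bar = K @ D    # swapped m x m matrix -- much smaller when m << n

def stationary(T, iters=10_000):
    """Stationary distribution of a stochastic matrix via power iteration."""
    mu = np.full(T.shape[0], 1.0 / T.shape[0])
    for _ in range(iters):
        mu = mu @ T
    return mu

# Solve the small problem, then map its solution back through K.
mu_bar = stationary(P_bar)   # stationary distribution of KD (length m)
mu = mu_bar @ K              # stationary distribution of DK (length n)

assert np.allclose(mu @ P, mu)          # mu is indeed stationary for P
assert np.allclose(mu, stationary(P))   # and matches the direct computation
```

The identity behind the last two assertions: if mu_bar(KD) = mu_bar, then (mu_bar K)(DK) = mu_bar(KD)K = mu_bar K, so mu_bar K is a fixed point of DK. Power iteration converges here because the randomly generated matrices are strictly positive, hence the chains are ergodic.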
Pages: 3329-3336 (8 pages)