A multimodal variational approach to learning and inference in switching state space models

Cited by: 0
Authors
Lee, LJ [1]
Attias, H [1]
Deng, L [1]
Fieguth, P [1]
Affiliation
[1] Univ Waterloo, Waterloo, ON N2L 3G1, Canada
DOI
None available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
An important general model for discrete-time signal processing is the switching state space (SSS) model, which generalizes both the hidden Markov model and the Gaussian state space model. Inference and parameter estimation in this model are known to be computationally intractable. This paper presents a powerful new approximation to the SSS model, based on a variational technique that preserves the multimodal nature of the continuous state posterior distribution. By further incorporating a windowing technique, the resulting EM algorithm has complexity that is only linear in the length of the time series. An alternative Viterbi decoding with frame-based likelihood is also presented, which is crucial for the speech application that originally motivated this work. Our experiments demonstrate the effectiveness of the algorithm through extensive simulations; a typical example in speech processing is also included to show the potential of this approach for practical applications.
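The SSS model referred to in the abstract combines a discrete Markov chain (as in an HMM) with regime-dependent linear-Gaussian dynamics (as in a Gaussian state space model). A minimal generative sketch is shown below; the two-regime scalar setup and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Switching state space (SSS) model: a discrete Markov switch s_t selects
# which linear-Gaussian dynamics drive the continuous state x_t.
A_TRANS = np.array([[0.95, 0.05],   # switch transitions P(s_t | s_{t-1})
                    [0.10, 0.90]])
F = [0.99, 0.50]                    # per-regime dynamics: x_t = F[s_t] * x_{t-1} + w_t
Q = [0.1, 1.0]                      # per-regime process-noise variance of w_t
R = 0.2                             # observation-noise variance: y_t = x_t + v_t

def simulate_sss(T):
    """Draw one length-T sample path (switch states, continuous states, observations)."""
    s = np.zeros(T, dtype=int)
    x = np.zeros(T)
    y = np.zeros(T)
    y[0] = x[0] + rng.normal(0.0, np.sqrt(R))
    for t in range(1, T):
        s[t] = rng.choice(2, p=A_TRANS[s[t - 1]])
        x[t] = F[s[t]] * x[t - 1] + rng.normal(0.0, np.sqrt(Q[s[t]]))
        y[t] = x[t] + rng.normal(0.0, np.sqrt(R))
    return s, x, y

s, x, y = simulate_sss(500)
```

Exact posterior inference over (s, x) given y is intractable because the number of regime sequences grows as 2^T; the paper's variational approximation keeps a multimodal posterior over x while remaining tractable, and the windowing technique caps the cost at linear in T.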
Pages: 505 - 508
Page count: 4
Related Papers
50 entries in total
  • [1] Variational inference and learning for segmental switching state space models of hidden speech dynamics
    Lee, LJ
    Attias, H
    Deng, L
    2003 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL I, PROCEEDINGS: SPEECH PROCESSING I, 2003, : 872 - 875
  • [2] Variational learning for switching state-space models
    Ghahramani, Z
    Hinton, GE
    NEURAL COMPUTATION, 2000, 12 (04) : 831 - 864
  • [3] Structured Variational Bayesian Inference for Gaussian State-Space Models With Regime Switching
    Petetin, Yohan
    Janati, Yazid
    Desbouvries, Francois
    IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 1953 - 1957
  • [4] Variational Bayesian inference of linear state space models
    Pan, Chuanchao
    Wang, Jingzhuo
    Dong, Zijian
    JOURNAL OF ENGINEERING-JOE, 2019, 2019 (23): : 8531 - 8534
  • [5] Scalable Bayesian Learning for State Space Models using Variational Inference with SMC Samplers
    Hirt, Marcel
    Dellaportas, Petros
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 76 - 86
  • [6] State inference in variational Bayesian nonlinear state-space models
    Raiko, T
    Tornio, M
    Honkela, A
    Karhunen, J
    INDEPENDENT COMPONENT ANALYSIS AND BLIND SIGNAL SEPARATION, PROCEEDINGS, 2006, 3889 : 222 - 229
  • [7] Structured Variational Inference in Bayesian State-Space Models
    Wang, Honggang
    Yang, Yun
    Pati, Debdeep
    Bhattacharya, Anirban
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [8] Strategies for sequential inference in factorial switching state space models
    Cemgil, A. Taylan
    2007 IEEE International Conference on Acoustics, Speech, and Signal Processing, Vol II, Pts 1-3, 2007, : 513 - 516
  • [9] Learning Generative State Space Models for Active Inference
    Catal, Ozan
    Wauthier, Samuel
    De Boom, Cedric
    Verbelen, Tim
    Dhoedt, Bart
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2020, 14
  • [10] A structured variational learning approach for switching latent factor models
    Mohamed Saidane
    Christian Lavergne
    AStA Advances in Statistical Analysis, 2007, 91 : 245 - 268