Noisy Hidden Markov Models for Speech Recognition

Cited by: 0
Authors
Audhkhasi, Kartik [1 ]
Osoba, Osonde [1 ]
Kosko, Bart [1 ]
Affiliations
[1] Univ So Calif, Dept Elect Engn, Signal & Image Proc Inst, Los Angeles, CA 90089 USA
Keywords
Hidden Markov model; Expectation-Maximization algorithm; noisy EM algorithm; stochastic resonance; speech recognition; noise injection
DOI
none
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We show that noise can speed training in hidden Markov models (HMMs). The new Noisy Expectation-Maximization (NEM) algorithm shows how to inject noise when learning the maximum-likelihood estimate of the HMM parameters; it applies because the underlying Baum-Welch training algorithm is a special case of the Expectation-Maximization (EM) algorithm. The NEM theorem gives a sufficient condition for such an average noise boost. The condition reduces to a simple quadratic constraint on the noise when the HMM uses a Gaussian mixture model at each state. Simulations show that a noisy HMM converges faster than a noiseless HMM on the TIMIT data set.
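The quadratic noise constraint the abstract mentions can be sketched in a few lines. This is a minimal illustration, not the authors' code: it uses the per-dimension sufficient condition n_i (n_i - 2(mu_ji - y_i)) <= 0 reported in the NEM literature for Gaussian mixture models, and the choice to zero out noise samples that violate the condition (rather than resample) is an assumption made here for simplicity; the function name `nem_noise` is hypothetical.

```python
import numpy as np

def nem_noise(y, means, sigma_n, rng):
    """Sample additive noise that satisfies a quadratic NEM-style condition.

    y       : observation vector, shape (d,)
    means   : GMM component means, shape (K, d)
    sigma_n : standard deviation of the candidate Gaussian noise
    rng     : a numpy Generator

    A candidate noise sample n is kept, per dimension i, only if
        n_i * (n_i - 2 * (mu_{j,i} - y_i)) <= 0  for every component j,
    i.e. the noise nudges the observation toward all component means.
    Dimensions that violate the condition get zero noise (an assumed
    fallback; the theorem only requires the condition on average).
    """
    n = rng.normal(scale=sigma_n, size=y.shape)          # candidate noise, shape (d,)
    # Broadcast against all K means at once: shape (K, d), then require
    # the inequality to hold for every component j (axis 0).
    ok = np.all(n * (n - 2.0 * (means - y)) <= 0.0, axis=0)
    return np.where(ok, n, 0.0)
```

During a noisy E-step one would add `nem_noise(y, means, sigma_n, rng)` to each observation before computing responsibilities, typically with `sigma_n` annealed toward zero over iterations so the noisy EM estimate converges to the noiseless one.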
Pages: 6