Hidden Gauss-Markov models for signal classification

Cited by: 21
Authors
Ainsleigh, PL [1]
Kehtarnavaz, N
Streit, RL
Affiliations
[1] USN, Undersea Warfare Ctr, Newport, RI 02841 USA
[2] Univ Texas, Dept Elect Engn, Richardson, TX 75080 USA
Keywords
Baum-Welch algorithm; continuous-state HMM; EM algorithm; fixed-interval smoother; forward-backward algorithm; hidden Markov model; Kalman filter; maximum likelihood classification; mixture density;
DOI
10.1109/TSP.2002.1003060
Chinese Library Classification (CLC)
TM [electrical technology]; TN [electronic technology, communication technology];
Discipline codes
0808; 0809;
Abstract
Continuous-state hidden Markov models (CS-HMMs) are developed as a tool for signal classification. Analogs of the Baum, Viterbi, and Baum-Welch algorithms are formulated for this class of models. The CS-HMM algorithms are then specialized to hidden Gauss-Markov models (HGMMs) with linear Gaussian state-transition and output densities. A new Gaussian refactorization lemma is used to show that the Baum and Viterbi algorithms for HGMMs are implemented by two different formulations of the fixed-interval Kalman smoother. The measurement likelihoods obtained from the forward pass of the HGMM Baum algorithm and from the Kalman-filter innovation sequence are shown to be equal. A direct link between the Baum-Welch training algorithm and an existing expectation-maximization (EM) algorithm for Gaussian models is demonstrated. A new expression for the cross covariance between time-adjacent states in HGMMs is derived from the off-diagonal block of the conditional joint covariance matrix. A parameter invariance structure is noted for the HGMM likelihood function. CS-HMMs and HGMMs are extended to incorporate mixture densities for the a priori density of the initial state. Application of HGMMs to signal classification is demonstrated with a three-class test simulation.
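The abstract's claim that the HGMM forward-pass measurement likelihood equals the likelihood accumulated from the Kalman-filter innovation sequence can be illustrated with a short sketch. The function below is not the paper's implementation; it is a generic innovations-based log-likelihood for a linear-Gaussian state-space model, with all parameter names (`F`, `Q`, `H`, `R`, `m0`, `P0`) chosen here for illustration. Classification in the three-class setting the paper describes would amount to evaluating this log-likelihood under each class model and picking the largest.

```python
import numpy as np

def kalman_loglik(y, F, Q, H, R, m0, P0):
    """Log-likelihood of observations y[0..T-1] under the linear-Gaussian model
    x_t = F x_{t-1} + w_t,  w_t ~ N(0, Q);   y_t = H x_t + v_t,  v_t ~ N(0, R),
    with prior x_0 ~ N(m0, P0), accumulated from Kalman-filter innovations."""
    m, P = m0, P0
    ll = 0.0
    for t, yt in enumerate(y):
        if t > 0:  # time update (predict)
            m = F @ m
            P = F @ P @ F.T + Q
        e = yt - H @ m          # innovation
        S = H @ P @ H.T + R     # innovation covariance
        # accumulate the Gaussian log-density of the innovation
        _, logdet = np.linalg.slogdet(S)
        ll += -0.5 * (e @ np.linalg.solve(S, e) + logdet + len(e) * np.log(2 * np.pi))
        # measurement update (correct)
        K = P @ H.T @ np.linalg.inv(S)
        m = m + K @ e
        P = P - K @ S @ K.T
    return ll
```

A maximum-likelihood classifier would compute `kalman_loglik` for each candidate class's `(F, Q, H, R, m0, P0)` and assign the observation sequence to the class with the highest value.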
Pages: 1355-1367
Page count: 13