Using state-level information in the HMM

Cited by: 0
Authors
Li, HZ [1 ]
Liu, ZQ [1 ]
Zhu, XH [1 ]
Affiliation
[1] Beijing Univ Posts & Telecommun, Sch Continuing Educ, Beijing 100088, Peoples R China
Keywords
Hidden Markov Models; a posteriori state probability; current state probability; state-level information; phoneme; speech recognition; learning; training
DOI
10.1109/ICMLC.2004.1378574
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In HMM-based pattern recognition, the structure of the HMM is predetermined according to prior knowledge. In the recognition process, decisions are usually based on the maximum likelihood of the HMM, which unfortunately may lead to incorrect results. In this paper, we analyze the roles of the individual hidden states of the HMM and their associated posterior probabilities, which reflect the nature of the components in the observation sequence and should therefore be taken into consideration. To this end, we propose to make full use of the state-level information, e.g., by exploiting the distribution of the number of intersections of the state posterior probability trajectories in the recognition process. We apply the proposed methods to phoneme classification on the TIMIT speech corpus and show that we achieve about a 2% improvement in recognition rate over the classical HMM.
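For readers who want to experiment with the idea described in the abstract, the following is a minimal Python sketch, not the authors' implementation: it computes the per-frame state posterior probabilities of an HMM via a scaled forward-backward pass and counts how often the posterior trajectories of different states cross, as a simple proxy for the intersection statistic the paper exploits. The function names, the precomputed per-frame likelihood matrix B, and the crossing definition are illustrative assumptions.

```python
import numpy as np

def state_posteriors(pi, A, B):
    """Scaled forward-backward pass.

    pi: (N,)   initial state probabilities
    A:  (N, N) state transition matrix
    B:  (T, N) per-frame observation likelihoods b_i(o_t) (assumed precomputed)
    Returns gamma with gamma[t, i] = P(state_t = i | full observation sequence).
    """
    T, N = B.shape
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    alpha[0] = pi * B[0]
    alpha[0] /= alpha[0].sum()                      # scale to avoid underflow
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[t]
        alpha[t] /= alpha[t].sum()
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)       # normalize per frame
    return gamma

def count_trajectory_intersections(gamma):
    """Count sign changes of gamma[:, i] - gamma[:, j] between consecutive
    frames, i.e. how often two state posterior trajectories cross."""
    T, N = gamma.shape
    crossings = 0
    for t in range(T - 1):
        for i in range(N):
            for j in range(i + 1, N):
                d0 = gamma[t, i] - gamma[t, j]
                d1 = gamma[t + 1, i] - gamma[t + 1, j]
                if d0 * d1 < 0:                     # sign flip => crossing
                    crossings += 1
    return crossings

if __name__ == "__main__":
    # Toy example with random parameters, purely to exercise the functions.
    rng = np.random.default_rng(0)
    N, T = 3, 50
    pi = np.full(N, 1.0 / N)
    A = rng.random((N, N)); A /= A.sum(axis=1, keepdims=True)
    B = rng.random((T, N))                          # stand-in per-frame likelihoods
    g = state_posteriors(pi, A, B)
    print("trajectory crossings:", count_trajectory_intersections(g))
```

In the paper, a statistic of this kind is used alongside (not instead of) the usual maximum-likelihood score; the sketch only shows how the state-level quantities could be obtained.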
Pages: 3140 - 3145
Number of pages: 6