TY - JOUR
T1 - A stochastic version of Expectation Maximization algorithm for better estimation of Hidden Markov Model
AU - Huda, S.
AU - Yearwood, J.
AU - Togneri, Roberto
PY - 2009
Y1 - 2009
N2 - This paper attempts to overcome the local convergence problem of Expectation Maximization (EM) based training of the Hidden Markov Model (HMM) in speech recognition. We propose a hybrid algorithm, a Simulated Annealing Stochastic version of EM (SASEM), which combines Simulated Annealing (SA) with EM and reformulates the HMM estimation process by introducing a stochastic step between the EM steps and SA. The stochastic process that SASEM embeds inside EM can prevent EM from converging to a local maximum and, by exploiting the global convergence properties of SA, yields improved estimates for the HMM. Experiments on the TIMIT speech corpus show that SASEM obtains higher recognition accuracies than standard EM.
AB - This paper attempts to overcome the local convergence problem of Expectation Maximization (EM) based training of the Hidden Markov Model (HMM) in speech recognition. We propose a hybrid algorithm, a Simulated Annealing Stochastic version of EM (SASEM), which combines Simulated Annealing (SA) with EM and reformulates the HMM estimation process by introducing a stochastic step between the EM steps and SA. The stochastic process that SASEM embeds inside EM can prevent EM from converging to a local maximum and, by exploiting the global convergence properties of SA, yields improved estimates for the HMM. Experiments on the TIMIT speech corpus show that SASEM obtains higher recognition accuracies than standard EM.
U2 - 10.1016/j.patrec.2009.06.006
DO - 10.1016/j.patrec.2009.06.006
M3 - Article
SN - 0167-8655
VL - 30
SP - 1301
EP - 1309
JO - Pattern Recognition Letters
JF - Pattern Recognition Letters
IS - 14
ER -
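
For orientation only, below is a minimal sketch of the general idea the abstract describes: interleaving a stochastic, simulated-annealing style perturbation with EM updates and accepting moves by the Metropolis criterion so the search can escape local maxima. It is not the authors' SASEM algorithm; to stay short it estimates a two-component 1-D Gaussian mixture rather than an HMM, and every name and setting in it (sa_em, em_step, log_likelihood, the cooling schedule, the step sizes) is a hypothetical illustration, not taken from the paper.

# Sketch: simulated annealing wrapped around EM updates (hypothetical, not SASEM).
import math
import random

def log_likelihood(data, params):
    """Log-likelihood of 1-D data under a 2-component Gaussian mixture."""
    w, mu1, s1, mu2, s2 = params
    ll = 0.0
    for x in data:
        p1 = w * math.exp(-0.5 * ((x - mu1) / s1) ** 2) / (s1 * math.sqrt(2 * math.pi))
        p2 = (1 - w) * math.exp(-0.5 * ((x - mu2) / s2) ** 2) / (s2 * math.sqrt(2 * math.pi))
        ll += math.log(p1 + p2 + 1e-300)
    return ll

def em_step(data, params):
    """One EM update: E-step responsibilities, M-step re-estimation."""
    w, mu1, s1, mu2, s2 = params
    resp = []
    for x in data:
        p1 = w * math.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
        p2 = (1 - w) * math.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2
        resp.append(p1 / (p1 + p2 + 1e-300))
    n1 = sum(resp)
    n2 = len(data) - n1
    mu1 = sum(r * x for r, x in zip(resp, data)) / (n1 + 1e-300)
    mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / (n2 + 1e-300)
    s1 = math.sqrt(sum(r * (x - mu1) ** 2 for r, x in zip(resp, data)) / (n1 + 1e-300)) + 1e-3
    s2 = math.sqrt(sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, data)) / (n2 + 1e-300)) + 1e-3
    return (n1 / len(data), mu1, s1, mu2, s2)

def sa_em(data, params, n_iter=200, t0=1.0, cooling=0.97, rng=random.Random(0)):
    """Perturb the current estimate, refine it with an EM step, and accept or
    reject the move by the Metropolis criterion at a decreasing temperature."""
    best, best_ll = params, log_likelihood(data, params)
    temp = t0
    for _ in range(n_iter):
        # Stochastic step: jitter the current estimate, then refine it with EM.
        w, mu1, s1, mu2, s2 = params
        cand = (min(max(w + rng.gauss(0, 0.05 * temp), 0.05), 0.95),
                mu1 + rng.gauss(0, temp), max(s1 + rng.gauss(0, 0.1 * temp), 1e-2),
                mu2 + rng.gauss(0, temp), max(s2 + rng.gauss(0, 0.1 * temp), 1e-2))
        cand = em_step(data, cand)
        cand_ll = log_likelihood(data, cand)
        cur_ll = log_likelihood(data, params)
        # Metropolis acceptance: always take improvements, sometimes take worse moves.
        if cand_ll >= cur_ll or rng.random() < math.exp((cand_ll - cur_ll) / temp):
            params = cand
        if cand_ll > best_ll:
            best, best_ll = cand, cand_ll
        temp *= cooling  # anneal the temperature toward plain EM behaviour
    return best, best_ll

if __name__ == "__main__":
    gen = random.Random(1)
    data = [gen.gauss(-2, 1) for _ in range(100)] + [gen.gauss(3, 1) for _ in range(100)]
    start = (0.5, 0.0, 1.0, 0.1, 1.0)   # deliberately poor initialisation
    est, ll = sa_em(data, start)
    print("estimated parameters:", est, "log-likelihood:", ll)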