Generalized Baum-Welch and Viterbi Algorithms Based on the Direct Dependency among Observations

Authors

  • Dariusz Plewczynski, Centre of New Technologies, University of Warsaw, Banacha 2c Street, 02-097 Warsaw, Poland
  • Hosna Fathipor, Financial Mathematics Group, Faculty of Financial Sciences, University of Kharazmi, Tehran, Iran
  • Vahid Rezaei Tabar, Department of Statistics, Faculty of Mathematics and Computer Sciences, Allameh Tabataba'i University, Tehran, Iran
Abstract:

The parameters of a Hidden Markov Model (HMM) are transition and emission probabilities. Both can be estimated using the Baum-Welch algorithm. The process of discovering the sequence of hidden states, given the sequence of observations, is performed by the Viterbi algorithm. Both the Baum-Welch and Viterbi algorithms assume that, given the states, the observations are independent of each other. In this paper, we first consider the direct dependency between consecutive observations in the HMM, and then use conditional independence relations in the context of a Bayesian network, a probabilistic graphical model, to generalize the Baum-Welch and Viterbi algorithms. We compare the performance of the generalized algorithms with the commonly used ones in simulation studies on synthetic data. We finally apply these algorithms to real data sets related to biological and inflation data. We show that the generalized Baum-Welch and Viterbi algorithms significantly outperform the conventional ones as the sample size becomes larger.
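To make the core modification concrete, the following is a minimal sketch in Python, not the authors' implementation: it assumes a discrete HMM whose emission probabilities are conditioned on both the current hidden state and the previous observation, p(o_t | s_t, o_{t-1}), and runs the Viterbi recursion under that assumption. All names (generalized_viterbi, pi, A, B0, B) are illustrative.

    # Minimal sketch (illustrative, not from the paper): Viterbi decoding for an HMM
    # whose emissions depend on the current hidden state AND the previous observation,
    # i.e. p(o_t | s_t, o_{t-1}) instead of the usual p(o_t | s_t).
    import numpy as np

    def generalized_viterbi(obs, pi, A, B0, B):
        """
        obs : sequence of observation indices, length T
        pi  : (N,)       initial state probabilities
        A   : (N, N)     transitions, A[i, j] = p(s_t = j | s_{t-1} = i)
        B0  : (N, M)     first-step emissions, B0[j, o] = p(o_1 = o | s_1 = j)
        B   : (N, M, M)  emissions with dependency, B[j, o_prev, o] = p(o_t = o | s_t = j, o_{t-1} = o_prev)
        Returns the most probable state path as a list of state indices.
        """
        T, N = len(obs), len(pi)
        logdelta = np.full((T, N), -np.inf)   # best log-probability ending in state j at time t
        psi = np.zeros((T, N), dtype=int)     # back-pointers

        logdelta[0] = np.log(pi) + np.log(B0[:, obs[0]])
        for t in range(1, T):
            # the only change from the standard recursion: the emission term
            # also looks at the previous observation obs[t-1]
            emit = np.log(B[:, obs[t - 1], obs[t]])
            scores = logdelta[t - 1][:, None] + np.log(A)   # (N, N): from state i to state j
            psi[t] = scores.argmax(axis=0)
            logdelta[t] = scores.max(axis=0) + emit

        # backtrack along the stored pointers
        path = [int(logdelta[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(psi[t, path[-1]]))
        return path[::-1]

Under the same assumption, the corresponding Baum-Welch generalization would re-estimate B from expected counts indexed by (state, previous observation, current observation) rather than by (state, observation) alone.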


Similar resources

Generalized Baum-Welch Algorithm Based on the Similarity between Sequences

The profile hidden Markov model (PHMM) is widely used to assign protein sequences to their respective families. A major limitation of a PHMM is the assumption that, given the states, the observations (amino acids) are independent. To overcome this limitation, the dependency between amino acids in a multiple sequence alignment (MSA), which is the representative of a PHMM, can be appended to the PHMM...


Comparing the Bidirectional Baum-Welch Algorithm and the Baum-Welch Algorithm on Regular Lattice

A profile hidden Markov model (PHMM) is widely used in assigning protein sequences to protein families. In this model, the hidden states depend only on the previous hidden state, and observations are independent given the hidden states. In other words, in the PHMM, only the information to the left of a hidden state is considered. However, it makes sense that considering the information of the b...


Unidirectional and parallel Baum-Welch algorithms

Hidden Markov models (HMMs) are popular in many applications, such as automatic speech recognition, control theory, biology, communication theory over channels with bursts of errors, queueing theory, and many others. Therefore, it is important to have robust and fast methods for fitting HMMs to experimental data (training). Standard statistical methods of maximum likelihood parameter estimati...


Generalized Baum-Welch Algorithm and its Implication to a New Extended Baum-Welch Algorithm

This paper describes how we can use the generalized Baum-Welch (GBW) algorithm to develop better extended Baum-Welch (EBW) algorithms. Based on GBW, we show that the backoff term in the EBW algorithm comes from the KL divergence, which is used as a regularization function. This finding allows us to develop a fast EBW algorithm, which can reduce the time of model space discriminative training by half, ...


Comparative Study of the Baum-Welch and Viterbi Training Algorithms Applied to Read and Spontaneous Speech Recognition

In this paper we compare the performance of acoustic HMMs obtained through Viterbi training with that of acoustic HMMs obtained through the Baum-Welch algorithm. We present recognition results for discrete and continuous HMMs, for read and spontaneous speech databases, acquired at 8 and 16 kHz. We also present results for a combination of Viterbi and Baum-Welch training, intended as a trade-off...





Journal information

Volume 17, pages 205–225, publication date 2018-12


