Citation
Rabiner, Lawrence R. "A tutorial on hidden Markov models and selected applications in speech recognition." Proceedings of the IEEE 77.2 (1989): 257-286.
Summary
The paper presents an in-depth overview of Hidden Markov Models (HMMs) and then gives a real-world example of their use in a speech recognition application. In a Hidden Markov Model, the system being modeled is assumed to be a Markov process with unobserved (hidden) states. An HMM can be viewed as the simplest form of dynamic Bayesian network. For example, consider an HMM with 4 states and 3 observed events; a small sketch of such a model follows (the probabilities along the edges were not given in the original figure):
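A minimal sketch of such a model in Python, using the standard HMM parameterization from the paper (transition matrix A, emission matrix B, initial distribution pi). The specific probability values below are hypothetical, since the figure referenced above does not list them.

```python
import numpy as np

# Toy HMM with 4 hidden states and 3 observation symbols.
# All probability values are hypothetical placeholders.
states = ["s1", "s2", "s3", "s4"]
observations = ["o1", "o2", "o3"]

# A[i, j] = P(next state = j | current state = i)
A = np.array([
    [0.6, 0.2, 0.1, 0.1],
    [0.1, 0.6, 0.2, 0.1],
    [0.1, 0.1, 0.6, 0.2],
    [0.2, 0.1, 0.1, 0.6],
])

# B[i, k] = P(observation = k | state = i)
B = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.7, 0.2],
    [0.2, 0.1, 0.7],
    [0.3, 0.3, 0.4],
])

# pi[i] = P(initial state = i)
pi = np.array([0.25, 0.25, 0.25, 0.25])
```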
Discussion
The forward procedure computes the probabilities of the observations by moving forward through the sequence of events. At each step, the probabilities of the current events are multiplied by the probabilities carried over from past events. At each stage we can prune out states that can never attain the maximum. The backward procedure does the same thing in the reverse direction, that is, from the current event back toward events in the past. A sketch of both procedures follows.
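A minimal Python sketch of the forward and backward recursions described above, assuming the usual HMM notation (A = transitions, B = emissions, pi = initial distribution). The tiny 2-state example at the bottom uses hypothetical values chosen only for illustration.

```python
import numpy as np

def forward(A, B, pi, obs):
    """Forward procedure: alpha[t, i] = P(o_1..o_t, state_t = i)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                      # initialize with the first observation
    for t in range(1, T):
        # carry past probabilities through the transitions, then emit the current event
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha                                      # P(obs) = alpha[-1].sum()

def backward(A, B, obs, n_states):
    """Backward procedure: beta[t, i] = P(o_{t+1}..o_T | state_t = i)."""
    T = len(obs)
    beta = np.ones((T, n_states))
    for t in range(T - 2, -1, -1):
        # work from the current event back toward earlier events
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

# Tiny 2-state, 2-symbol example (hypothetical probabilities)
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
obs = [0, 1, 0]

alpha = forward(A, B, pi, obs)
beta = backward(A, B, obs, n_states=2)
print("P(obs) via forward :", alpha[-1].sum())
print("P(obs) via backward:", (pi * B[:, obs[0]] * beta[0]).sum())
```

Both directions yield the same sequence probability, which is a useful sanity check when implementing the forward-backward computation.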