Forward Algorithm

The goal of the forward algorithm is to compute the joint probability p(x_t,y_{1:t}), where for notational convenience we have abbreviated x(t) as x_t and (y(1), y(2), ..., y(t)) as y_{1:t}. Computing p(x_t,y_{1:t}) directly would require marginalizing over all possible state sequences \{x_{1:t-1}\}, the number of which grows exponentially with t. Instead, the forward algorithm takes advantage of the conditional independence rules of the hidden Markov model (HMM) to perform the calculation recursively.
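To see why the direct route is expensive, write out the marginalization for a model with S hidden states:

p(x_t,y_{1:t}) = \sum_{x_1}\cdots\sum_{x_{t-1}} p(x_1)p(y_1|x_1)\prod_{k=2}^{t}p(x_k|x_{k-1})p(y_k|x_k),

which is a sum over the S^{t-1} possible settings of x_{1:t-1}.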

To demonstrate the recursion, let

\alpha_t(x_t) = p(x_t,y_{1:t}) = \sum_{x_{t-1}}p(x_t,x_{t-1},y_{1:t}).

Using the chain rule to expand p(x_t,x_{t-1},y_{1:t}), we can then write

\alpha_t(x_t) = \sum_{x_{t-1}}p(y_t|x_t,x_{t-1},y_{1:t-1})p(x_t|x_{t-1},y_{1:t-1})p(x_{t-1},y_{1:t-1}).

Because y_t depends only on x_t (given x_t it is conditionally independent of x_{t-1} and y_{1:t-1}), and x_t depends only on x_{t-1} by the Markov property, this simplifies to

\alpha_t(x_t) = p(y_t|x_t)\sum_{x_{t-1}}p(x_t|x_{t-1})\alpha_{t-1}(x_{t-1}).

Thus, since p(y_t|x_t) and p(x_t|x_{t-1}) are given by the model's emission distributions and transition probabilities, one can quickly calculate \alpha_t(x_t) from \alpha_{t-1}(x_{t-1}) and avoid incurring exponential computation time. The recursion is initialized with \alpha_1(x_1) = p(x_1)p(y_1|x_1), and summing the final \alpha_t(x_t) over x_t yields the observation likelihood p(y_{1:t}).
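As a concrete sketch of the recursion (not part of the derivation above), here is a minimal Python/NumPy implementation. The names forward, init, trans, and emit are hypothetical placeholders for the model's initial distribution, transition matrix, and emission matrix.

```python
import numpy as np

def forward(init, trans, emit, observations):
    """Compute alpha_t(x_t) = p(x_t, y_{1:t}) for t = 1, ..., T.

    init[i]     = p(x_1 = i)                 (initial state distribution)
    trans[i, j] = p(x_t = j | x_{t-1} = i)   (transition probabilities)
    emit[i, k]  = p(y_t = k | x_t = i)       (emission distribution)
    observations: sequence of observed symbols y_1, ..., y_T
    """
    # Base case: alpha_1(x_1) = p(x_1) p(y_1 | x_1)
    alpha = init * emit[:, observations[0]]
    alphas = [alpha]
    # Recursion: alpha_t(x_t) = p(y_t|x_t) * sum_{x_{t-1}} p(x_t|x_{t-1}) alpha_{t-1}(x_{t-1})
    for y in observations[1:]:
        alpha = emit[:, y] * (trans.T @ alpha)
        alphas.append(alpha)
    return np.array(alphas)

# Toy two-state example with made-up numbers
init = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
emit = np.array([[0.9, 0.1],
                 [0.2, 0.8]])
alphas = forward(init, trans, emit, [0, 1, 0])
print(alphas[-1].sum())  # p(y_{1:3}) under this toy model
```

Each step costs only a matrix–vector product over the hidden states, so the whole pass is linear in the sequence length rather than exponential.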

The forward algorithm is easily modified to account for observations from variants of the hidden Markov model as well, such as the Markov jump linear system.
