Question:

Consider the continuous Markov chain of Example 11.17: a chain with two states S = {0, 1} and λ0 = λ1 = λ > 0. In that example, we found that the transition matrix for any t ≥ 0 is given by

P(t) = [ 1/2 + (1/2)e^(−2λt)    1/2 − (1/2)e^(−2λt) ]
       [ 1/2 − (1/2)e^(−2λt)    1/2 + (1/2)e^(−2λt) ]

Find the stationary distribution π for this chain.
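For readers who want to sanity-check their answer numerically, here is a small Python sketch (not part of the original problem) that recovers a stationary distribution from the generator matrix G = [[−λ, λ], [λ, −λ]]. The value lam = 1.0 is an arbitrary demo choice; any λ > 0 gives the same result.

```python
import numpy as np

lam = 1.0  # hypothetical demo value; any lambda > 0 yields the same answer

# Generator matrix of the two-state chain with holding rates lambda_0 = lambda_1 = lam
G = np.array([[-lam,  lam],
              [ lam, -lam]])

# A stationary distribution pi satisfies pi G = 0 with entries summing to 1.
# Stack G^T with a row of ones and solve the resulting system in the
# least-squares sense; for this chain an exact solution exists.
A = np.vstack([G.T, np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)  # approximately [0.5, 0.5]
```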


Example 11.17

Consider a continuous Markov chain with two states S = {0, 1}. Assume the holding time parameters are given by λ0 = λ1 = λ > 0. That is, the time that the chain spends in each state before going to the other state has an Exponential(λ) distribution.
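As a quick numerical check of the transition matrix quoted in the question, the following Python sketch compares the matrix exponential of the generator, P(t) = exp(tG), with the closed-form entries 1/2 ± (1/2)e^(−2λt). The values lam = 0.7 and t = 1.3 are arbitrary demo choices, not part of Example 11.17.

```python
import numpy as np
from scipy.linalg import expm

lam, t = 0.7, 1.3  # arbitrary demo values, not taken from the example

# Generator of the chain: leave each state at rate lam and jump to the other state
G = np.array([[-lam,  lam],
              [ lam, -lam]])

# Transition matrix obtained from the generator: P(t) = exp(t G)
P_numeric = expm(t * G)

# Closed-form matrix quoted in the question
e = np.exp(-2 * lam * t)
P_closed = 0.5 * np.array([[1 + e, 1 - e],
                           [1 - e, 1 + e]])

print(np.allclose(P_numeric, P_closed))  # True
```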
