Suppose a process can be considered to be in one of two states (call them state A and state B), but the next state of the process depends not only on the current state but also on the previous state. We can still describe this process using a Markov chain, but we will now need four states. The chain will be in state (X, Y), X, Y ∈ {A, B}, if the process is currently in state X and was previously in state Y.
(a) Show that the transition probability matrix of such a four-state Markov chain must have zeros in at least half of its entries.
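For intuition on (a), the constraint can be checked mechanically: from state (X, Y) the next state must record the current state X as its new "previous" component, so the only reachable states are (A, X) and (B, X). A minimal sketch of this counting argument (the state encoding below is an illustration, not part of the original problem):

```python
from itertools import product

# States of the expanded chain: (current, previous), each in {A, B}.
states = list(product("AB", repeat=2))  # [('A','A'), ('A','B'), ('B','A'), ('B','B')]

# A transition (X, Y) -> (Z, W) is feasible only if the new "previous"
# component W equals the old "current" component X.
feasible = {(s, t) for s in states for t in states if t[1] == s[0]}

# Each of the 4 rows has exactly 2 feasible entries, so at least
# 16 - 8 = 8 of the 16 entries of the transition matrix must be zero.
zeros_forced = len(states) ** 2 - len(feasible)
print(zeros_forced)  # 8
```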
(b) Suppose that the transition probability matrix is given by
Find the steady-state distribution of the Markov chain.
(c) What is the steady-state probability that the underlying process is in state A?
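Since the matrix for part (b) is not reproduced above, here is a sketch of the computation with a placeholder matrix (rows and columns ordered (A,A), (A,B), (B,A), (B,B); the probabilities are assumptions for illustration, not the book's values). The steady-state distribution π solves πP = π with the entries summing to 1, and the answer to (c) is the total π-mass on the two states whose current component is A:

```python
# Placeholder transition matrix P, states ordered (A,A), (A,B), (B,A), (B,B).
# These probabilities are illustrative assumptions, not the textbook's values.
# Note the forced zeros: from (X, Y) only (A, X) and (B, X) are reachable.
P = [
    [0.5, 0.0, 0.5, 0.0],  # from (A, A): to (A, A) or (B, A)
    [0.7, 0.0, 0.3, 0.0],  # from (A, B): to (A, A) or (B, A)
    [0.0, 0.4, 0.0, 0.6],  # from (B, A): to (A, B) or (B, B)
    [0.0, 0.2, 0.0, 0.8],  # from (B, B): to (A, B) or (B, B)
]

def stationary(P, iters=10_000):
    """Power iteration: apply P to a distribution until it stops changing."""
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]
    return pi

pi = stationary(P)
# Part (c): the underlying process is in state A exactly when the expanded
# chain is in (A, A) or (A, B), the states whose *current* component is A.
p_A = pi[0] + pi[1]
print([round(x, 4) for x in pi], round(p_A, 4))
```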

  • Created November 20, 2015