2.1. Consider the Markov chain whose transition probability matrix is given by

$$
P = \begin{pmatrix}
0.4 & 0.3 & 0.2 & 0.1 \\
0.1 & 0.4 & 0.3 & 0.2 \\
0.3 & 0.2 & 0.1 & 0.4 \\
0.2 & 0.1 & 0.4 & 0.3
\end{pmatrix},
$$

where the states $0, 1, 2, 3$ index both the rows and the columns. Suppose that the initial distribution is $p_i = \tfrac{1}{4}$ for $i = 0, 1, 2, 3$. Show that $\Pr\{X_n = k\} = \tfrac{1}{4}$, $k = 0, 1, 2, 3$, for all $n$. Can you deduce a general result from this example?
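
The point of the exercise is that every column of the matrix above sums to 1 (so $P$ is doubly stochastic); a uniform distribution is therefore carried into a uniform distribution at each step, and by induction $\Pr\{X_n = k\} = \tfrac{1}{4}$ for all $n$. The same argument gives the general result for any doubly stochastic transition matrix on $N$ states with the uniform distribution $1/N$. Below is a minimal numerical check of this claim, assuming the matrix was reconstructed correctly; it is a NumPy sketch, not the requested pencil-and-paper proof.

```python
# A minimal numerical check of the claim (a sketch, not the requested proof),
# assuming the transition matrix P reconstructed above is correct.
import numpy as np

P = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.2, 0.1, 0.4],
    [0.2, 0.1, 0.4, 0.3],
])

# Every column of P sums to 1 (every row does as well), so P is doubly stochastic.
print("column sums:", P.sum(axis=0))   # -> [1. 1. 1. 1.]

# Start from the uniform initial distribution p_i = 1/4 and propagate it forward.
p = np.full(4, 0.25)
for n in range(1, 9):
    p = p @ P                          # distribution of X_n
    print(f"n = {n}:", p)              # stays [0.25 0.25 0.25 0.25] at every step
```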
