Question: 2.1. Consider the Markov chain on the state space {0, 1, 2, 3} whose transition probability matrix is given by

$$
P = \begin{pmatrix}
0.4 & 0.3 & 0.2 & 0.1 \\
0.1 & 0.4 & 0.3 & 0.2 \\
0.3 & 0.2 & 0.1 & 0.4 \\
0.2 & 0.1 & 0.4 & 0.3
\end{pmatrix}.
$$

Suppose that the initial distribution is $p_i = \tfrac{1}{4}$ for $i = 0, 1, 2, 3$. Show that $\Pr\{X_n = k\} = \tfrac{1}{4}$, $k = 0, 1, 2, 3$, for all $n$. Can you deduce a general result from this example?
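The claim is easy to check numerically before proving it. Below is a minimal Python sketch (my own illustration using NumPy, not the textbook's solution; the variable names are mine): it verifies that every column of $P$ sums to 1 and that the uniform vector is unchanged by repeated multiplication by $P$.

```python
import numpy as np

# Transition matrix from the problem (rows = current state, columns = next state).
P = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.2, 0.1, 0.4],
    [0.2, 0.1, 0.4, 0.3],
])

# Uniform initial distribution p_i = 1/4.
p = np.full(4, 0.25)

# Each column of P also sums to 1, i.e. P is doubly stochastic.
print("column sums:", P.sum(axis=0))   # -> [1. 1. 1. 1.]

# The distribution of X_n is p P^n; it stays uniform at every step checked.
dist = p.copy()
for n in range(1, 6):
    dist = dist @ P
    print(f"n = {n}:", dist)           # -> [0.25 0.25 0.25 0.25] each time
```

The check works because $\sum_i \tfrac{1}{4} P_{ik} = \tfrac{1}{4} \sum_i P_{ik} = \tfrac{1}{4}$ whenever column $k$ of $P$ sums to 1, which is the inductive step behind the general result the problem asks you to deduce.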
