Question:

Consider the Markov chain with three states, S = {1, 2, 3}, whose state transition diagram is shown in Figure 11.31. [Figure 11.31 - A state transition diagram.]

Suppose P(X1 = 1) = 1/2 and P(X1 = 2) = 1/4.

a. Find the state transition matrix for this chain.

b. Find P(X1 = 3, X2 = 2, X3 = 1).

c. Find P(X1 = 3, X3 = 1).
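Since the transition diagram in Figure 11.31 is not reproduced here, a concrete answer cannot be given, but the computations for parts b and c follow directly from the Markov property. Below is a minimal sketch using a *hypothetical* transition matrix `P` (its entries are placeholders; the real values must be read off the figure). Only the initial distribution is taken from the problem: P(X1 = 1) = 1/2 and P(X1 = 2) = 1/4, so P(X1 = 3) = 1/4.

```python
import numpy as np

# HYPOTHETICAL transition matrix -- the actual entries come from Figure 11.31,
# which is not available here. Convention: P[i, j] = P(X_{n+1} = j+1 | X_n = i+1),
# so each row must sum to 1.
P = np.array([
    [1/4, 1/2, 1/4],
    [1/3, 1/3, 1/3],
    [1/2, 1/2, 0.0],
])

# Initial distribution: P(X1=1)=1/2, P(X1=2)=1/4, hence P(X1=3)=1/4.
pi1 = np.array([1/2, 1/4, 1/4])

# Part b: by the Markov property the joint probability factors as
#   P(X1=3, X2=2, X3=1) = P(X1=3) * p(3->2) * p(2->1).
p_b = pi1[2] * P[2, 1] * P[1, 0]

# Part c: marginalize over the unobserved middle state X2,
#   P(X1=3, X3=1) = P(X1=3) * sum_k p(3->k) * p(k->1),
# i.e. P(X1=3) times the (3,1) entry of the two-step matrix P @ P.
p_c = pi1[2] * (P[2] @ P[:, 0])
```

The same pattern generalizes: an n-step joint probability is the initial probability times a product of one-step transition probabilities, and marginalizing intermediate states amounts to taking powers of `P`.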
