Question: A Markov chain {Xn, n ≥ 0} with states 0, 1, 2 has the transition probability matrix

P = ⎡ 1/2  1/3  1/6 ⎤
    ⎢  0   1/3  2/3 ⎥
    ⎣ 1/2   0   1/2 ⎦

If P{X0 = 0} = P{X0 = 1} = 1/4, find E[X3].
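E[X3] follows from propagating the initial distribution through three steps of the chain: with π0 = (1/4, 1/4, 1/2) (the remaining mass 1/2 goes to state 2), compute π3 = π0·P³ and then E[X3] = Σ i·π3(i). A minimal sketch of this computation using Python's exact fractions:

```python
from fractions import Fraction as F

# Transition matrix from the problem statement (each row sums to 1).
P = [
    [F(1, 2), F(1, 3), F(1, 6)],
    [F(0),    F(1, 3), F(2, 3)],
    [F(1, 2), F(0),    F(1, 2)],
]

# Initial distribution: P{X0=0} = P{X0=1} = 1/4, so P{X0=2} = 1/2.
pi = [F(1, 4), F(1, 4), F(1, 2)]

# Three steps of the chain: pi <- pi @ P, repeated 3 times.
for _ in range(3):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# E[X3] = sum over states i of i * P{X3 = i}.
EX3 = sum(i * p for i, p in enumerate(pi))
print(EX3)  # 53/54
```

Using exact rationals rather than floats keeps the answer in closed form: E[X3] = 53/54 ≈ 0.9815.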
