Question: The transition matrix of a Markov chain is $P=\left[\begin{array}{ccc} .3 & 0 & .7 \\ .1 & .8 & .1 \\ .6 & .2 & .2 \end{array}\right]$. On the 4th observation the chain is in state 2. What is the probability that it will be in state 3 on the 6th observation?

Options: $.19$, $.01$, $.17$, $.02$, $.2$

Step by Step Solution

There are 3 steps involved:

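Step: 1 Between the 4th and 6th observations the chain makes $6 - 4 = 2$ transitions, so the required probability is the entry in row 2, column 3 of the two-step matrix $P^2$.

Step: 2 Multiplying row 2 of $P$ by column 3 of $P$: $(P^2)_{23} = (.1)(.7) + (.8)(.1) + (.1)(.2) = .07 + .08 + .02$.

Step: 3 The sum is $.17$, so the probability that the chain is in state 3 on the 6th observation is $.17$.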

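As a cross-check, here is a minimal NumPy sketch (the variable names are mine, not from the original solution) that forms $P^2$ from the transition matrix as reconstructed above and reads off the two-step probability from state 2 to state 3:

```python
import numpy as np

# Transition matrix as reconstructed above (each row sums to 1).
P = np.array([
    [0.3, 0.0, 0.7],
    [0.1, 0.8, 0.1],
    [0.6, 0.2, 0.2],
])

# Two observations apart (4th to 6th) means two transitions.
P2 = np.linalg.matrix_power(P, 2)

# States are 1-indexed in the problem but 0-indexed here, so
# P2[1, 2] is the probability of going from state 2 to state 3.
print(P2[1, 2])  # about 0.17, up to floating-point rounding
```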