Question:
A Markov chain \(\left\{X_{0}, X_{1}, \ldots\right\}\) has state space \(\mathbf{Z}=\{0,1,2\}\) and transition matrix
\[\mathbf{P}=\left(\begin{array}{ccc} 0.2 & 0.3 & 0.5 \\ 0.8 & 0.2 & 0 \\ 0.6 & 0 & 0.4 \end{array}\right)\]
(1) Determine the matrix of the 2-step transition probabilities \(\mathbf{P}^{(2)}\).
(2) Given the initial distribution \(P\left(X_{0}=i\right)=1/3,\ i=0,1,2\), determine the probabilities \(P\left(X_{2}=0\right)\) and \(P\left(X_{0}=0, X_{1}=1, X_{2}=2\right)\).
Step by Step Answer:
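A minimal numerical sketch of both parts, using NumPy to check the matrix product and probabilities by hand (this is a verification aid, not the book's worked solution). Part (1) uses \(\mathbf{P}^{(2)}=\mathbf{P}^2\); part (2) uses \(P(X_2=0)=\sum_i P(X_0=i)\,p^{(2)}_{i0}\) and the path factorization \(P(X_0=0,X_1=1,X_2=2)=P(X_0=0)\,p_{01}\,p_{12}\).

```python
# Sketch: verify the 2-step transition matrix and the requested probabilities.
import numpy as np

P = np.array([[0.2, 0.3, 0.5],
              [0.8, 0.2, 0.0],
              [0.6, 0.0, 0.4]])

# (1) Two-step transition probabilities: P^(2) = P @ P
P2 = P @ P
# P2 = [[0.58, 0.12, 0.30],
#       [0.32, 0.28, 0.40],
#       [0.36, 0.18, 0.46]]

# (2) Uniform initial distribution P(X_0 = i) = 1/3
p0 = np.array([1/3, 1/3, 1/3])

# P(X_2 = 0) = sum_i p0[i] * P2[i, 0]
p_X2_eq_0 = p0 @ P2[:, 0]            # = 0.42

# P(X_0=0, X_1=1, X_2=2) = p0[0] * p_{01} * p_{12}
p_path = p0[0] * P[0, 1] * P[1, 2]   # = 0, since p_{12} = 0
```

Note that the path probability vanishes because the one-step transition \(1 \to 2\) is impossible (\(p_{12}=0\)).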
Related Book For
Applied Probability And Stochastic Processes
ISBN: 9780367658496
2nd Edition
Authors: Frank Beichelt