Question:

Suppose that a Markov chain has four states 1, 2, 3, 4 and stationary transition probabilities as specified by the following transition matrix:

a. If the chain is in state 3 at a given time n, what is the probability that it will be in state 2 at time n + 2?
b. If the chain is in state 1 at a given time n, what is the probability that it will be in state 3 at time n + 3?


Step by Step Answer:
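Since the entries of the transition matrix are not reproduced above, what follows is a general sketch of the method rather than a numerical answer. By the Chapman-Kolmogorov equations, the two-step transition probabilities are the entries of P^2 and the three-step transition probabilities are the entries of P^3. For part (a), the required probability is the (3, 2) entry of P^2, that is, Pr(X_{n+2} = 2 | X_n = 3) = (P^2)_{3,2} = sum over k of p_{3k} p_{k2}. For part (b), it is the (1, 3) entry of P^3.

A minimal Python sketch of the computation, using a placeholder matrix P_example in place of the matrix from the exercise (the placeholder values are assumptions, not the textbook's):

```python
import numpy as np

# Placeholder transition matrix for a 4-state chain (each row sums to 1).
# These values are illustrative only; substitute the matrix given in the exercise.
P_example = np.array([
    [0.25, 0.25, 0.25, 0.25],
    [0.10, 0.40, 0.30, 0.20],
    [0.30, 0.30, 0.20, 0.20],
    [0.20, 0.10, 0.40, 0.30],
])

# Two-step and three-step transition matrices (Chapman-Kolmogorov equations).
P2 = np.linalg.matrix_power(P_example, 2)
P3 = np.linalg.matrix_power(P_example, 3)

# Part (a): Pr(state 2 at time n+2 | state 3 at time n) is the (3, 2) entry of P^2.
# States are labeled 1..4, so subtract 1 for zero-based indexing.
prob_a = P2[3 - 1, 2 - 1]

# Part (b): Pr(state 3 at time n+3 | state 1 at time n) is the (1, 3) entry of P^3.
prob_b = P3[1 - 1, 3 - 1]

print(f"(a) Pr(X_(n+2) = 2 | X_n = 3) = {prob_a:.4f}")
print(f"(b) Pr(X_(n+3) = 3 | X_n = 1) = {prob_b:.4f}")
```

With the actual matrix from the exercise substituted for P_example, the two printed values are the answers to parts (a) and (b).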

Related Book: Probability and Statistics, 4th Edition, by Morris H. DeGroot and Mark J. Schervish. ISBN: 9780321500465.
