Question:

A Markov chain has four states, A, B, C, and D. The probability of going from state A to state B in one trial is .3, the probability of going from state A to state C in one trial is .4, the probability of going from state B to state A in one trial is .5, the probability of going from state B to state C in one trial is .3, the probability of going from state C to state A in one trial is .2, the probability of going from state C to state B in one trial is .4, and the probability of going from state B to state D in one trial is .6. Draw a transition diagram and write a transition matrix for this chain.

Step by Step Answer:
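The following is a sketch of an answer, not the posted solution. As written, the probabilities out of state B (.5 to A, .3 to C, and .6 to D) would sum to 1.4, which is impossible for a row of a transition matrix; the sketch below assumes the last transition was meant to read from state D to state B with probability .6. It also uses the usual textbook convention that any transition not listed between two distinct states has probability 0, and that each diagonal entry (the probability of remaining in the same state) is whatever remains so that its row sums to 1.

P =
\begin{array}{c|cccc}
      & A  & B  & C  & D  \\ \hline
  A   & .3 & .3 & .4 & 0  \\
  B   & .5 & .2 & .3 & 0  \\
  C   & .2 & .4 & .4 & 0  \\
  D   & 0  & .6 & 0  & .4 \\
\end{array}

Under these assumptions every row sums to 1. For the transition diagram, draw the four states as nodes and, for each nonzero entry of P, an arrow from the row state to the column state labeled with that probability, including a self-loop at each state for its diagonal entry.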
