PROBABILITY AND STOCHASTIC PROCESS

QUESTION 3
Consider a Markov chain with three states 0, 1, 2. Suppose that when the chain is in state i (i = 0, 1, 2), the probabilities of moving to each of the other two states are equal, and each of these probabilities is one half the probability of staying in state i.
(3.1) Write down the transition probability matrix P. (3)
(3.2) Explain why P is doubly stochastic. (1)
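The arithmetic behind the question can be sketched numerically. If p denotes the probability of staying in state i, the condition above gives p + p/2 + p/2 = 1, so p = 1/2 and each off-diagonal entry is 1/4. The NumPy check below is a sketch under that reading of the problem:

```python
import numpy as np

# Each row i: P(stay in i) = 1/2, P(move to each other state) = 1/4,
# since p + p/2 + p/2 = 1 forces p = 1/2 (assumption: "one half the
# probability of staying" applies to each of the two possible moves).
P = np.array([
    [0.50, 0.25, 0.25],
    [0.25, 0.50, 0.25],
    [0.25, 0.25, 0.50],
])

# Stochastic: every row sums to 1.
print(P.sum(axis=1))  # row sums
# Doubly stochastic: every column also sums to 1.
print(P.sum(axis=0))  # column sums
```

Because P is symmetric, its columns are its rows, so the column sums equal the row sums of 1, which is exactly the doubly stochastic property asked about in (3.2).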