Question: Consider a Markov chain with state space {1, 2, 3} and transition probability matrix

P = \begin{pmatrix} 0 & 0.5 & 0.5 \\ 0.5 & 0 & 0.5 \\ 1 & 0 & 0 \end{pmatrix}

(a) Determine for every state whether it is an absorbing state. [3 marks]
(b) Draw the transition diagram. [3 marks]
(c) Determine whether state 1 and state 2 are transient or persistent. [9 marks]
(d) Determine the period of state 1 and state 2. [4 marks]
(e) Find P^(2). [6 marks]
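Parts (a), (d), and (e) can be sanity-checked numerically. Below is a minimal Python sketch (assuming numpy is available); the matrix P is taken directly from the question, and the period check only scans a finite horizon, so it is an illustration rather than a proof.

```python
import numpy as np
from math import gcd
from functools import reduce

# One-step transition matrix from the question.
P = np.array([
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
    [1.0, 0.0, 0.0],
])

# (a) A state i is absorbing iff P[i, i] == 1.
absorbing = [i + 1 for i in range(3) if P[i, i] == 1.0]
print("Absorbing states:", absorbing or "none")

# (e) Two-step transition matrix P^(2) = P @ P.
P2 = P @ P
print("P^(2) =\n", P2)

# (d) Period of state i: gcd of all n with (P^n)[i, i] > 0,
# checked here only up to a finite horizon (illustrative, not a proof).
def period(P, i, horizon=20):
    Pn = np.eye(len(P))
    return_times = []
    for n in range(1, horizon + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            return_times.append(n)
    return reduce(gcd, return_times) if return_times else 0

for s in (0, 1):
    print(f"Period of state {s + 1}:", period(P, s))
```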
