Question:

8. If it is possible, find the steady-state probabilities for the transition probability matrices in question 6.

9. Consider a Markov chain whose transition probability matrix is P1 from question 7. Show that, after a long time, the probability of leaving the second state is equal to the probability of entering that state.

The matrices referenced above are:

P1 =
[ 0.4  0    0.6 ]
[ 0.3  0.3  0.4 ]
[ 0    0.5  0.5 ]

P2 =
[ 0.7  0    0    0.3 ]
[ 0    0.6  0.4  0   ]
[ 0    0.1  0.9  0   ]
[ 0.2  0    0    0.8 ]
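For question 9, the key identity is the steady-state balance equation: since pi_j = sum_i pi_i * p_ij, subtracting pi_2 * p_22 from both sides for j = 2 gives pi_2 * (1 - p_22) = sum over i != 2 of pi_i * p_i2, i.e. the long-run probability of leaving state 2 equals the probability of entering it. A minimal numerical sketch, assuming the 3×3 reading of P1 above and using a left-eigenvector computation (an assumption; any method of solving pi P = pi would do):

```python
import numpy as np

# P1 as given in the question, rows summing to 1.
P1 = np.array([[0.4, 0.0, 0.6],
               [0.3, 0.3, 0.4],
               [0.0, 0.5, 0.5]])

def steady_state(P):
    """Solve pi P = pi, sum(pi) = 1, via the left eigenvector for eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(vals - 1.0))      # eigenvalue closest to 1
    pi = np.real(vecs[:, k])
    return pi / pi.sum()                   # normalize to a probability vector

pi = steady_state(P1)

# Flow balance for the second state (index 1):
# probability of leaving state 2 = pi_2 * (1 - p_22)
# probability of entering state 2 = sum over i != 2 of pi_i * p_i2
leave = pi[1] * (1.0 - P1[1, 1])
enter = sum(pi[i] * P1[i, 1] for i in range(3) if i != 1)

print("steady state:", pi)
print("P(leave state 2) =", leave, " P(enter state 2) =", enter)
```

Solving by hand gives pi = (1/5.8, 2/5.8, 2.8/5.8) ≈ (0.1724, 0.3448, 0.4828), and both the leaving and entering probabilities come out to 0.7 · pi_2 ≈ 0.2414, confirming the balance.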
