Question 4.4. Let {a_i : i = 1, 2, ...} be a probability distribution, and consider the Markov chain on the states 0, 1, 2, ... whose transition probability matrix is

          0    1    2    3    4    5   ...
     0 | a_1  a_2  a_3  a_4  a_5  a_6  ...
     1 |  1    0    0    0    0    0   ...
     2 |  0    1    0    0    0    0   ...
     3 |  0    0    1    0    0    0   ...
     4 |  0    0    0    1    0    0   ...
     . |  .    .    .    .    .    .

What condition on the probability distribution {a_i : i = 1, 2, ...} is necessary and sufficient in order that a limiting distribution exist, and what is this limiting distribution? Assume a_1 > 0 and a_2 > 0, so that the chain is aperiodic.

Step by Step Solution

Step 1: Describe the motion of the chain. From state 0 the chain jumps to state j with probability a_{j+1} (j = 0, 1, 2, ...), and from each state i >= 1 it moves deterministically one step down, to i - 1. Starting from 0, a jump to state j is followed by exactly j forced steps back down, so the chain returns to 0 after k steps with probability a_k: the return time to state 0 has exactly the distribution {a_k}. Since a_1 > 0 and a_2 > 0, the chain is aperiodic.

Step 2: Find the condition. An irreducible aperiodic chain has a limiting distribution if and only if it is positive recurrent, that is, if and only if the mean return time to state 0,

    m = sum_{k=1}^inf k a_k,

is finite. The necessary and sufficient condition is therefore sum_k k a_k < inf.

Step 3: Compute the limiting distribution. The stationary equations read pi_j = pi_{j+1} + pi_0 a_{j+1} for j = 0, 1, 2, ..., so pi_j - pi_{j+1} = pi_0 a_{j+1}, and summing from j onward gives pi_j = pi_0 sum_{k=j+1}^inf a_k. Normalizing, sum_j pi_j = pi_0 sum_{j=0}^inf P{xi > j} = pi_0 m = 1, where xi is a random variable with P{xi = k} = a_k. Hence pi_0 = 1/m and

    pi_j = (1/m) sum_{k=j+1}^inf a_k = P{xi > j} / m,   j = 0, 1, 2, ....
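The limiting distribution of this chain is known to be pi_j = (sum_{k > j} a_k)/m with m = sum_k k a_k. The following is a short numerical sketch that checks this formula against the stationary distribution computed by power iteration; the specific choice a_k = (1/2)^k and the truncation level N are assumptions made purely for illustration, not part of the problem.

```python
import numpy as np

# Assumed example distribution: a_k = (1/2)^k, truncated to k = 1..N.
N = 60
k = np.arange(1, N + 1)
a = 0.5 ** k
a /= a.sum()              # renormalize after truncation
m = (k * a).sum()         # mean m = E[xi] of the truncated distribution

# Transition matrix: row 0 is (a_1, a_2, ...); each state i >= 1 steps
# deterministically down to i - 1.
P = np.zeros((N, N))
P[0, :] = a
for i in range(1, N):
    P[i, i - 1] = 1.0

# Stationary distribution via power iteration (pi = pi @ P at the fixed point).
pi = np.full(N, 1.0 / N)
for _ in range(5000):
    pi = pi @ P

# Closed-form candidate: pi_j = P{xi > j} / m, with P{xi > 0} = 1.
tail = np.concatenate(([1.0], 1.0 - np.cumsum(a)[:-1]))
pi_formula = tail / m

print(np.abs(pi - pi_formula).max())   # agreement up to iteration error
```

For the truncated chain the formula is exactly stationary (the tail identity pi_j = pi_{j+1} + pi_0 a_{j+1} holds term by term), so the two vectors agree to numerical precision.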
