4.4. Let $\{a_i : i = 1, 2, \ldots\}$ be a probability distribution, and consider the Markov chain on states $0, 1, 2, \ldots$ whose transition probability matrix is

$$
P = \begin{pmatrix}
a_1 & a_2 & a_3 & a_4 & a_5 & \cdots \\
1 & 0 & 0 & 0 & 0 & \cdots \\
0 & 1 & 0 & 0 & 0 & \cdots \\
0 & 0 & 1 & 0 & 0 & \cdots \\
0 & 0 & 0 & 1 & 0 & \cdots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}.
$$

What condition on the probability distribution $\{a_i : i = 1, 2, \ldots\}$ is necessary and sufficient in order that a limiting distribution exist, and what is this limiting distribution? Assume $a_1 > 0$ and $a_2 > 0$, so that the chain is aperiodic.
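Not part of the original exercise: the sketch below numerically probes the question on a truncated version of this chain. The candidate answer it checks, that a limiting distribution exists iff the mean $m = \sum_i i\, a_i$ is finite, in which case $\pi_k = \frac{1}{m} \sum_{j > k} a_j$, is the classical result for this renewal-type chain (each visit to state 0 launches an excursion of length $j$ with probability $a_j$). The geometric choice $a_i = (1/2)^i$ and the truncation level $N$ are assumptions made only for illustration.

```python
import numpy as np

# Illustrative sketch: truncate the chain at N states, take a_i = (1/2)^i
# (a geometric law with finite mean m = 2), and compare the stationary vector
# of the truncated matrix with the candidate formula pi_k = (1/m) * sum_{j>k} a_j.
N = 50
a = np.array([0.5 ** i for i in range(1, N + 1)])
a /= a.sum()  # fold the (tiny) truncated tail back in so row 0 sums to 1

# Row 0 of P is (a_1, a_2, ...); every state i >= 1 steps down to i - 1.
P = np.zeros((N, N))
P[0, :] = a
for i in range(1, N):
    P[i, i - 1] = 1.0

# Stationary distribution by power iteration (the truncated chain is
# irreducible and aperiodic here since a_1 > 0 and a_2 > 0).
pi = np.full(N, 1.0 / N)
for _ in range(5000):
    pi = pi @ P
pi /= pi.sum()

m = float(np.sum(np.arange(1, N + 1) * a))                 # mean of {a_i}
candidate = np.array([a[k:].sum() for k in range(N)]) / m  # tail-sum / m

print("m =", m)
print("max |pi_k - candidate_k| =", np.abs(pi - candidate).max())
```

For the geometric example the discrepancy comes out near machine precision, consistent with the tail-sum formula; when $m = \infty$ (e.g. $a_i \propto 1/i^2$), the tail sums are not normalizable and no limiting distribution exists.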