Question:
Lemma: Let (X_0, X_1, X_2, ...) be a Markov chain with state space S = {1, 2, ..., n} and transition matrix P.
Let w = (w_1, w_2, ..., w_n) be a probability vector.
Then w is a limiting distribution for the Markov chain if and only if, for every initial distribution λ^(0), the distributions λ^(t) satisfy λ^(t) → w as t → ∞.
Explain in plain English.
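The convergence described in the lemma can be illustrated numerically. Below is a minimal Python/NumPy sketch (not part of the original question) that iterates λ^(t+1) = λ^(t) P for a hypothetical 2-state chain and shows that two different initial distributions approach the same limit w.

import numpy as np

# Hypothetical 2-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Two different initial distributions lambda^(0).
starts = [np.array([1.0, 0.0]), np.array([0.2, 0.8])]

for lam in starts:
    for t in range(100):      # iterate lambda^(t+1) = lambda^(t) P
        lam = lam @ P
    print(lam)                 # both runs print approximately [0.8, 0.2]

# For this P, the vector w = (0.8, 0.2) satisfies w P = w and sums to 1,
# and the iterates converge to it regardless of the starting distribution,
# which is exactly what the lemma asserts.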
