Question:

Assume that an ergodic Markov chain has states $s_1, s_2, \ldots, s_k$. Let $S_j^{(n)}$ denote the number of times that the chain is in state $s_j$ in the first $n$ steps. Let $\mathbf{w}$ denote the fixed probability row vector for this chain. Show that, regardless of the starting state, the expected value of $S_j^{(n)}$, divided by $n$, tends to $w_j$ as $n \to \infty$. Hint: If the chain starts in state $s_i$, then the expected value of $S_j^{(n)}$ is given by the expression

Step by Step Answer:
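
The expression in the hint is cut off above. What follows is a minimal sketch of the argument, assuming the missing expression is the sum of the $h$-step transition probabilities $p_{ij}^{(h)}$ over the first $n$ steps; the matrix $P$, its powers, and the time convention $h = 0, \ldots, n-1$ are part of this assumption, not of the original text.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Assumed form of the hint's expression (time convention h = 0, ..., n-1):
\[
  E\!\left[S_j^{(n)} \,\middle|\, X_0 = s_i\right]
  \;=\; \sum_{h=0}^{n-1} p_{ij}^{(h)}
  \;=\; \Bigl[\, I + P + P^{2} + \cdots + P^{n-1} \,\Bigr]_{ij}.
\]

% Dividing by n turns this into an average of the matrix powers:
\[
  \frac{E\!\left[S_j^{(n)} \,\middle|\, X_0 = s_i\right]}{n}
  \;=\; \Bigl[\tfrac{1}{n}\bigl(I + P + \cdots + P^{n-1}\bigr)\Bigr]_{ij}.
\]

% For an ergodic chain the averaged powers converge to the matrix W,
% all of whose rows equal the fixed vector w, so the (i,j) entry tends
% to w_j for every starting state s_i:
\[
  \frac{E\!\left[S_j^{(n)}\right]}{n} \;\longrightarrow\; w_j
  \qquad \text{as } n \to \infty .
\]

\end{document}
```

The convergence of $\tfrac{1}{n}(I + P + \cdots + P^{n-1})$ to $W$ is the standard averaging (ergodic) theorem for ergodic chains; in the regular (aperiodic) case one can instead use $P^{n} \to W$ directly, since a sequence converging to $w_j$ has averages converging to the same limit.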

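For a numerical sanity check, here is a small Python sketch. Everything in it, including the 3-state transition matrix P, the tracked state, and the horizons n, is made up for illustration and is not part of the exercise. It computes $E[S_j^{(n)}]/n$ exactly for each starting state by propagating the one-step distribution, and compares the result with $w_j$ obtained from the fixed probability vector.

```python
# Illustrative sketch only: the 3-state transition matrix P, the tracked
# state j, and the horizons n below are made-up examples, not taken from
# the original exercise.
import numpy as np

# A small ergodic chain on states s1, s2, s3 (indices 0, 1, 2).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Fixed probability row vector w: left eigenvector of P for eigenvalue 1,
# normalized so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()

def expected_occupation_fraction(P, start, j, n):
    """Exact E[S_j^(n)] / n, i.e. (1/n) * sum of p_{start,j}^(h)
    for h = 0, ..., n-1, computed by propagating the distribution."""
    dist = np.zeros(P.shape[0])
    dist[start] = 1.0
    total = 0.0
    for _ in range(n):
        total += dist[j]      # probability of being in state j at this step
        dist = dist @ P       # advance the distribution one step
    return total / n

j = 1  # track visits to state s2
for n in (10, 100, 1000, 10000):
    fracs = [expected_occupation_fraction(P, i, j, n) for i in range(3)]
    print(f"n={n:>6}: E[S_j^(n)]/n by starting state = "
          f"{np.round(fracs, 4)},  w_j = {w[j]:.4f}")
```

Whichever starting state is used, the printed fractions approach the same value $w_j$ as $n$ grows, which is exactly the claim of the exercise.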