Question:

An ergodic Markov chain is started in equilibrium (i.e., with initial probability vector $\mathbf{w}$). The mean time until the next occurrence of state $s_i$ is $\bar{m}_i = \sum_k w_k m_{ki} + w_i r_i$. Show that $\bar{m}_i = z_{ii}/w_i$, by using the facts that $\mathbf{w}Z = \mathbf{w}$ and $m_{ki} = (z_{ii} - z_{ki})/w_i$.

Step by Step Answer:
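
A sketch of the argument, assuming the standard result for ergodic chains that the mean recurrence time of state $s_i$ is $r_i = 1/w_i$ (this fact is not stated in the problem, but it is the usual companion to the formulas given):

Substituting $m_{ki} = (z_{ii} - z_{ki})/w_i$ into the sum,

$$\bar{m}_i = \sum_k w_k \frac{z_{ii} - z_{ki}}{w_i} + w_i r_i = \frac{z_{ii}}{w_i}\sum_k w_k - \frac{1}{w_i}\sum_k w_k z_{ki} + w_i r_i.$$

Because $\mathbf{w}$ is a probability vector, $\sum_k w_k = 1$; and the relation $\mathbf{w}Z = \mathbf{w}$ gives $\sum_k w_k z_{ki} = w_i$. Therefore

$$\bar{m}_i = \frac{z_{ii}}{w_i} - \frac{w_i}{w_i} + w_i r_i = \frac{z_{ii}}{w_i} - 1 + w_i r_i.$$

Finally, with $r_i = 1/w_i$ we have $w_i r_i = 1$, so the last two terms cancel, leaving $\bar{m}_i = z_{ii}/w_i$, as required.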
