Show that the random walk process in Section 2.2.4 is a Markov chain. Find the transition matrices.

Question:

Show that the random walk process in Section 2.2.4 is a Markov chain. Find the transition matrices for the following two cases.

(a) The process does not stop upon reaching any level. In this case, Xt may assume all values 0, ±1, ±2, ....

(b) The process comes to a stop when Xt = 0 or Xt = a, as assumed in Section 2.2.4. In this case, Xt may assume the values 0, 1, ..., a.

Step by Step Answer:
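
A sketch of the argument, assuming the walk of Section 2.2.4 moves up one level with probability p and down one level with probability q = 1 - p at each step (the section is not reproduced in this excerpt, so p and q are assumed notation). The walk evolves as a running sum of i.i.d. steps, and each new step is independent of the history:

% Markov property: the next state depends on the past only through the
% current state, because the increment Z_{t+1} is independent of
% X_0, ..., X_t. (p and q = 1 - p are assumed step probabilities.)
\[
  X_{t+1} = X_t + Z_{t+1}, \qquad
  Z_1, Z_2, \ldots \text{ i.i.d.}, \quad
  P(Z_t = +1) = p, \quad P(Z_t = -1) = q = 1 - p.
\]
\begin{align*}
  P(X_{t+1} = j \mid X_t = i,\, X_{t-1} = i_{t-1}, \ldots, X_0 = i_0)
    &= P(Z_{t+1} = j - i) \\
    &= P(X_{t+1} = j \mid X_t = i).
\end{align*}

The conditional distribution of the next state given the entire history thus depends only on the current state, which is the Markov property; the chain is also time-homogeneous, since the steps are identically distributed.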
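For case (a), still under the assumed p and q, every integer state i steps to i + 1 with probability p and to i - 1 with probability q. A sketch of the transition probabilities and the resulting doubly infinite tridiagonal matrix (rows and columns indexed by ..., -1, 0, 1, ...) would be:

% Case (a): unrestricted walk on all integers 0, ±1, ±2, ...
\[
  p_{ij} = P(X_{t+1} = j \mid X_t = i) =
  \begin{cases}
    p, & j = i + 1,\\
    q, & j = i - 1,\\
    0, & \text{otherwise,}
  \end{cases}
  \qquad i = 0, \pm 1, \pm 2, \ldots
\]
% Each row has q one place left of the diagonal, 0 on the diagonal,
% and p one place right of it; the pattern extends without end in
% both directions.
\[
  P =
  \begin{pmatrix}
    \ddots & \ddots & \ddots &        &        &        \\
           & q      & 0      & p      &        &        \\
           &        & q      & 0      & p      &        \\
           &        &        & \ddots & \ddots & \ddots
  \end{pmatrix}
\]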
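For case (b), states 0 and a absorb the walk, so their rows place probability 1 on themselves, while the interior states 1, ..., a - 1 keep the same up/down probabilities. With the assumed p and q, the (a + 1) x (a + 1) matrix over states 0, 1, ..., a would be:

% Case (b): absorbing barriers at 0 and a.
% p_{00} = p_{aa} = 1; p_{i,i+1} = p and p_{i,i-1} = q for 1 <= i <= a - 1.
\[
  P =
  \begin{pmatrix}
    1      & 0      & 0      & \cdots & 0      & 0      & 0      \\
    q      & 0      & p      & \cdots & 0      & 0      & 0      \\
    0      & q      & 0      & \cdots & 0      & 0      & 0      \\
    \vdots & \vdots & \vdots & \ddots & \vdots & \vdots & \vdots \\
    0      & 0      & 0      & \cdots & 0      & p      & 0      \\
    0      & 0      & 0      & \cdots & q      & 0      & p      \\
    0      & 0      & 0      & \cdots & 0      & 0      & 1
  \end{pmatrix}
\]

Each row sums to 1, as a transition matrix requires, and once the chain enters state 0 or state a it never leaves, matching the stopping rule in part (b).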
