HOMEWORK 4

1. Exercises

All exercises and examples are from [D], Chapter 6 (Markov Chains), with the corresponding numbering (e.g. Exercise 6.4.5).

Exercise 1. Write down the one-step transition probability p(x, y) for the simple random walk. Generalize, assuming that instead of moving one step up or down, the chain moves Y steps, where Y has a distribution r(y). Show that the simple random walk is periodic. Give a simple example of Y such that the walk is aperiodic.

Exercise 2. The Ehrenfest chain (Examples 6.2.5 and 6.5.3). There are a total of r balls in two urns, k in urn one and r − k in urn two. We pick one of the r balls uniformly at random and move it to the other urn. Determine the transition matrix. What is S? Is the chain recurrent, positive recurrent (yes, why? - Exercise 6.4.5), periodic (yes - find the period)?

Exercise 3. (Exercise 6.2.3). Prove the formula for the n-step transition probability. Investigate the limit as n → ∞. Verify that the limit is the invariant distribution. Comment on why that should be the case, in view of Theorem 6.6.4. Here the conditions are very simple, but show that they are satisfied. Then verify that the invariant measure is approached exponentially fast.

Exercise 4. (Exercise 6.3.1). The hint essentially gives the solution. Write it out carefully with all the details.

Exercise 5. * (Exercise 6.3.5). This exercise shows in a nutshell a very useful type of reasoning for Markov processes. Suppose that S \ C is finite and that for each x ∈ S \ C, P_x(τ_C < ∞) > 0. Then there are N < ∞ and ε > 0 such that P_y(τ_C > kN) ≤ (1 − ε)^k for every y. Here τ_C is the hitting time of the set C.

Exercise 6. * Prove that if S is finite and the chain is irreducible and recurrent, then for a given x ∈ S there exist C > 0 and λ ∈ (0, 1) such that P_μ(T_x > n) ≤ C λ^n, where μ is an arbitrary initial distribution. What does this say about the convergence to equilibrium in Theorem 6.6.4?

Exercise 7. (Exercise 6.3.7). Let X_n be a Markov chain with S = {0, 1, ..., N} and suppose that X_n is a martingale and P_x(τ_0 ∧ τ_N < ∞) > 0 for all x. (i) Show that 0 and N are absorbing states, i.e., p(0, 0) = p(N, N) = 1. (ii) Show that P_x(τ_N < τ_0) = x/N.

Exercise 8. In the setting of Exercise 7, show that the states 0 < x < N are transient (justify).

Exercise 9. (Exercise 6.3.8). The Wright-Fisher model. Here S = {0, 1, ..., N} and

    P(X_{n+1} = j | X_n = i) = (N choose j) (i/N)^j (1 − i/N)^{N−j},

i.e., conditionally on X_n, X_{n+1} ~ Bin(N, X_n/N). Show that X_n is a martingale, so that we can apply Exercise 6.3.7.

Exercise 10. * (based on Exercise 6.4.1). Suppose y is recurrent and, for k ≥ 1, let R_k = T_y^k be the time of the k-th return to y, with R_0 = 0. For k ≥ 1, let r_k = R_k − R_{k−1} be the k-th inter-arrival time. Use the strong Markov property to conclude that, under P_y, the random variables r_k, k ≥ 1, are i.i.d.

Exercise 11. * (Exercise 6.5.4). We have shown in class (and notes) that μ_x(y) = w_{xy}/w_{yx}, where w_{xy} = P_x(T_y < T_x). Show that w_{xy} > 0 for every pair of states x ≠ y. Hint: there exists k = k(x, y) such that p^k(x, y) > 0 for any pair of states (justify). Let k_0 = max{k(x, y) | x, y ∈ S}. Then p^{k_0 + m_0}(x, y) > 0 for any x, y ∈ S.
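The following Python sketch is an optional numerical sanity check for Exercise 2, not part of the required solution: it builds the Ehrenfest transition matrix for a small illustrative value r = 6, verifies that the Binomial(r, 1/2) distribution is invariant, and checks that two-step transitions preserve the parity of the state, consistent with period 2. The value of r and all variable names are arbitrary choices.

```python
import numpy as np
from math import comb

r = 6  # illustrative number of balls

# Ehrenfest chain on S = {0, 1, ..., r}: from state k (balls in urn one),
# move to k - 1 with probability k/r and to k + 1 with probability (r - k)/r.
P = np.zeros((r + 1, r + 1))
for k in range(r + 1):
    if k > 0:
        P[k, k - 1] = k / r
    if k < r:
        P[k, k + 1] = (r - k) / r

# Candidate invariant distribution: Binomial(r, 1/2).
pi = np.array([comb(r, k) for k in range(r + 1)]) / 2**r
print(np.allclose(pi @ P, pi))  # True: pi P = pi

# Two-step transitions never change the parity of the state (period 2).
P2 = np.linalg.matrix_power(P, 2)
print(all(P2[k, j] == 0
          for k in range(r + 1) for j in range(r + 1)
          if (j - k) % 2 == 1))  # True
```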
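Exercise 3 refers to the specific chain of Exercise 6.2.3, whose statement is not reproduced here. Purely to illustrate the phenomenon the exercise is after (n-step transition probabilities approaching the invariant distribution exponentially fast), here is a sketch for a generic two-state chain; the transition probabilities a and b are hypothetical, and the rate |1 − a − b| is the modulus of the second eigenvalue of the transition matrix.

```python
import numpy as np

a, b = 0.3, 0.5  # hypothetical one-step transition probabilities
P = np.array([[1 - a, a],
              [b, 1 - b]])
pi = np.array([b, a]) / (a + b)  # invariant distribution: pi P = pi

rate = abs(1 - a - b)  # second eigenvalue of P
for n in (1, 5, 10, 20):
    Pn = np.linalg.matrix_power(P, n)
    err = np.max(np.abs(Pn - pi))  # max_{x,y} |p^n(x, y) - pi(y)|
    print(n, err, rate**n)         # err shrinks proportionally to rate**n
```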
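For Exercises 7-9, a quick Monte Carlo check (again, not a substitute for the proofs): the sketch below simulates the Wright-Fisher chain started from a state x and compares the empirical frequency of absorption at N with the value x/N predicted by the martingale argument of Exercise 7(ii). The choices N = 10, x = 3 and the number of trials are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
N, x = 10, 3      # illustrative population size and starting state
trials = 20_000

def run_to_absorption(start):
    # Given X_n = i, the next state is Bin(N, i/N); 0 and N are absorbing.
    i = start
    while 0 < i < N:
        i = rng.binomial(N, i / N)
    return i

hits_N = sum(run_to_absorption(x) == N for _ in range(trials))
print(hits_N / trials, x / N)  # empirical P_x(tau_N < tau_0) vs x/N
```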
