Question:

Determine whether each of the following statements about Markov chains is true or false (a small worked example on periods follows the list):

(p) All Markov chains must have a finite number of states.
(q) All irreducible Markov chains must have a finite number of states.
(r) All irreducible Markov chains are periodic.
(s) All irreducible Markov chains are aperiodic.
(t) All discrete-time Markov chains are irreducible.
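As a concrete handle on statements (r) and (s): the period of a state $i$ is $\gcd\{n \ge 1 : (P^n)_{ii} > 0\}$. Below is a minimal Python sketch (the helper name is ours, not from the source) that computes this for the two-state swap chain and for the same chain with a self-loop added:

```python
from math import gcd
import numpy as np

def period_of_state(P, i, max_n=50):
    """Period of state i: gcd of all n <= max_n with (P^n)[i, i] > 0."""
    P = np.asarray(P, dtype=float)
    g = 0
    Q = np.eye(P.shape[0])
    for n in range(1, max_n + 1):
        Q = Q @ P                 # Q now holds P^n
        if Q[i, i] > 1e-12:       # a length-n closed walk exists at state i
            g = gcd(g, n)
    return g

print(period_of_state([[0, 1], [1, 0]], 0))      # 2: irreducible but periodic
print(period_of_state([[0.5, 0.5], [1, 0]], 0))  # 1: the self-loop makes it aperiodic
```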

For each of the following transition matrices, determine whether the Markov chain with that transition matrix is regular (a numerical sanity check is sketched after the list):

(1) Is the Markov chain whose transition matrix is
$$\begin{pmatrix} 0 & 0.5 & 0.5 \\ 0.5 & 0 & 0.5 \\ 0 & 0 & 1 \end{pmatrix}$$
regular? (Yes or No)

(2) Is the Markov chain whose transition matrix is
$$\begin{pmatrix} 0 & 1 & 0 \\ 0.3 & 0 & 0.7 \\ 0 & 0 & 1 \end{pmatrix}$$
regular? (Yes or No)

(3) Is the Markov chain whose transition matrix is
$$\begin{pmatrix} 0 & 1 & 0 \\ 0.6 & 0 & 0.4 \\ 1 & 0 & 0 \end{pmatrix}$$
regular? (Yes or No)

(4) Is the Markov chain whose transition matrix has entries $0, 1, 0, 0, 0.6, 0, 0.4, \dots$ regular? (Yes or No)

(5) Is the Markov chain whose transition matrix is
$$\begin{pmatrix} 0 & 1 & 0 \\ 0.3 & 0.2 & 0.5 \\ 0 & 1 & 0 \end{pmatrix}$$
regular? (Yes or No)
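Not part of the original question, but a quick way to sanity-check the answers: a finite chain is regular exactly when some power of its transition matrix is strictly positive, and by Wielandt's theorem it suffices to test the power $(n-1)^2 + 1$ for an $n$-state chain. A minimal Python sketch, run here on matrices (3) and (5), the two whose entries are fully recoverable:

```python
import numpy as np

def is_regular(P, tol=1e-12):
    """True iff some power of P is strictly positive; Wielandt's bound
    says checking P^((n-1)^2 + 1) suffices for an n-state chain."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    Q = np.linalg.matrix_power(P, (n - 1) ** 2 + 1)
    return bool((Q > tol).all())

P3 = [[0, 1, 0], [0.6, 0, 0.4], [1, 0, 0]]    # matrix (3)
P5 = [[0, 1, 0], [0.3, 0.2, 0.5], [0, 1, 0]]  # matrix (5)
print(is_regular(P3), is_regular(P5))
```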

4. Consider the Markov chain $X = \{X_n\}$ with state space $S = \{0, 1, 2, \dots\}$ and transition probabilities
$$P_{ij} = \begin{cases} 1 & \text{if } j = i - 1 \\ 0 & \text{otherwise} \end{cases} \quad \text{for } i \ge 1,$$
and $P_{00} = 0$, $P_{0j} = \dots$ for $j \ge 1$ (the expression for $P_{0j}$ did not survive in the source).

(a) Is this Markov chain irreducible? Determine the period of every state.
(b) Is the Markov chain recurrent or transient? Explain.
(c) Is the Markov chain positive recurrent? If so, compute the stationary probability distribution.
(d) For each state $i$, what is the expected number of steps to return to state $i$ if the Markov chain $X$ starts at state $i$?
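For part (d), note that from state 0 the chain jumps to some state $J \ge 1$ and then descends deterministically, so the return time to 0 is exactly $J + 1$, and state 0 is positive recurrent iff $E[J] < \infty$. The sketch below simulates this under an assumed jump distribution $P_{0j} = 2^{-j}$; the actual distribution is missing from the source, so this choice is purely illustrative:

```python
import random

def return_time_to_zero():
    # ASSUMPTION: P_{0j} = 2**(-j) (geometric with mean 2); the true
    # distribution is lost in the source, so this is illustration only.
    j = 1
    while random.random() >= 0.5:
        j += 1
    # One step 0 -> j, then j deterministic steps j -> j-1 -> ... -> 0.
    return j + 1

samples = [return_time_to_zero() for _ in range(100_000)]
print(sum(samples) / len(samples))  # ~ 3.0 = E[J] + 1 under the assumption
```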

5. Consider a Markov chain $X = \{X_n\}$ with state space $S = \{0, 1, 2, \dots\}$ and transition probability matrix
$$P = \begin{pmatrix} 0 & 1 & 0 & 0 & \cdots \\ q & 0 & p & 0 & \cdots \\ 0 & q & 0 & p & \cdots \\ 0 & 0 & q & 0 & \ddots \\ \vdots & & & \ddots & \ddots \end{pmatrix}$$
Here $p > 0$, $q > 0$ and $p + q = 1$. Determine when the chain is positive recurrent and compute its stationary distribution.
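Assuming the matrix reconstruction above is correct, this is a birth-death chain, and detailed balance gives one route to the stationary distribution:
$$\pi_0 \cdot 1 = \pi_1\, q, \qquad \pi_i\, p = \pi_{i+1}\, q \quad (i \ge 1),$$
so $\pi_i = \dfrac{\pi_0}{q}\left(\dfrac{p}{q}\right)^{i-1}$ for $i \ge 1$. This is summable, hence the chain positive recurrent, exactly when $p < q$ (i.e. $p < 1/2$); normalization then gives $\pi_0\left(1 + \dfrac{1}{q - p}\right) = 1$.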

SECTION 5.2. The Weak Law of Large Numbers

Problem 4. In order to estimate $f$, the true fraction of smokers in a large population, Alvin selects $n$ people at random. His estimator $M_n$ is obtained by dividing $S_n$, the number of smokers in his sample, by $n$, i.e., $M_n = S_n / n$. Alvin chooses the sample size $n$ to be the smallest possible number for which the Chebyshev inequality yields a guarantee that
$$P(|M_n - f| \ge \varepsilon) \le \delta,$$
where $\varepsilon$ and $\delta$ are some prespecified tolerances. Determine how the value of $n$ recommended by the Chebyshev inequality changes in the following cases.

(a) The value of $\varepsilon$ is reduced to half its original value.
(b) The probability $\delta$ is reduced to half its original value.
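For reference, the standard Chebyshev computation behind this question: since $S_n$ is binomial, $\mathrm{Var}(M_n) = f(1-f)/n \le 1/(4n)$, so
$$P(|M_n - f| \ge \varepsilon) \le \frac{\mathrm{Var}(M_n)}{\varepsilon^2} \le \frac{1}{4n\varepsilon^2}.$$
Requiring $\dfrac{1}{4n\varepsilon^2} \le \delta$ gives $n \ge \dfrac{1}{4\varepsilon^2\delta}$: halving $\varepsilon$ multiplies the recommended $n$ by 4, while halving $\delta$ multiplies it by 2.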
