Question: Transition probability matrix


Consider a Markov chain {Xn : n = 0, 1, 2, ...} with state space {1, 2, 3} and one-step transition probability matrix P (the matrix entries are not legible in the source).

(a) Mark O or X:
( ) The Markov chain is irreducible.
( ) The Markov chain is aperiodic.
( ) The Markov chain is transient.
( ) The Markov chain is recurrent.
( ) The Markov chain is null recurrent.
( ) The Markov chain is ergodic.

(b) Calculate P(X5 = 1 | X2 = 1).

(c) Find lim n→∞ P(Xn = 1 | X2 = 1).
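Part (b) only needs the Chapman–Kolmogorov relation: by time homogeneity, P(X5 = 1 | X2 = 1) is the (1,1) entry of the three-step matrix P^3, and the limit in (c) can be probed numerically with large powers of P. Below is a minimal numpy sketch of that calculation; since the matrix entries in the question are not legible, the P used here is only an illustrative stand-in and is not the matrix from the problem.

import numpy as np

# Stand-in transition matrix -- NOT the one from the question
# (its entries are illegible in the source); substitute the real rows.
P = np.array([[0.0, 0.5, 0.5],
              [1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])

# (b) P(X5 = 1 | X2 = 1) = (P^3)[1,1] by time homogeneity.
#     States are 1, 2, 3, so index 0 corresponds to state 1.
p3 = np.linalg.matrix_power(P, 3)
print("P(X5=1 | X2=1) =", p3[0, 0])

# (c) Probe lim_{n->inf} P(Xn = 1 | X2 = 1) with large powers.
#     If the chain is periodic, consecutive powers oscillate and the
#     limit does not exist, so compare two successive powers.
big    = np.linalg.matrix_power(P, 400)
bigger = np.linalg.matrix_power(P, 401)
print("(P^400)[0,0] =", big[0, 0], "  (P^401)[0,0] =", bigger[0, 0])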

A Markov chain with state space {1, 2, 3} has transition probability matrix

        | 0.6  0.3  0.1 |
    P = | 0.3  0.3  0.4 |
        | 0.4  0.1  0.5 |

(a) Is this Markov chain irreducible? Is the Markov chain recurrent or transient? Explain your answers.
(b) What is the period of state 1? Hence deduce the period of the remaining states. Does this Markov chain have a limiting distribution?
(c) Consider a general three-state Markov chain with transition matrix

        | p11  p12  p13 |
    P = | p21  p22  p23 |
        | p31  p32  p33 |

Give an example of a specific set of probabilities p_ij for which the Markov chain is not irreducible (there is no single right answer to this, of course!).

1. (a) Explain what is meant by the transition probability matrix of a homogeneous Markov chain. [5 marks]
(b) Explain what is meant by the stationary distribution of a Markov chain. [5 marks]
(c) A Markov chain has transition probability matrix A, with entries a_ij, and stationary distribution π. Write down an expression for the entries of the reverse Markov chain. [5 marks]
(d) Consider the following transition probability matrix of a homogeneous Markov chain with three states i, j and k (the TPM is in that order). If the stationary vector of the chain is (1/9, 2/9, 2/3), determine whether the Markov chain is reversible.

    | 0.2  0.2  0.6 |
    | 0.1  0.6  0.3 |
    | 0.1  0.1  0.8 |

[5 marks]
(e) Let X1, X2, X3 be a sequence of random variables resulting from the above Markov chain. If X1 = i and X3 = j, what is the probability that X2 = k? [5 marks]

A stationary distribution of an m-state Markov chain is a probability vector q such that q = qP, where P is the probability transition matrix. A Markov chain can have more than one stationary distribution. Identify all the stationary distributions that you can for the 3-state Markov chain with transition probability matrix P (the matrix entries are not legible in the source). Does this Markov chain have a steady-state probability distribution? (15 points)
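For the 0.6/0.3/0.1 chain in the first question above, irreducibility can be read off the transition graph (every entry of P is already positive), and the limiting distribution, if it exists, is the probability vector solving πP = π. A small numpy sketch of that check, assuming the matrix exactly as written above; the eigenvector route is just one convenient way to solve πP = π.

import numpy as np

P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])

# Regularity probe: if some power P^n has all entries > 0, the chain is
# regular, hence irreducible and aperiodic (here n = 1 already works).
print("all entries of P positive:", (P > 0).all())

# Limiting distribution: solve pi = pi P with sum(pi) = 1, i.e. the left
# eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()
print("stationary / limiting distribution:", pi)

# Cross-check: every row of a large power converges to pi for a regular chain.
print("rows of P^50:\n", np.linalg.matrix_power(P, 50))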
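Parts (d) and (e) of the marks-based question above reduce to two small calculations: reversibility is the detailed-balance check π_a p_ab = π_b p_ba for every pair of states, and the conditional probability in (e) is P(X2 = k | X1 = i, X3 = j) = p_ik p_kj / (P^2)_ij. A short sketch of both, assuming the 0.2/0.2/0.6 matrix and the stationary vector (1/9, 2/9, 2/3) exactly as stated in the question:

import numpy as np

# States are listed in the order i, j, k, as in the question.
P  = np.array([[0.2, 0.2, 0.6],
               [0.1, 0.6, 0.3],
               [0.1, 0.1, 0.8]])
pi = np.array([1/9, 2/9, 2/3])

# (d) Detailed balance: reversible iff pi_a * P[a,b] == pi_b * P[b,a] for all a, b,
#     i.e. the probability-flux matrix is symmetric.
flux = pi[:, None] * P          # flux[a, b] = pi_a * p_ab
print("reversible:", np.allclose(flux, flux.T))

# (e) P(X2 = k | X1 = i, X3 = j) = p_ik * p_kj / (P^2)_ij
i, j, k = 0, 1, 2               # indices of states i, j, k
two_step = np.linalg.matrix_power(P, 2)
print("P(X2=k | X1=i, X3=j) =", P[i, k] * P[k, j] / two_step[i, j])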
