Question: Consider the Markov chain on the three states S = {1, 2, 3} with transition matrix

P = ( 0.6  0.3  0.1
      0.5  0.0  0.5
      0.2  0.4  0.4 )

and initial distribution pi^(0) = (0.7, 0.2, 0.1).

(a) Draw the state transition diagram for this Markov chain.
(b) Write the transition matrix for two steps, P^2.
(c) Find the distribution after three steps, pi^(3).
(d) Find the probability P(X_1 = 3, X_3 = 1, X_4 = 2, X_7 = 1).
(e) Find the limiting distribution pi^(infinity).
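The quantities asked for in parts (b)-(e) can be checked with a short script. Below is a minimal sketch in plain Python (no external libraries); the helper names `mat_mul` and `vec_mat` are my own, and part (e) is approximated by power iteration rather than solved exactly:

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of row lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def vec_mat(v, A):
    """Row vector times matrix: the one-step distribution update pi -> pi P."""
    return [sum(v[k] * A[k][j] for k in range(len(A))) for j in range(len(A[0]))]

P = [[0.6, 0.3, 0.1],
     [0.5, 0.0, 0.5],
     [0.2, 0.4, 0.4]]
pi0 = [0.7, 0.2, 0.1]

# (b) Two-step transition matrix P^2.
P2 = mat_mul(P, P)

# (c) Distribution after three steps: pi^(3) = pi^(0) P^3.
P3 = mat_mul(P2, P)
pi3 = vec_mat(pi0, P3)

# (d) By the Markov property the joint probability factors over the gaps:
# P(X1=3, X3=1, X4=2, X7=1)
#   = pi^(1)(3) * (P^2)_{3,1} * P_{1,2} * (P^3)_{2,1}
# (states are 1-indexed in the problem, 0-indexed in the lists below).
pi1 = vec_mat(pi0, P)
p_joint = pi1[2] * P2[2][0] * P[0][1] * P3[1][0]

# (e) Limiting distribution pi^(infinity): iterate pi -> pi P until it
# stabilizes (the chain is irreducible and aperiodic, so this converges).
pi = pi0
for _ in range(500):
    pi = vec_mat(pi, P)

print("P^2 =", [[round(x, 4) for x in row] for row in P2])
print("pi^(3) =", [round(x, 4) for x in pi3])
print("P(X1=3, X3=1, X4=2, X7=1) =", round(p_joint, 6))
print("pi^(inf) ~", [round(x, 4) for x in pi])
```

Running the sketch gives P^2 with first row (0.53, 0.22, 0.25), pi^(3) = (0.4702, 0.2525, 0.2773), a joint probability of about 0.011718, and a limiting distribution of approximately (0.4598, 0.2529, 0.2874), i.e. (40/87, 22/87, 25/87) if solved exactly from pi P = pi.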
