Problem 4. Consider a Markov process that transitions between states 0, 1, and 2, with the
transition probability matrix
P =
From this matrix we see that once the process reaches state 0 or state 2, it stays there
forever. Hence these two states are called absorbing states. If the process starts at state 1,
it may remain in state 1 for some time, but eventually it will leave state 1 and enter one
of the absorbing states.
What is the probability that the process will enter state 2, given that the process
starts at state 1?
On average, how long does it take to reach one of the absorbing states?
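The source omits the actual entries of P, but the two quantities asked for depend only on the row of P for state 1. Writing that row as (p10, p11, p12): when the process leaves state 1 it enters state 2 with probability p12 / (p10 + p12), and the time spent in state 1 is geometric with parameter 1 - p11, so the expected time to absorption is 1 / (1 - p11). The following sketch illustrates both formulas with an assumed row (0.3, 0.4, 0.3), chosen purely for illustration since the matrix is not shown:

```python
import numpy as np

# Hypothetical transition matrix: the source does not show P, so the
# entries below (row 1 = 0.3, 0.4, 0.3) are an assumption for illustration.
P = np.array([
    [1.0, 0.0, 0.0],   # state 0 is absorbing
    [0.3, 0.4, 0.3],   # state 1 can move to 0, stay at 1, or move to 2
    [0.0, 0.0, 1.0],   # state 2 is absorbing
])

p10, p11, p12 = P[1]

# When the process finally leaves state 1, it enters state 2 with
# conditional probability p12 / (p10 + p12).
prob_absorb_2 = p12 / (p10 + p12)

# The number of steps spent before leaving state 1 is geometric with
# success probability 1 - p11, so the mean time to absorption is
# 1 / (1 - p11).
expected_time = 1.0 / (1.0 - p11)

print(prob_absorb_2)   # 0.5 with the assumed row
print(expected_time)   # 5/3 with the assumed row
```

With the assumed row the process is equally likely to land in either absorbing state, and on average it takes 1/0.6 ≈ 1.67 steps to be absorbed; substituting the actual row of P from the problem gives the required answers.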