Question:

The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability P(i,j), where P(0,0) = 0.4, P(0,1) = 0.6, P(1,0) = 0.2, P(1,1) = 0.8. Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability pi and is "bad" with probability qi = 1 − pi, i = 0, 1.
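As a quick sanity check of the setup above, the transition probabilities can be written as a matrix whose rows must each sum to 1 (a sketch; the variable names are illustrative, not from the problem):

```python
# Transition matrix from the problem statement: P[i][j] = P(i,j),
# the probability of moving from state i today to state j tomorrow.
P = [[0.4, 0.6],
     [0.2, 0.8]]

# Each row is a probability distribution over tomorrow's state,
# so every row must sum to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```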

(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?

(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?

(c) In the long run, what proportion of messages are good?

(d) Let Yn equal 1 if a good message is sent on day n and let it equal 2 otherwise.

Is {Yn, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.
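Parts (a)–(c) can be worked out by conditioning on the state of the chain on the relevant day: (a) uses one step of the chain, (b) uses four steps (Monday to Friday), and (c) uses the stationary distribution. Since p0 and p1 are left symbolic in the problem, the sketch below assumes illustrative values p0 = 0.9 and p1 = 0.3 just to make the arithmetic concrete:

```python
# Assumed illustrative values for P(good | state i); the problem
# leaves p0 and p1 symbolic.
p = [0.9, 0.3]
P = [[0.4, 0.6],
     [0.2, 0.8]]

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# (a) Tuesday is one step after Monday, so condition on Tuesday's state:
#     answer = P(0,0)*p0 + P(0,1)*p1 = 0.4*p0 + 0.6*p1.
a = sum(P[0][j] * p[j] for j in range(2))

# (b) Friday is four steps after Monday, so use row 0 of P^4:
#     answer = P^4(0,0)*p0 + P^4(0,1)*p1.
P2 = matmul(P, P)
P4 = matmul(P2, P2)
b = sum(P4[0][j] * p[j] for j in range(2))

# (c) The stationary distribution of a two-state chain is
#     pi0 = P(1,0)/(P(0,1)+P(1,0)) = 0.2/0.8 = 0.25, pi1 = 0.75,
#     so the long-run proportion of good messages is 0.25*p0 + 0.75*p1.
pi0 = P[1][0] / (P[0][1] + P[1][0])
c = pi0 * p[0] + (1 - pi0) * p[1]
```

For (d), the usual answer is that {Yn} is generally not a Markov chain: the message reveals only partial information about the underlying state, so the probability that tomorrow's message is good depends on the full history of messages (which refines the conditional distribution of the hidden state), not just on today's message. The exception is the degenerate case p0 = p1, where the messages are i.i.d.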
