Question:

Suppose a process can be considered to be in one of two states (let's call them state A and state B), but the next state of the process depends not only on the current state but also on the previous state. We can still describe this process using a Markov chain, but we will now need four states. The chain will be in state (X, Y), X, Y ∈ {A, B}, if the process is currently in state X and was previously in state Y.
(a) Show that the transition probability matrix of such a four-state Markov chain must have zeros in at least half of its entries.
(b) Suppose that the transition probability matrix is given by

                (A,A)  (A,B)  (B,A)  (B,B)
        (A,A) [  0.8    0.2    0      0   ]
    P = (A,B) [  0      0      0.4    0.6 ]
        (B,A) [  0.6    0.4    0      0   ]
        (B,B) [  0      0      0.1    0.9 ]

Find the steady-state distribution of the Markov chain.

(c) What is the steady-state probability that the underlying process is in state A?

Step by Step Solution


Step 1, part (a): If the chain is currently in state (X, Y), then only two of the four states are reachable in one step, because the state of the underlying process at the current step (X) is carried over into the next chain state. Each row of the 4x4 transition probability matrix therefore contains at most two nonzero entries, so at least 8 of its 16 entries must be zero.
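A quick way to see this on the matrix from part (b) is to count its zero entries. The short NumPy check below is an illustration added here, not part of the posted solution:

```python
import numpy as np

# Transition matrix from part (b), states ordered (A,A), (A,B), (B,A), (B,B).
P = np.array([
    [0.8, 0.2, 0.0, 0.0],
    [0.0, 0.0, 0.4, 0.6],
    [0.6, 0.4, 0.0, 0.0],
    [0.0, 0.0, 0.1, 0.9],
])

zero_entries = np.count_nonzero(P == 0.0)   # structural zeros forced by part (a)
print(f"{zero_entries} of {P.size} entries are zero")  # 8 of 16 entries are zero
assert zero_entries >= P.size // 2          # at least half of the entries are zero
```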


