Question: (20 points) True or false. You may state the reason for your choice for possible partial credit if your selection is incorrect (if you believe that your selection is correct, you do not need to write any explanation).

(a) For a transition probability matrix associated with a Markov chain, the sum of the elements in each row must be exactly equal to 1; the sum of the elements in each column does not have to be 1, but cannot exceed 1.

(b) The Markov chain is a powerful modeling tool: even when the probability distribution over which state the chain visits in the next step depends on the past n > 1 steps, we can always expand the state space to retain the Markov property.

(c) The transition probability matrix for a Markov chain must have the same number of rows and columns.

(d) If the number of possible states for a stochastic process is infinite, then this process cannot be modeled as a Markov chain, because the transition probability matrix must have a finite number of rows and columns.

(e) If we start from a transient state of a Markov chain, then sooner or later we will enter some recurrent class and never leave that class.

(f) Suppose that a Markov chain has a transient state i and a recurrent state j; then, starting from state i, the expected number of visits to state j is either 0 or ∞.

(g) In a one-dimensional random walk with probability 0.6 of going forward and probability 0.4 of going backward, regardless of where you start, the expected number of visits to each state must be finite.

(h) Suppose the stochastic process {X_n} can be modeled as a Markov chain; then for any two different states i and j, P[X_4 = j | X_3 = i] = P[X_2 = j | X_1 = i].

(i) Suppose that a Markov chain has multiple absorbing states; then, starting from any transient state, the sum of the probabilities that the Markov chain will end up in each absorbing state is equal to 1.
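Several of these parts turn on concrete numerical facts that are easy to check empirically. The sketch below is illustrative only and is not part of the original question: it uses a small hypothetical 3-state transition matrix (my own choice, not from the problem) to inspect the row and column sums discussed in (a), and runs a quick Monte Carlo simulation of the biased random walk in (g) to count returns to the starting state.

```python
import random
import numpy as np

# Hypothetical 3-state transition matrix (chosen only for illustration).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.0, 0.0, 1.0],  # state 2 is absorbing
])

# Every row of a stochastic matrix sums to exactly 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Column sums are unconstrained: here column 2 sums to 1.5 > 1,
# which bears directly on the claim in part (a).
print("column sums:", P.sum(axis=0))

def returns_to_start(p_forward=0.6, steps=100_000, seed=0):
    """Count returns to the starting position of a biased 1-D random walk."""
    rng = random.Random(seed)
    pos, returns = 0, 0
    for _ in range(steps):
        pos += 1 if rng.random() < p_forward else -1
        returns += (pos == 0)
    return returns

# With drift +0.2 per step the walk is transient, so the observed
# number of returns stays small even over many steps (part (g)).
print("returns to start:", returns_to_start())
```

Under these assumptions, the column-sum printout immediately exhibits a value above 1, and the simulated walk returns to its start only a handful of times over 100,000 steps, consistent with the transience of an asymmetric random walk.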
