3.1. A two-state Markov chain has the transition probability matrix

$$
P =
\begin{array}{c|cc}
 & 0 & 1 \\ \hline
0 & 1-a & a \\
1 & b & 1-b
\end{array}
$$

(a) Determine the first return distribution.

(b) Verify equation (3.2) when i = 0. (Refer to III, (5.2).)
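For part (b), the renewal relation $P^{(n)}_{00} = \sum_{k=1}^{n} f^{(n=k)}_{00}\, P^{(n-k)}_{00}$ can be checked numerically. A minimal sketch, assuming the illustrative parameter values a = 0.3, b = 0.5 (not part of the problem statement); the first-return probabilities used below follow from the chain's structure: the chain returns to 0 in one step by staying put (probability 1-a), or leaves to state 1, lingers there for n-2 steps, and then returns (probability a(1-b)^{n-2}b).

```python
import numpy as np

a, b = 0.3, 0.5  # hypothetical parameter values, chosen only for this check
P = np.array([[1 - a, a],
              [b, 1 - b]])

def f00(n):
    # First-return probability to state 0 at exactly step n:
    # n = 1: stay in state 0; n >= 2: go to 1, stay n-2 steps, come back.
    if n == 1:
        return 1 - a
    return a * (1 - b) ** (n - 2) * b

N = 20
# n-step transition probabilities P^{(n)}_{00}, with P^{(0)}_{00} = 1
Pn = [np.linalg.matrix_power(P, n)[0, 0] for n in range(N + 1)]

# Renewal equation: P^{(n)}_{00} = sum_{k=1}^{n} f00(k) * P^{(n-k)}_{00}
for n in range(1, N + 1):
    rhs = sum(f00(k) * Pn[n - k] for k in range(1, n + 1))
    assert abs(Pn[n] - rhs) < 1e-12
print("renewal equation verified for n = 1 ..", N)
```

With these values the assertions pass for every n up to 20; the same check works for any 0 < a, b < 1.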