Question:
Consider a Markov chain with transition matrix
[0.1 0.1 0.8
0.5 0.5 0
0.7 0 0.3]
If the initial probability distribution is:
[0.3 0.3 0.4]
then the probability distribution in the next observation is:
Step by Step Solution
The solution involves three steps.
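Assuming the standard row-vector convention for Markov chains (each row of the transition matrix sums to 1), the distribution at the next observation is the vector-matrix product pi1 = pi0 * P.

Step 1: Set up the product
pi1 = [0.3 0.3 0.4] * [0.1 0.1 0.8
                       0.5 0.5 0
                       0.7 0 0.3]

Step 2: Compute each entry
pi1(1) = 0.3(0.1) + 0.3(0.5) + 0.4(0.7) = 0.03 + 0.15 + 0.28 = 0.46
pi1(2) = 0.3(0.1) + 0.3(0.5) + 0.4(0)   = 0.03 + 0.15 + 0    = 0.18
pi1(3) = 0.3(0.8) + 0.3(0)   + 0.4(0.3) = 0.24 + 0    + 0.12 = 0.36

Step 3: Check the result
The entries sum to 0.46 + 0.18 + 0.36 = 1, so the next-step distribution is [0.46 0.18 0.36].

A minimal NumPy sketch to verify the product (the names P, pi0, pi1 are illustrative, not from the question):

import numpy as np

# Transition matrix and initial distribution as given in the question
P = np.array([[0.1, 0.1, 0.8],
              [0.5, 0.5, 0.0],
              [0.7, 0.0, 0.3]])
pi0 = np.array([0.3, 0.3, 0.4])

pi1 = pi0 @ P        # next-step distribution: pi1 = pi0 * P
print(pi1)           # [0.46 0.18 0.36]
print(pi1.sum())     # 1.0 (up to floating-point rounding)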
