I know this is a Markov chain type problem, but I want to understand how to do it. Thanks.

A gambler starts a game of chance with $1. Each round he bets $1, with a 50/50 chance of winning $1 or losing the $1 he bet. If he reaches $4, he gives $1 away to the local school for delinquent rabbits and keeps playing with his remaining $3. If he hits $0 he stops playing.

a. Draw the transition probability diagram.
b. What is the probability that he has $2 after 3 rounds of gambling?
c. What is the meaning behind π($1)?
d. What is π?
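The numerical parts can be sanity-checked with a short script. Below is a minimal Python sketch, assuming one common reading of the problem: reaching $4 immediately triggers the $1 donation, so the effective states are bankrolls $0 through $3, with $0 absorbing. It builds the one-step transition matrix, propagates the starting distribution three rounds to get the probability of holding $2, and approximates the long-run distribution π by raising the matrix to a large power.

```python
import numpy as np

# States are the gambler's bankroll: $0, $1, $2, $3.
# Assumption: winning at $3 would give $4, but the $1 donation
# immediately returns him to $3, so $4 never appears as a state.
# $0 is absorbing (he stops playing).
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # from $0: stays at $0
    [0.5, 0.0, 0.5, 0.0],   # from $1: lose -> $0, win -> $2
    [0.0, 0.5, 0.0, 0.5],   # from $2: lose -> $1, win -> $3
    [0.0, 0.0, 0.5, 0.5],   # from $3: lose -> $2, win -> $4, donate -> $3
])

# (b) Distribution after 3 rounds, starting from $1.
start = np.array([0.0, 1.0, 0.0, 0.0])
after3 = start @ np.linalg.matrix_power(P, 3)
print("P(bankroll = $2 after 3 rounds) =", after3[2])

# (c)/(d) pi is the long-run (stationary) distribution satisfying pi = pi P;
# pi($1) is the long-run probability of holding $1. Raising P to a large
# power approximates it from the $1 starting state.
pi = np.linalg.matrix_power(P, 200)[1]
print("approximate long-run distribution pi =", pi.round(4))
```

Under this reading, $0 is the only absorbing state and every other state can reach it, so the approximation shows π piling all of its mass on $0, which is what part (c) is asking you to interpret for π($1).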