
Problem 2. You play a game in which each turn you have a chance to win $1. If you have at least $1, then you win $1 with probability 2/5 and you lose $1 with probability 3/5. If you have $0, then you win $1 with probability 2/5, and nothing happens with probability 3/5. You start off with $5 and play forever.

(a) (10p) Model this situation using a Markov chain, and show that there exists a unique stationary distribution for this Markov chain.

(b) (5p) Let T5 be the smallest number of turns needed until you have exactly $5 again. Determine the expectation of T5.

(c) (5p) Show that you will almost surely have exactly $1000 at least once during this game.

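As a numerical sanity check of parts (a) and (b): the wealth process is a birth-death chain on {0, 1, 2, ...} with up-probability 2/5 and down-probability 3/5 (with a self-loop at $0), and detailed balance gives the geometric stationary distribution pi(n) = (1/3)(2/3)^n, hence E[T5] = 1/pi(5) = 729/32. The sketch below truncates the chain at a hypothetical level N = 60 (an assumption for computation only; the neglected tail mass is about (2/3)^61) and recovers pi by power iteration:

```python
import numpy as np

# Wealth chain truncated to states 0..N; the tail mass beyond N is negligible.
# Assumption: from $0 you win $1 w.p. 2/5 and stay put w.p. 3/5.
N = 60
p, q = 2 / 5, 3 / 5
P = np.zeros((N + 1, N + 1))
P[0, 0], P[0, 1] = q, p            # self-loop at $0, win $1 otherwise
for n in range(1, N):
    P[n, n - 1], P[n, n + 1] = q, p
P[N, N - 1], P[N, N] = q, p        # reflect at the truncation boundary

# Stationary distribution by power iteration (chain is aperiodic via the
# self-loop at 0, so pi @ P converges to the unique stationary vector).
pi = np.full(N + 1, 1 / (N + 1))
for _ in range(20000):
    pi = pi @ P
pi /= pi.sum()

# Detailed balance pi(n) * 2/5 = pi(n+1) * 3/5 gives pi(n) = (1/3)(2/3)^n,
# and by the return-time formula E[T5] = 1 / pi(5) = 729/32 = 22.78125.
pi_exact = (1 / 3) * (2 / 3) ** np.arange(N + 1)
print(pi[5], pi_exact[5])   # both close to 32/729
print(1 / pi_exact[5])      # 22.78125
```

The agreement between the iterated and closed-form values supports the positive-recurrence claim in (a); part (c) then follows because an irreducible recurrent chain visits every state, including $1000, with probability 1.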
