
- 1. Introduction to Markov chains Examples of Markov chains: - Random walk on a line
- 2. Introduction to Markov chains Examples of Markov chains: - Random walk on a graph
- 3. Introduction to Markov chains Examples of Markov chains: - Random walk on a hypercube
- 4. Introduction to Markov chains Examples of Markov chains: - Markov chain on graph colorings
- 5. Introduction to Markov chains Examples of Markov chains: - Markov chain on graph colorings
- 6. Introduction to Markov chains Examples of Markov chains: - Markov chain on graph colorings
- 7. Introduction to Markov chains Examples of Markov chains: - Markov chain on matchings of a graph
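The first example above, the random walk on a line, can be sketched as a simulation: from state x the walk moves to x−1 or x+1 with probability 1/2 each (a minimal sketch; the function name and parameters are illustrative, not from the slides):

```python
import random

def random_walk_on_line(start, steps, seed=0):
    """Simulate a simple random walk on the integers: from state x,
    move to x - 1 or x + 1 with probability 1/2 each."""
    rng = random.Random(seed)
    x = start
    path = [x]
    for _ in range(steps):
        x += rng.choice([-1, 1])
        path.append(x)
    return path

path = random_walk_on_line(0, 10)
```

The same skeleton covers the other examples: only the neighbour-selection rule changes (graph neighbours, hypercube coordinates, recolourings, matching moves).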
- 8. Definition of Markov chains - State space Ω - Transition matrix P, of size |Ω|×|Ω| - P(x,y) is the probability of getting from state x to state y - ∑_{y∈Ω} P(x,y) = 1
- 9. Definition of Markov chains - State space Ω - Transition matrix P, of size |Ω|×|Ω| - P(x,y) is the probability of getting from state x to state y - ∑_{y∈Ω} P(x,y) = 1
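The definition can be made concrete with a small transition matrix. The example below uses a random walk on a 4-cycle (an illustrative choice, not from the slides) and checks the row-sum condition ∑_{y∈Ω} P(x,y) = 1:

```python
# Transition matrix P for a random walk on a 4-cycle (states 0..3):
# from each state, move to either neighbour with probability 1/2.
n = 4
P = [[0.0] * n for _ in range(n)]
for x in range(n):
    P[x][(x - 1) % n] = 0.5
    P[x][(x + 1) % n] = 0.5

# Every row of a stochastic matrix sums to 1: sum over y of P(x, y) = 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```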
- 10. Definition of Markov chains Example: frog on lily pads
- 11. Definition of Markov chains Stationary distribution π (on states in Ω): Example: frog with unfair coins
- 12. Definition of Markov chains Other examples:
- 13. Definition of Markov chains - aperiodic - irreducible - ergodic: a MC that is both aperiodic and irreducible Thm (3.6 in Jerrum): An ergodic MC has a unique stationary distribution. Moreover, it converges to it as time goes to ∞.
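The theorem above suggests a simple numerical way to find the stationary distribution: start from any distribution and repeatedly apply P (power iteration). The 2-state ergodic chain below is hypothetical; its numbers are illustrative, not from the slides:

```python
# A hypothetical 2-state ergodic chain (numbers chosen for illustration).
P = [[0.9, 0.1],
     [0.3, 0.7]]

pi = [1.0, 0.0]  # start from an arbitrary distribution
for _ in range(1000):
    # One step of pi -> pi P
    pi = [sum(pi[x] * P[x][y] for x in range(2)) for y in range(2)]

# At the fixed point pi P = pi; for this chain pi = (0.75, 0.25).
```

By the theorem, the same limit is reached from any starting distribution, which is exactly what "converges as time goes to ∞" promises.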
- 14. Gambler’s Ruin A gambler starts with k dollars and bets on fair coin flips: a correct guess wins 1 dollar, an incorrect guess loses 1 dollar. She stops when she reaches either 0 or n dollars. Is it a Markov chain? How many bets until she stops playing? What are the probabilities of ending at 0 vs. n dollars?
- 15. Gambler’s Ruin Stationary distribution?
- 16. Gambler’s Ruin Starting with k dollars, what is the probability of reaching n? What is the probability of reaching 0?
- 17. Gambler’s Ruin How many bets until 0 or n?
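The questions on the slides above have classical answers for a fair coin: starting from k, the probability of reaching n before 0 is k/n, and the expected number of bets is k(n−k). A minimal simulation sketch (the parameters k = 3, n = 10 are illustrative, not from the slides):

```python
import random

def gamblers_ruin(k, n, rng):
    """One play: start with k dollars, bet $1 on fair coin flips,
    stop at 0 or n. Returns (final fortune, number of bets)."""
    x, steps = k, 0
    while 0 < x < n:
        x += rng.choice([-1, 1])
        steps += 1
    return x, steps

rng = random.Random(0)
k, n, trials = 3, 10, 20000
wins = total_steps = 0
for _ in range(trials):
    final, steps = gamblers_ruin(k, n, rng)
    wins += (final == n)
    total_steps += steps

# Theory: P(reach n) = k/n = 0.3 and E[number of bets] = k(n-k) = 21;
# the empirical averages below should be close to these values.
print(wins / trials, total_steps / trials)
```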
- 18. Coupon Collecting - n coupons - each time get a coupon at random, chosen from 1,…,n (i.e., likely will get some coupons multiple times) - how long to get all n coupons? Is this a Markov chain?
- 19. Coupon Collecting How long to get all n coupons?
- 20. Coupon Collecting How long to get all n coupons?
- 21. Coupon Collecting Proposition: Let τ be the random variable for the time needed to collect all n coupons. Then, Pr(τ > ⌈n log n + cn⌉) ≤ e^(−c).
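The coupon-collector time can be checked empirically: the expected number of draws is n·H_n, where H_n = 1 + 1/2 + … + 1/n, which is roughly n log n as in the proposition above. A sketch (n = 50 is an illustrative choice, not from the slides):

```python
import math
import random

def collect_all(n, rng):
    """Draw uniform coupons from {0, ..., n-1} until all n have
    appeared; return the number of draws."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(0)
n, trials = 50, 2000
avg = sum(collect_all(n, rng) for _ in range(trials)) / trials

# Theory: E[draws] = n * H_n, the n-th harmonic number times n.
h_n = sum(1.0 / i for i in range(1, n + 1))
print(avg, n * h_n)  # the two numbers should be close
```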
- 22. Random Walk on a Hypercube How many expected steps until we get to a random state? (I.e., what is the mixing time?)
- 23. Random Walk on a Hypercube How many expected steps until we get to a random state? (I.e., what is the mixing time?)
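For the hypercube, a standard coupling argument (in the spirit of the coupon-collector slides) bounds the mixing time: once every coordinate of the lazy walk has been refreshed at least once, the state is exactly uniform, so mixing takes O(d log d) expected steps. A sketch that conservatively counts only actual coordinate flips (d = 8 is an illustrative choice, not from the slides):

```python
import random

def lazy_hypercube_step(x, rng):
    """One step of the lazy random walk on the hypercube {0,1}^d:
    with probability 1/2 stay put, otherwise flip a uniformly random
    coordinate. Returns the new state and the flipped index (or None)."""
    if rng.random() < 0.5:
        return x, None
    i = rng.randrange(len(x))
    return x[:i] + (1 - x[i],) + x[i + 1:], i

# Run until every coordinate has been flipped at least once; this is a
# coupon-collector time, so it takes O(d log d) steps in expectation.
# (Counting only flips, not lazy "refreshes", is a conservative bound.)
rng = random.Random(0)
d = 8
x = (0,) * d
refreshed, steps = set(), 0
while len(refreshed) < d:
    x, i = lazy_hypercube_step(x, rng)
    if i is not None:
        refreshed.add(i)
    steps += 1
```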
