Isle of Wight Maths Society

  Probability and Gambling
Introduction

Where did the ideas of probability
originate?

What is probability?

Elementary probability theory

Examples from the world of
gambling
A little background to the development
of probability theory
"A gambler's dispute in 1654 led to the creation of a
mathematical theory of probability by two famous French
mathematicians, Blaise Pascal and Pierre de Fermat.

 Because of the inherent appeal of games of chance,
probability theory soon became popular, and the subject
developed rapidly during the 18th century. The major
contributors during this period were Jakob Bernoulli
(1654-1705) and Abraham de Moivre (1667-1754).

In 1812 Pierre de Laplace (1749-1827) introduced a host of
new ideas and mathematical techniques in his book, Théorie
Analytique des Probabilités. Laplace applied probabilistic
ideas to many scientific and practical problems. The theory
of errors, actuarial mathematics, and statistical mechanics
are examples of some of the important applications of
probability theory developed in the 19th century."
What is 'probability' and how do we measure it?
We envisage a situation in which an outcome or event is in doubt -
'the face of uncertainty'!

Probability may be regarded as the measurement of uncertainty of the
event and the theory developed axiomatically as an exercise in
measure theory. However, for practical purposes, statisticians have
chosen the following possible definitions:


Equally likely outcomes
If an event E happens when one of a proportion p of 'equally likely' outcomes
actually occurs, we say that the probability of E is p and write:
                             P(E) = p
This definition is frequently adopted when calculating probabilities in games of
chance e.g. roulette, card games, dice games, coin tossing etc.



Relative frequency
If, 'in the long run', E happens 100*p% of the time, then we say:
                             P(E) = p
If we cannot reasonably assume 'equally likely outcomes', then this definition
provides a practical way of estimating p through sampling.


Degree of belief
If you personally regard the 'odds' that E occurs, as opposed to not occurring,
to be p/(1-p), then we say:
                             P(E) = p
Axioms of Probability Theory

For any event E in a sample space S:
      0 <= P(E) <= 1
      P(S) = 1
      P(A or B) = P(A) + P(B) if A and B are mutually exclusive

Mutually exclusive events cannot occur together. If all events in a
sample space are ME then we say they are mutually exclusive and
exhaustive (MEE).

NOTE: The probability that E does NOT occur is:

      P(not E) = 1 - P(E)

     If P(E) = 1 => E certain to occur


     If P(E) = 0 => E certain not to occur




                     Outcome spaces (sets)
This is a description of the outcomes of interest in the calculation of
probabilities.
For a gambling situation this could be simply:
S = { win, lose} and P(win) = 0.4 and P(lose) = 0.6
The events 'win' and 'lose' would be associated with the outcome of the gamble
(or trial).
Thus, if we were to toss two (fair) coins and gamble on two heads occurring, then:
      S = {[H,H], [H,T], [T,H], [T,T]} assuming equally likely outcomes.
      P(win) = P([H,H]) = 1/4 and P(lose) = 3/4
If we bet on the faces being different, i.e. the COMBINATION {H,T} occurs,
then P({H,T}) = P([H,T]) + P([T,H]) = 1/4 + 1/4 = 1/2
or, put simply, 2 cases out of four.
[Note: In this trial we could toss a single coin twice instead of tossing two coins,
and we assume P(coin lands on edge) = 0.]
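
To make the 'equally likely outcomes' calculation concrete, here is a minimal Python sketch (not part of the original slides) that enumerates the two-coin outcome space and counts cases; the event functions used are illustrative.

from itertools import product
from fractions import Fraction

# Outcome space for two fair coins: [H,H], [H,T], [T,H], [T,T]
outcomes = list(product("HT", repeat=2))

def prob(event):
    # Equally likely outcomes: P(E) = (favourable cases) / (total cases)
    favourable = sum(1 for o in outcomes if event(o))
    return Fraction(favourable, len(outcomes))

print(prob(lambda o: o == ("H", "H")))   # 1/4 - P(win) when betting on two heads
print(prob(lambda o: o[0] != o[1]))      # 1/2 - P(faces different)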
Combinations, Permutations and the Binomial Discrete Probability Distribution

 In the calculation of probabilities where equally likely outcomes are assumed, many problems
require us to calculate the number of ways of selecting r objects (or events) from n objects (or
events), without regard to order. Each selection of r objects is called a COMBINATION. A
combination is just a sub-set. A permutation is a selection in a particular order.
e.g. toss a coin four times (or toss four coins at once). How many of the possible outcomes would
show 1, 2, 3 or 4 heads?
There are 16 possible ways (permutations) the coins can be arranged i.e.
                    S = {HHHH, HHHT, HHTH, .. ,TTTT}
 each one is a specific outcome, so our outcome space in this sense would have 16 elements.
However, suppose we are only interested in the number of heads that show. The outcome space
would be {0, 1, 2, 3, 4} and if we were gambling on the possibility of 3 heads our outcome space
could be simply {3 heads, not 3 heads}!


But how do we find how many elements of S (all equally likely - why?) would have 0,1,2,3 and 4
heads?
We could count and then draw up a table (the number of heads, N(H), is a random variable):

      N(H)        0      1      2      3      4
      Frequency   1      4      6      4      1

Thus for 3 heads the chances are 4/16 = 1/4, i.e. P(3 heads) = 0.25
In terms of probabilities we have the probability distribution:

      E           0      1      2      3      4
      P(E)        1/16   4/16   6/16   4/16   1/16

This is an example of a discrete probability distribution.
Counting the Permutations
The four coins could be re-arranged amongst themselves in 24 different ways
(permutations) viz:
1st coin choice = 4 ways, followed by 2nd coin choice = 3 ways, and so on, giving 4*3*2*1 =
24 ways or 4! (four factorial).
In general n objects could be re-arranged in n! different ways. Now consider where r of the
coins show H and (n-r) show T. Clearly the r lots of H could be re-arranged amongst
themselves in r! ways without it 'looking different', and similarly the (n-r) tails in (n-r)! ways.
Hence the number of different arrangements of n coins with r heads and (n-r) tails is:

                     n! / (r! * (n-r)!)                                    1.0

and denoted as nCr.

It can easily be shown that this can be written as:

                     n*(n-1)*(n-2)*...*(n-r+1) / r!                        1.1

So, if we had 10 coins, the number of ways that 4 heads (or 6 tails) could appear is:
                     (10*9*8*7) / (4*3*2*1) = 210
NOTE: Equation 1.1 is also used to calculate the number of COMBINATIONS of any r
objects out of n objects. This will be explained in the talk. Thus, given 5 objects, I can select
3 objects in
                     5C3 = (5*4*3) / (3*2*1) = 10 ways.
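
As a quick check on these counts, here is a short Python sketch (an addition to the slides) using the standard library's math.comb, which computes nCr directly.

from math import comb, factorial

# Equation 1.0: n! / (r! (n-r)!)
def n_choose_r(n, r):
    return factorial(n) // (factorial(r) * factorial(n - r))

print(n_choose_r(4, 3), comb(4, 3))     # 4   - arrangements of 3 heads among 4 coins
print(n_choose_r(10, 4), comb(10, 4))   # 210 - 4 heads among 10 coins
print(n_choose_r(5, 3), comb(5, 3))     # 10  - choosing 3 objects from 5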
What about the probabilities?
To answer this we need to look at joint events/outcomes
Joint Probabilities and Independent Outcomes


If I toss a fair coin, then P(H) = P(T) = 1/2
This is true if I repeat the trial. This is because the two trials or experiments have independent
outcomes.


The joint occurrence of H followed by H is P(H) * P(H) = 1/2 * 1/2 = 1/4
This could have been arrived at by looking at all the joint outcomes ([H,H], [H,T], [T,H], [T,T])
and assuming equally likely probabilities for each.
In general, for n INDEPENDENT events,

            P(E1 and E2 and ... and En) = P(E1) * P(E2) * ... * P(En)              2.0


Thus, in tossing 5 coins, the probability of getting exactly 3 heads

       = (number of ways 3 heads could occur) * P(H) * P(H) * P(H) * P(T) * P(T)

       = 10 * (1/2)^5 = 10/32 = 5/16 = 0.3125
General result
If in n independent trials the probability of an outcome or event is p and the probability it
doesn't occur is (1-p) = q, then the probability of exactly r successes is given by:

                             P(r) = nCr * p^r * q^(n-r)

This is called the binomial probability distribution function. It should be noted that the value 'r' is
the realisation of a random variable. If we denote the random variable by X, it is customary to
write:

                             P(X = r) = P(r) = nCr * p^r * q^(n-r)                  3.0
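
The binomial formula is easy to evaluate directly; the sketch below (not from the slides) reproduces the 5-coin example and the 4-coin distribution used earlier.

from math import comb

def binomial_pmf(r, n, p):
    # P(X = r) = nCr * p^r * q^(n-r), with q = 1 - p
    return comb(n, r) * p**r * (1 - p)**(n - r)

print(binomial_pmf(3, 5, 0.5))                      # 0.3125 - exactly 3 heads in 5 tosses
print([binomial_pmf(r, 4, 0.5) for r in range(5)])  # 1/16, 4/16, 6/16, 4/16, 1/16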
Non-independent outcomes - conditional probability and Bayes' Theorem

Given two outcomes or events A and B, say, which are not independent, i.e.

                       P(A and B) ≠ P(A) * P(B)

then we express the joint occurrence in terms of a conditional probability, writing
'P(A given B)' as P(A|B):

              P(A and B) = P(A|B) * P(B) = P(B|A) * P(A)


This leads to a form of Bayes' theorem:

                     P(A|B) = P(B|A) * P(A) / P(B)                                  4.0

The result can be generalised: if B1, B2, ..., Bn are mutually exclusive and exhaustive
events, then

                     P(Bi|A) = P(A|Bi) * P(Bi) / [P(A|B1)*P(B1) + ... + P(A|Bn)*P(Bn)]      4.1

                           (See video example)
This is an extremely important result in probability theory and has been used as a basis for
reasoning under uncertainty in 'expert' systems, a branch of artificial intelligence.
       Example: from a standard pack of 52 cards I select two cards. What is the probability
they are both aces?
       The first card selected is an ace with probability 4/52 = 1/13.
It is wrong to infer that the probability of two aces is therefore 1/13 * 1/13, because drawing
the first ace has altered the probability of drawing a second ace from the remaining cards.
       P(A1) = 4/52 and P(A2|A1) = 3/51,
       hence P(A1 and A2) = P(A1) * P(A2|A1) = 4/52 * 3/51 = 1/221.
       We could have deduced this result from combinatorial analysis (left as an exercise!)
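
A small numerical check of the two-ace example, including the combinatorial route left as an exercise (this sketch is an addition, not part of the talk):

from fractions import Fraction
from math import comb

# Conditional-probability route: P(A1) * P(A2 | A1)
print(Fraction(4, 52) * Fraction(3, 51))     # 1/221

# Combinatorial route: (ways of choosing 2 aces) / (ways of choosing any 2 cards)
print(Fraction(comb(4, 2), comb(52, 2)))     # 1/221 again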
Examples:
The National Lottery
Probability of Jackpot
Select 6 numbers from 49


Total number of possible combinations = 49C6 = (49*48*47*46*45*44) / (6*5*4*3*2*1)
                                             = 13,983,816
 P(jackpot) = 0.0000000715 or 13,983,815:1 against


Imagine putting nearly 14 million raffle tickets into a box and selecting just 1 which is the
winner.
Most people gamble on this extremely unlikely event because the prize is large. However, the
prize is rarely £13,983,816 on a single draw. Anything less and the bet is technically 'unfair'. We
shall look at EXPECTATIONS shortly.


We can calculate this probability another way.
We have 1 chance in 49 of selecting the 'first' number = 1/49.
1/48 of selecting the second correctly,
1/47 the third, etc.
So to get all six numbers in the order of the draw (a permutation) the probability is:
                            1/49 * 1/48 * ... * 1/44 (conditional probabilities)
But these 6 numbers can be arranged in any order in 6! = 720 ways, so the probability is:
                            720 * 1/49 * 1/48 * ... * 1/44 = 0.0000000715 as before.


Probability of three numbers correct out of the six chosen
As before,
P(first number matches) = 6/49
P(second matches) = 5/48
P(third matches) = 4/47
P(fourth doesn't match) = 43/46
P(fifth doesn't match) = 42/45
P(sixth doesn't match) = 41/44

Hence, P(selection as given) = 6/49 * 5/48 * 4/47 * 43/46 * 42/45 * 41/44 = 0.00088
We need to scale this by the number of ways three correct and three incorrect could occur
in the six selections = (6*5*4)/(3*2*1) = 20 ways.
So the probability = 20 * 0.00088 = 0.0177
or 0.9823 : 0.0177, roughly 55:1 against. (The payout offered for three matches corresponds
to odds of only about 10:1.)
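
Both lottery results can be reproduced with the hypergeometric counting argument; the following sketch (added here, assuming the classic 6-from-49 game) counts the ways of matching k of the six drawn numbers.

from math import comb
from fractions import Fraction

def p_match(k, picks=6, pool=49):
    # Choose k of the 6 winning numbers and (6 - k) of the 43 losing numbers
    return Fraction(comb(picks, k) * comb(pool - picks, picks - k), comb(pool, picks))

print(p_match(6))          # 1/13983816 - the jackpot
print(float(p_match(3)))   # ~0.0177    - three numbers correct, roughly 55:1 against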
Random Variables and Expectation
Given a (discrete) outcome space {E1, E2, ..., En} and a 1:1 mapping of each Ei
to a real-valued number xi,
i.e. f(Ei) -> xi, then X is a random variable which takes real values. We have already
met this in the case of tossing coins.
A probability distribution (pd) may be defined as a function f in which the domain
consists of the possible values that a random variable (rv) can take on, and the
range is composed of the probabilities associated with those values.
The probability that a random variable X can take on the value x is f(x) and usually
denoted as P(X = x).


Similar definitions can be given for continuous real-valued random variables from a
real domain.


Thus in tossing two coins then if X = number of heads, P(X = 2) = 1/4


Thus, a pd is a set of rv values and their associated probabilities.


The expectation of X, denoted by E(X), is defined as

                     E(X) = Σ x * P(X = x), summed over all values x of X

It can be interpreted as the arithmetic mean (called the 1st moment about the origin)
of the distribution.




Example (from the four-coin tossing distribution):
     x           0           1             2         3           4
   P(x)          1/16        4/16          6/16      4/16        1/16


           E(X) = 0*(1/16) + 1*(4/16) + 2*(6/16) + 3*(4/16) + 4*(1/16) = 2
Thus the expected number of heads (mean number) is 2 heads. This can be verified
experimentally by performing large numbers of tosses of 4 coins and averaging the
number of heads.
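
A direct calculation of E(X) for the four-coin distribution, as a minimal sketch added for illustration:

from fractions import Fraction

# P(X = x) for the number of heads in four fair coin tosses
dist = {0: Fraction(1, 16), 1: Fraction(4, 16), 2: Fraction(6, 16),
        3: Fraction(4, 16), 4: Fraction(1, 16)}

# E(X) = sum of x * P(X = x)
print(sum(x * p for x, p in dist.items()))   # 2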
Expectation in gambling
                               (expected loss or gain)

We shall use pdf to mean probability distribution function.
 Suppose we have a pdf as follows:
              Outcome WIN             LOSE
            Amount(x) £10             -£2
            P(x)         0.2          0.8


The expectation is:    10 * 0.2 + (-2) * 0.8 = 2 - 1.6 = £0.40
I.e. in the long run you will win 40p per bet (nice if you can find such a bet!!)


If E(X) = 0 then we say the bet is FAIR, in that in the long run neither side gains nor
loses.
If E(X) < 0 then in the long run you will always LOSE


           All games of chance with a negative expected gain
             means that in the long run you will lose money
                          Roulette is a classical example



Statistically, odds(E) is defined as P(E)/(1-P(E)) = P(win)/P(lose).
Bookmakers tend to quote odds the other way up, as P(lose)/P(win). Thus if a
bookmaker's odds were interpreted as personal probabilities then 2:1 against
implies P(lose)/P(win) = (1 - P(win))/P(win) = 2, so P(win) = 1/3 and P(lose) = 2/3.
Such probabilities do not necessarily reflect the 'true' values whatever these might
be. Games of pure chance such as roulette and the lottery allow the calculation of
probabilities based on equally likely outcomes. However, the payout is not based
on these odds since the 'house' or betting organiser must take a percentage. The
bet in these cases is not fair and hence E(X) < 0
Horse racing and certain card games are different in that they allow an element of
skill or judgment in deciding the bet.
Roulette
On the right is a standard roulette wheel having 18 red
slots, 18 black slots and one green zero (American wheels
have two green zeros, 0 and 00).


The payout odds are based solely on probability.

   Whether you bet on European or American
   tables the payout odds are simply:

                     (36 - n) : n,   i.e. (36/n - 1) to 1

   where n = number of squares bet on.
   For example, a single number (n = 1) pays 35:1 and red/black (n = 18) pays 1:1.
Expectations and the House Edge
Assume a single zero wheel and bet £1 on 24
Assume equally likely outcomes
     Total number of outcomes = 37
     P(ball in any given slot) = 1/37
     P(24) = 1/37 = P(win) and P(lose) = 36/37


If you lose, then you lose your £1 bet i.e. outcome = -£1 with probability 36/37
The house offers 35:1 as the return on a single slot bet, i.e. on any single number other
than zero. Thus your return is £35 (profit) if you win, with probability 1/37.
Note that the 'fair' payout would be 36:1 (a total return of 37 for each £1 staked).


Expectation on the bet = -1 * 36/37 + 35 * 1/37 = (35 - 36)/37 = -1/37 ≈ -£0.027


For a bet on red or black (also odd or even) the house offers even odds i.e. 1:1
P(lose) = 19/37 and P(win) = 18/37
E(bet ) = -1 * 19/37 + 1 * 18/37 = -1/37 = -£0.027 as before.


The single or double green zero is effectively the house 'edge' because the
payout odds do not include it, i.e. if you kept the house odds and removed the
zero(s) from the wheel you would have a fair bet.
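
The house-edge calculations can be written as one small helper; this is a sketch added here (the function name and signature are illustrative, not from the slides):

from fractions import Fraction

def expected_gain(stake, payout_odds, p_win):
    # E(bet) = (payout_odds * stake) * P(win) - stake * P(lose)
    return payout_odds * stake * p_win - stake * (1 - p_win)

# European wheel (37 slots): single number pays 35:1, red/black pays 1:1
print(expected_gain(1, 35, Fraction(1, 37)))    # -1/37, about -2.7p per £1 staked
print(expected_gain(1, 1, Fraction(18, 37)))    # -1/37, the same house edge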
Poker
The probabilities of getting the various hands at poker (e.g. four of a kind, pair,
two pair, etc.) can be found using the theoretical results we have already looked
at. However, calculating these probabilities can be quite tricky!


Hand                                     Probability              Percentage
No pair                                  1,303,560 in 2,598,960   50.16%
One pair                                 1,098,240 in 2,598,960   42.26%
Two pair                                 123,552 in 2,598,960     4.75%
Three of a kind                          54,912 in 2,598,960      2.11%
Straight                                 9,180 in 2,598,960       0.353%
Flush                                    5,112 in 2,598,960       0.197%
Full house                               3,744 in 2,598,960       0.144%
Four of a kind                           624 in 2,598,960         0.0240%
Straight flush (excluding Royal flush)   32 in 2,598,960          0.00123%
Royal straight flush                     4 in 2,598,960           0.000154%
TOTAL                                    2,598,960 in 2,598,960   100%


The total number of five card hands = 52C5 = 2,598,960 different hands
Example calculation:
Probability of three of a kind (e.g. three kings and any two others [not a pair])
For convenience let nCr denote the number of ways of choosing r objects from n.

For a given rank of four cards (e.g. Kh, Kd, Ks, Kc) we can select any three in 4C3
= 4 ways.
We have 13 ranks, so we can create 13 * 4 = 52 different triples ('three-of-a-kinds').
Now consider the cards not in the rank of the triple. The remaining two cards must come
from two different ranks (otherwise we would have a full house or four of a kind), and there
are 12 ranks left to choose from. Ignoring suits for the moment, we can select the two ranks
in 12C2 = 66 ways, and each card in the pair could come from any of the four suits, so the
total number of different non-pair pairs is 12C2 * 4^2 = 66 * 16 = 1,056.
Total number of 'three-of-a-kinds' = 52 * 1056 = 54,912
P(three-of-a-kind) = 54,912/2,598,960 = 0.02113
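
The same count follows directly from the combinatorial reasoning above; a brief check in Python (an addition to the slides):

from math import comb

total_hands = comb(52, 5)                              # 2,598,960 five-card hands

# Three of a kind: pick the rank of the triple, 3 of its 4 suits,
# then 2 different ranks from the other 12 and a suit for each.
three_of_a_kind = 13 * comb(4, 3) * comb(12, 2) * 4**2

print(three_of_a_kind)                                 # 54,912
print(round(three_of_a_kind / total_hands, 5))         # 0.02113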




         BUT - having a winning hand in poker
      doesn't mean you will win. Poker is not just
      a game of chance but a game of bluff also.
The Martingale Betting Strategy in Roulette
There are various forms of Martingale betting but the simplest is as follows:


Bet either on red/black or odd/even (payout odds are 1:1). If you lose then
double your bet and play again. Repeat until you win then restart your bet
sequence again. Foolproof???
 After each win you only win back your initial bet. For example, suppose you bet £1 initially
 and double up on each loss for the next bet:

 Bet (£)       1      2      4      8
 Outcome       L      L      L      W
 Net gain     -1     -3     -7     +1

 The final net gain is:
 Total returned = £16
 Winnings = £16 - £8 (stake on the winning spin) - £7 (previous losses) = £1



Assume the gambler is only prepared to bet on up to six losing spins in a row
(e.g. he/she may be bankrupt after this).
In reality, the odds of a streak of 6 losses in a row are much higher than many
people intuitively believe. Psychological studies have shown that since
people know that the odds of losing 6 times in a row out of 6 plays are low, they
incorrectly assume that in a longer string of plays the odds are also very low.
When people are asked to invent data representing 200 coin tosses, they often
do not add streaks of more than 5 because they believe that these streaks are
very unlikely.
       Assuming a US roulette wheel, the odds of losing a single spin at
       roulette are q = 20/38 = 52.6316%. If you play a total of 6 spins, the odds
       of losing 6 times are q^6 = 2.1256%. However, if you play more and more
       spins, the odds of losing 6 times in a row begin to increase rapidly.
      In 73 spins, there is a 50.3% chance that you will at some point have lost
      at least 6 spins in a row. (The chance of still being solvent after the first six
      spins is 0.978744, and the chance of becoming bankrupt at each
      subsequent spin is (1 - 0.526316) * 0.021256 = 0.010069, where the first
      term is the chance that you won the (n-6)th spin - if you had lost the (n-6)th
      spin, you would have become bankrupt on the (n-1)th spin. Thus over 73
      spins the probability of remaining solvent is 0.978744 * (1 - 0.010069)^67 =
      0.49683, and the chance of becoming bankrupt is 1 - 0.49683 = 50.3%.)

      Similarly, in 150 spins, there is a 77.2% chance that you will lose at least 6
      spins in a row at some point.
      And in 250 spins, there is a 91.1% chance that you will lose at least 6 spins
      in a row at some point.
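
These streak figures are easy to check by simulation; below is a rough Monte Carlo sketch (added here, assuming a US wheel and even-money bets) rather than the exact recursion quoted above.

import random

def p_six_loss_streak(num_spins, p_lose=20/38, trials=20_000):
    # Estimate P(at least one run of 6 consecutive losses in num_spins spins)
    hits = 0
    for _ in range(trials):
        streak = 0
        for _ in range(num_spins):
            streak = streak + 1 if random.random() < p_lose else 0
            if streak == 6:
                hits += 1
                break
    return hits / trials

for n in (73, 150, 250):
    print(n, p_six_loss_streak(n))   # roughly 0.50, 0.77 and 0.91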
Betting on the GG's

Bookmakers' payout odds on horses winning may not be coherent if considered
as probabilities. The probability of a horse winning is unknown but might be
guessed in relation to the other horses.
So, how much are you prepared to bet on a horse winning? A lot depends on
your knowledge of horse racing and betting, as well as how much you can
comfortably lose, I expect, but let us look at a possible way you could win.


If a bookmaker sets odds on horses winning which, if considered as probabilities,
are not coherent, i.e. do not sum to 1, then it is possible for the bookmaker to
ALWAYS win if gamblers bet on every horse in proportion to the implied odds.
If they don't bet this way they may lose even more.
In the example below the total implied probability is 1.05!! (not coherent with probability theory).

The following example is taken from Wikipedia.
Horse number   Offered odds     Implied probability   Bet price     Bookie pays if horse wins
      1        Even             0.5                   £100          £100 stake + £100
      2        3 to 1 against   0.25                  £50           £50 stake + £150
      3        4 to 1 against   0.2                   £40           £40 stake + £160
      4        9 to 1 against   0.1                   £20           £20 stake + £180
                         Total: 1.05           Total: £210          Always: £200


The bookmaker is guaranteed to make a £10 profit for this type of bet. It is called
a Dutch Book or 'lock'.


Now consider what happens if horse 4 is scratched and the bookmaker does not adjust the
odds. The bettor can guarantee to win £10 by betting on the remaining horses as before,
since the bookmaker only takes in £190 but must pay out £200 whichever horse wins.
With competitive fixed-odds gambling being offered electronically, gamblers can
sometimes create a Dutch book by selecting the best odds from different
bookmakers, in effect by undertaking an arbitrage operation.
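
The Dutch book arithmetic in the table can be reproduced in a few lines; this sketch is illustrative only (the dictionaries simply mirror the Wikipedia example above).

from fractions import Fraction

# Fractional odds of "a to 1 against" imply probability 1 / (a + 1)
offered_odds = {1: 1, 2: 3, 3: 4, 4: 9}      # horse -> odds against (evens = 1 to 1)
payout_target = 200                           # total returned whichever horse wins

implied = {h: Fraction(1, a + 1) for h, a in offered_odds.items()}
stakes = {h: payout_target * p for h, p in implied.items()}   # £100, £50, £40, £20

print(sum(implied.values()))   # 21/20 = 1.05, the 'over-round': not a coherent probability set
print(sum(stakes.values()))    # 210: £210 staked against a fixed £200 payout, £10 to the bookie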
Two interesting Problems in Probability




1. The Monty Hall Problem (see video)

2. The Birthday Problem
Solutions:


1. Change your selection




2. The birthday probabilities
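
The birthday probabilities referred to above follow from multiplying the conditional probabilities of 'no match so far'; the sketch below (an addition, ignoring leap years) gives the familiar result.

def p_shared_birthday(n, days=365):
    # P(at least two of n people share a birthday) = 1 - P(all birthdays distinct)
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (days - k) / days
    return 1 - p_distinct

print(round(p_shared_birthday(23), 3))   # ~0.507 - better than evens with only 23 people
print(round(p_shared_birthday(50), 3))   # ~0.970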

Brian Prior - Probability and gambling
