Study Material
for
Probability and Statistics
AAOC ZC111
Distance Learning Programmes Division
Birla Institute of Technology & Science
Pilani – 333031 (Rajasthan)
July 2003
Course Developed by
M.S.Radhakrishnan
Word Processing & Typesetting by
Narendra Saini
Ashok Jitawat
Contents

INTRODUCTION, SAMPLE SPACES & EVENTS
Probability
Events
AXIOMS OF PROBABILITY
Some elementary consequences of the Axioms
Finite Sample Space (in which all outcomes are equally likely)
CONDITIONAL PROBABILITY
Independent events
Theorem on Total Probability
BAYES' THEOREM
MATHEMATICAL EXPECTATION & DECISION MAKING
RANDOM VARIABLES
Discrete Random Variables
Binomial Distribution
Cumulative Binomial Probabilities
Binomial Distribution – Sampling with replacement
Mode of a Binomial distribution
Hypergeometric Distribution (Sampling without replacement)
Binomial distribution as an approximation to the Hypergeometric Distribution
THE MEAN AND VARIANCE OF PROBABILITY DISTRIBUTIONS
The mean of a Binomial Distribution
Digression
Chebyshev's theorem
Law of large numbers
Poisson Distribution
Poisson approximation to binomial distribution
Cumulative Poisson distribution
Poisson Process
The Geometric Distribution
Multinomial Distribution
Simulation
CONTINUOUS RANDOM VARIABLES
Probability Density Function (pdf)
Normal Distribution
Normal Approximation to Binomial Distribution
Correction for Continuity
Other Probability Densities
The Uniform Distribution
Gamma Function
Properties of the Gamma Function
The Gamma Distribution
Exponential Distribution
Beta Distribution
The Log-Normal Distribution
JOINT DISTRIBUTIONS – TWO AND HIGHER DIMENSIONAL RANDOM VARIABLES
Conditional Distribution
Independence
Two-Dimensional Continuous Random Variables
Marginal and Conditional Densities
Independence
The Cumulative Distribution Function
Properties of Expectation
Sample Mean
Sample Variance
SAMPLING DISTRIBUTION
Statistical Inference
Statistics
The Sampling Distribution of the Sample Mean X̄
Inferences Concerning Means
Point Estimation
Estimation of n
Estimation of Sample proportion
Large Samples
Tests of Statistical Hypothesis
Notation
REGRESSION AND CORRELATION
Regression
Correlation
Sample Correlation Coefficient
INTRODUCTION, SAMPLE SPACES & EVENTS
Probability
Let E be a random experiment (where we ‘know’ all possible outcomes but can’t predict
what the particular outcome will be when the experiment is conducted). The set of all
possible outcomes is called a sample space for the random experiment E.
Example 1:
Let E be the random experiment:
Toss two coins and observe the sequence of heads and tails. A sample space for this
experiment could be S = {HH, HT, TH, TT}. If however we only observe the number
of heads obtained, the sample space would be S = {0, 1, 2}.
Example 2:
Let E be the random experiment:
Toss two fair dice and observe the two numbers on the top. A sample space would be
S = {(1,1), (1,2), …, (1,6), (2,1), (2,2), (2,3), …, (2,6), …, (6,1), …, (6,6)},
the set of all 36 ordered pairs. If however, we are interested only in the sum of the two numbers on the top, the
sample space could be S = {2, 3, …, 12}.
Example 3:
Let E be the random experiment:
Count the number of machines produced by a factory until a defective machine is
produced. A sample space for this experiment could be S = {1, 2, 3, …}.
Example 4:
Let E be the random experiment:
Observe the life length of a bulb produced by a factory.
Here S will be {t | t ≥ 0} = [0, ∞).
Events
An event is a subset of the sample space.
Example 5:
Suppose a balanced die is rolled and we observe the number on the top. Let A be the
event: an even number occurs.
Thus in symbols,
A = {2, 4, 6} ⊂ S = {1, 2, 3, 4, 5, 6}.
Two events are said to be mutually exclusive if they cannot occur together; that is there
is no element common between them.
In the above example if B is the event: an odd number occurs, i.e. B = {1, 3, 5}, then A and
B are mutually exclusive.
Solved Examples
Example 1:
A manufacturer of small motors is concerned with three major types of defects. If A is
the event that the shaft size is too large, B is the event that the windings are improper and
C is the event that the electrical connections are unsatisfactory, express in words what
events are represented by the following regions of the Venn diagram given below:
(a) region 2 (b) regions 1 and 3 together (c) regions 3, 5, 6 and 8 together.
Solution:
(a) Since this region is contained in A and B but not in C, it represents the event that
the shaft is too large and the windings improper but the electrical connections are
satisfactory.
(b) Since this region is common to B and C, it represents the event that the windings
are improper and the electrical connections are unsatisfactory. (c) Since this is the
entire region outside A, it represents the event that the shaft size is not too large.
Example 2:
A carton of 12 rechargeable batteries contains one that is defective. In how many ways can
the inspector choose three of the batteries and
(a) get the one that is defective
(b) not get the one that is defective.
Solution:
(a) The one defective can be chosen in one way and two good ones can be chosen in
11C2 = 55 ways. Hence one defective and two good can be chosen in 1 × 55 = 55 ways.
(b) Three good ones can be chosen in 11C3 = 165 ways.
[Venn diagram for Solved Example 1: circles A, B and C dividing the plane into regions numbered 1 to 8.]
AXIOMS OF PROBABILITY
Let E be a random experiment. Suppose to each event A, we associate a real number
P(A) satisfying the following axioms:
(i) 0 ≤ P(A) ≤ 1
(ii) P(S) = 1
(iii) If A and B are any two mutually exclusive events, then
P(A ∪ B) = P(A) + P(B)
(iv) If {A1, A2, …, An, …} is a sequence of pair-wise mutually exclusive
events, then P(A1 ∪ A2 ∪ … ∪ An ∪ …) = P(A1) + P(A2) + … + P(An) + …
We call P(A) the probability of the event A.
Axiom 1 says that the probability of an event is always a number between 0 and 1.
Axiom 2 says that the probability of the certain event S is 1. Axiom 3 says that the
probability is an additive set function.
Some elementary consequences of the Axioms
1. P(∅) = 0
Proof: S = S ∪ ∅. Now S and ∅ are disjoint.
Hence P(S) = P(S) + P(∅), so P(∅) = 0. Q.E.D.
2. If A1, A2, …, An are any n pair-wise mutually exclusive events, then
P(A1 ∪ A2 ∪ … ∪ An) = Σ (i = 1 to n) P(Ai).
Proof: By induction on n.
Def.: If A is an event, A′, the complementary event, is S − A (it is the shaded portion in the figure below).
[Figure: Venn diagram showing the event A inside S; the complement A′ is shaded.]
3. P(A′) = 1 − P(A)
Proof: S = A ∪ A′.
Now P(S) = P(A) + P(A′), as A and A′ are disjoint, so 1 = P(A) + P(A′).
Thus P(A′) = 1 − P(A). Q.E.D.
4. Probability is a subtractive set function; i.e.
if A ⊂ B, then P(B − A) = P(B) − P(A).
5. Probability is a monotone set function; i.e.
A ⊂ B ⟹ P(A) ≤ P(B).
Proof: B = A ∪ (B − A), where A and B − A are disjoint.
Thus P(B) = P(A) + P(B − A) ≥ P(A).
6. If A, B are any two events,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Proof: A ∪ B = A ∪ (A′ ∩ B), where A and A′ ∩ B are disjoint.
Hence P(A ∪ B) = P(A) + P(A′ ∩ B).
But B = (A ∩ B) ∪ (A′ ∩ B), a union of two disjoint sets, so
P(B) = P(A ∩ B) + P(A′ ∩ B), or P(A′ ∩ B) = P(B) − P(A ∩ B).
∴ P(A ∪ B) = P(A) + P(B) − P(A ∩ B). Q.E.D.
[Figure: Venn diagram of A and B showing the regions A ∩ B and A′ ∩ B.]
7. If A, B, C are any three events,
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C).
Proof:
P(A ∪ B ∪ C) = P((A ∪ B) ∪ C)
= P(A ∪ B) + P(C) − P((A ∪ B) ∩ C)
= P(A) + P(B) − P(A ∩ B) + P(C) − P((A ∩ C) ∪ (B ∩ C))
= P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).
More generally,
8. If A1, A2, …, An are any n events,
P(A1 ∪ A2 ∪ … ∪ An) = Σ (i = 1 to n) P(Ai) − Σ (1 ≤ i < j ≤ n) P(Ai ∩ Aj)
+ Σ (1 ≤ i < j < k ≤ n) P(Ai ∩ Aj ∩ Ak) − … + (−1)^(n−1) P(A1 ∩ A2 ∩ … ∩ An).
Finite Sample Space (in which all outcomes are equally likely)
Let E be a random experiment having only a finite number of outcomes.
Let all the (finite number of) outcomes be equally likely.
If S = {a1, a2, …, an} (where a1, a2, …, an are equally likely outcomes), then S = {a1} ∪ {a2} ∪ … ∪ {an}, a
union of mutually exclusive events.
Hence P(S) = P({a1}) + P({a2}) + … + P({an}).
But P({a1}) = P({a2}) = … = P({an}) = p (say).
Hence 1 = p + p + … + p (n terms), or p = 1/n.
Hence if A is a subset consisting of 'k' of these outcomes,
A = {a1, a2, …, ak}, then
P(A) = k/n = (No. of favourable outcomes)/(Total no. of outcomes).
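To make the counting rule concrete, here is a small Python sketch (illustrative, not part of the original material) that lists the 36 equally likely outcomes for a pair of dice and counts those favourable to the event "the sum is 7":

from itertools import product

outcomes = list(product(range(1, 7), repeat=2))    # all 36 equally likely pairs
favourable = [o for o in outcomes if sum(o) == 7]  # outcomes whose sum is 7
print(len(favourable), "/", len(outcomes))         # 6 / 36 = 1/6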
Example 1:
If a card is drawn from a well-shuffled pack of 52 cards, find the probability of drawing
(a) a red king. Ans: 2/52
(b) a 3, 4, 5 or 6. Ans: 16/52
(c) a black card. Ans: 1/2
(d) a red ace or a black queen. Ans: 4/52
Example 2:
When a pair of balanced dice is thrown, find the probability of getting a sum equal to
(a) 7. Ans: 6/36 = 1/6 (the total number of equally likely outcomes is 36 and the favourable number of outcomes is 6, namely (1,6), (2,5), …, (6,1)).
(b) 11. Ans: 2/36
(c) 7 or 11. Ans: 8/36
(d) 2, 3 or 12. Ans: 1/36 + 2/36 + 1/36 = 4/36.
Example 3:
10 persons in a room are wearing badges marked 1 through 10. Three persons are chosen at
random, asked to leave the room simultaneously, and their badge numbers are noted. Find
the probability that
(a) the smallest badge number is 5.
(b) the largest badge number is 5.
Solution:
(a) 3 persons can be chosen in 10C3 equally likely ways. If the smallest badge number is to be 5, the badge numbers should be 5 and any two of the 5 numbers 6, 7, 8, 9, 10. Now 2 numbers out of 5 can be chosen in 5C2 ways.
Hence the probability that the smallest badge number is 5 is 5C2 / 10C3.
(b) Ans: 4C2 / 10C3.
Example 4:
A lot consists of 10 good articles, 4 articles with minor defects and 2 with major defects.
Two articles are chosen at random. Find the probability that
(a) both are good. Ans: 10C2 / 16C2
(b) both have major defects. Ans: 2C2 / 16C2
(c) at least one is good. Ans: 1 − P(none is good) = 1 − 6C2 / 16C2
(d) exactly one is good. Ans: (10C1 × 6C1) / 16C2
(e) at most one is good. Ans: P(none is good) + P(exactly one is good) = 6C2 / 16C2 + (10C1 × 6C1) / 16C2
(f) neither has major defects. Ans: 14C2 / 16C2
(g) neither is good. Ans: 6C2 / 16C2
Example 5:
From 6 positive and 8 negative integers, 4 integers are chosen at random and multiplied.
Find the probability that their product is positive.
Solution:
The product is positive if all the 4 integers are positive or all of them are negative or two
of them are positive and the other two are negative. Hence the probability is
(6C4 + 8C4 + 6C2 × 8C2) / 14C4.
Example 6:
If, A, B are mutually exclusive events and if P(A) = 0.29, P(B) = 0.43, then
(a) P(A′) = 1 − 0.29 = 0.71
(b) P(A ∪ B) = 0.29 + 0.43 = 0.72
(c) P(A ∩ B′) = P(A) = 0.29 [as A is a subset of B′, since A and B are m.e.]
(d) P(A′ ∩ B′) = 1 − P(A ∪ B) = 1 − 0.72 = 0.28
Example 7:
P(A) = 0.35, P(B) = 0.73, P(A ∩ B) = 0.14. Find
(a) P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 0.94
(b) P(A′ ∩ B) = P(B) − P(A ∩ B) = 0.59
(c) P(A ∩ B′) = P(A) − P(A ∩ B) = 0.21
(d) P(A′ ∪ B′) = 1 − P(A ∩ B) = 1 − 0.14 = 0.86
Example 8:
A, B, C are 3 mutually exclusive events. Is this assignment of probabilities possible?
P(A) = 0.3, P(B) = 0.4, P(C) = 0.5
Ans. P(A ∪ B ∪ C) = P(A) + P(B) + P(C) >1 NOT POSSIBLE
Example 9:
Three newspapers are published in a city. A recent survey of readers indicated the
following:
20% read A 8% read A and B 2% read all
16% read B 5% read A and C
14% read C 4% read B and C
Find probability that an adult chosen at random reads
(a) none of the papers.
Ans: 1 − P(A ∪ B ∪ C) = 1 − (20 + 16 + 14 − 8 − 5 − 4 + 2)/100 = 1 − 0.35 = 0.65
(b) reads exactly one paper.
P(reading exactly one paper) = (9 + 6 + 7)/100 = 0.22
(c) reads at least A and B given he reads at least one of the papers.
P(at least reading A and B given he reads at least one of the papers)
= P(A ∩ B) / P(A ∪ B ∪ C) = 8/35
[Venn diagram (percentages): A only 9, B only 6, C only 7; A ∩ B only 6, A ∩ C only 3, B ∩ C only 2; A ∩ B ∩ C 2; none 65.]
CONDITIONAL PROBABILITY
Let A, B be two events. Suppose P(B) ≠ 0. The conditional probability of A occurring
given that B has occurred is defined as
P(A | B) = probability of A given B = P(A ∩ B) / P(B).
Similarly we define P(B | A) = P(A ∩ B) / P(A) if P(A) ≠ 0.
Hence we get the multiplication theorem
P(A ∩ B) = P(A) P(B | A)  (if P(A) ≠ 0)
         = P(B) P(A | B)  (if P(B) ≠ 0).
Example 10
A bag contains 4 red balls and 6 black balls. 2 balls are chosen at random one by one
without replacement. Find the probability that both are red.
Solution
Let A be the event that the first ball drawn is red, B the event the second ball drawn is
red. Hence the probability that both balls drawn are red is
P(A ∩ B) = P(A) × P(B | A) = (4/10) × (3/9) = 2/15.
Independent events:
Definition: We say two events A, B are independent if P(A∩ B) = P(A). P(B)
Equivalently A and B are independent if P(B | A) = P(B) or P(A | B) = P(A)
Theorem If, A, B are independent, then
(a) A′ , B are independent
(b) A, B′ are independent
(c) A′, B′ are independent
Proof: B = (A ∩ B) ∪ (A′ ∩ B), a union of mutually exclusive events.
P(B) = P(A ∩ B) + P(A′ ∩ B)
P(A′ ∩ B) = P(B) − P(A ∩ B)
         = P(B) − P(A) P(B)
         = P(B) [1 − P(A)]
         = P(B) P(A′)
∴ A′, B are independent.
By the same reasoning, A, B′ are independent, and applying the argument once more, A′ and B′ are independent.
Example 11
Find the probability of getting 8 heads in a row in 8 tosses of a fair coin.
Solution
If Ai is the event of getting a head in the ith toss, A1, A2, …, A8 are independent and
P(Ai) = 1/2 for all i. Hence P(getting all heads) =
P(A1) P(A2) … P(A8) = (1/2)^8.
Example 12
It is found that in manufacturing a certain article, defects of one type occur with
probability 0.1 and defects of other type occur with probability 0.05. Assume
independence between the two types of defects. Find the probability that an article chosen
at random has exactly one type of defect given that it is defective.
Let A be the event that the article has exactly one type of defect.
Let B be the event that the article is defective.
Required: P(A | B) = P(A ∩ B) / P(B).
P(B) = P(D ∪ E), where D is the event it has a type-one defect and E is the event it has a type-two defect
= P(D) + P(E) − P(D ∩ E) = 0.1 + 0.05 − (0.1)(0.05) = 0.145
P(A ∩ B) = P(the article has exactly one type of defect)
= P(D) + P(E) − 2 P(D ∩ E) = 0.1 + 0.05 − 2(0.1)(0.05) = 0.14
∴ Probability = 0.14 / 0.145.
[Note: If A and B are two events, the probability that exactly one of them occurs
is P(A) + P(B) − 2 P(A ∩ B).]
Example 13
An electronic system has 2 subsystems A and B. It is known that
P (A fails) = 0.2
P (B fails alone) = 0.15
P (A and B fail) = 0.15
Find (a) P (A fails | B has failed)
(b) P (A fails alone)
Solution
(a) P(A fails | B has failed) = P(A and B fail) / P(B failed) = 0.15 / 0.30 = 1/2
(b) P(A fails alone) = P(A fails) − P(A and B fail) = 0.20 − 0.15 = 0.05
Example 14
A binary number is a number having digits 0 and 1. Suppose a binary number is made up
of ‘n’ digits. Suppose the probability of forming an incorrect binary digit is p. Assume
independence between errors. What is the probability of forming an incorrect binary
number?
Ans: 1 − P(forming a correct number) = 1 − (1 − p)^n.
Example 15
A question paper consists of 5 Multiple choice questions each of which has 4 choices (of
which only one is correct). If a student answers all the five questions randomly, find the
probability that he answers all questions correctly.
Ans: (1/4)^5.
Theorem on Total Probability
Let B1, B2, …, Bn be n mutually exclusive events of which one must occur. If A is any
other event, then
P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bn)
     = Σ (i = 1 to n) P(Bi) P(A | Bi)
(For a proof, see your text book.)
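As a quick illustration of the theorem, the following Python sketch (illustrative; it simply uses the numbers of Example 17 below) evaluates P(A) = Σ P(Bi) P(A | Bi):

p_B   = [0.20, 0.20, 0.60]   # P(B1), P(B2), P(B3): car rented from D, E, F
p_A_B = [0.10, 0.12, 0.04]   # P(A | Bi): a car from that agency has bad tires
p_A = sum(pb * pa for pb, pa in zip(p_B, p_A_B))
print(p_A)                   # 0.02 + 0.024 + 0.024 = 0.068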
Example 16
There are 2 urns. The first one has 4 red balls and 6 black balls. The second has 5 red
balls and 4 black balls. A ball is chosen at random from the 1st urn and put in the 2nd. Now a
ball is drawn at random from the 2nd urn. Find the probability that it is red.
Solution:
Let B1 be the event that the first ball drawn is red and B2 be the event that the first ball
drawn is black. Let A be the event that the second ball drawn is red. By the theorem on
total probability,
P(A) = P(B1) P(A | B1) + P(B2) P(A | B2) = (4/10) × (6/10) + (6/10) × (5/10) = 54/100 = 0.54.
Example 17:
A consulting firm rents cars from three agencies D, E, F. 20% of the cars are rented from
D, 20% from E and the remaining 60% from F. If 10% of cars rented from D, 12% of
cars rented from E, 4% of cars rented from F have bad tires, find the probability that a
car rented from the consulting firm will have bad tires.
Ans. (0.2) (0.1) + (0.2) (0.12) + (0.6) (0.04)
Example 18:
A bolt factory has three divisions B1, B2, B3 that manufacture bolts. 25% of output is
from B1, 35% from B2 and 40% from B3. 5% of the bolts manufactured by B1 are
defective, 4% of the bolts manufactured by B2 are defective and 2% of the bolts
manufactured by B3 are defective. Find the probability that a bolt chosen at random from
the factory is defective.
Ans: (25/100) × (5/100) + (35/100) × (4/100) + (40/100) × (2/100)
BAYES’ THEOREM
Let B1, B2, ……….Bn be n mutually exclusive events of which one of them must occur.
If A is any event, then
P(Bk | A) = P(Bk ∩ A) / P(A) = P(Bk) P(A | Bk) / Σ (i = 1 to n) P(Bi) P(A | Bi)
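The theorem translates directly into a few lines of code. The sketch below is illustrative (the function name bayes is my own); it uses the numbers of Example 19, which follows:

def bayes(prior, likelihood, k):
    """P(Bk | A) from the priors P(Bi) and the likelihoods P(A | Bi)."""
    total = sum(p * l for p, l in zip(prior, likelihood))   # P(A), by total probability
    return prior[k] * likelihood[k] / total

# Example 19 below: P(B1) = 0.7, P(A|B1) = 0.3, P(B2) = 0.3, P(A|B2) = 0.8
print(bayes([0.7, 0.3], [0.3, 0.8], 0))   # 0.21 / 0.45 = 0.466...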
Example 19
Miss ‘X’ is fond of seeing films. The probability that she sees a film on the day before
the test is 0.7. Miss X is any way good at studies. The probability that she maxes the test
is 0.3 if she sees the film on the day before the test and the corresponding probability is
0.8 if she does not see the film. If Miss ‘X’ maxed the test, find the probability that she
saw the film on the day before the test.
Solution
Let B1 be the event that Miss X saw the film before the test and let B2 be the
complementary event. Let A be the event that she maxed the test.
Required: P(B1 | A) = P(B1) P(A | B1) / [P(B1) P(A | B1) + P(B2) P(A | B2)]
= (0.7 × 0.3) / (0.7 × 0.3 + 0.3 × 0.8).
Example 20
At an electronics firm, it is known from past experience that the probability a new worker
who attended the company’s training program meets the production quota is 0.86. The
corresponding probability for a new worker who did not attend the training program is
0.35. It is also known that 80% of all new workers attend the company’s training
program. Find probability that a new worker who met the production quota would have
attended the company’s training programme.
Solution
Let B1 be the event that a new worker attended the company’s training programme. Let
B2 be the complementary event, namely a new worker did not attend the training
programme. Let A be the event that a new worker met the production quota. Then we
want P(B1 | A) = (0.8 × 0.86) / (0.8 × 0.86 + 0.2 × 0.35).
Example 21
A printing machine can print any one of n letters L1, L2,……….Ln. It is operated by
electrical impulses, each letter being produced by a different impulse. Assume that there
is a constant probability p that any impulse prints the letter it is meant to print. Also
assume independence. One of the impulses is chosen at random and fed into the machine
twice. Both times, the letter L1 was printed. Find the probability that the impulse chosen
was meant to print the letter L1.
Solution:
Let B1 be the event that the impulse chosen was meant to print the letter L1. Let B2 be the
complementary event. Let A be the event that both times the letter L1 was printed.
P(B1) = 1/n and P(A | B1) = p². Now the probability that an impulse prints a wrong letter is (1 − p). Since there are n − 1 ways of printing a wrong letter, P(A | B2) = ((1 − p)/(n − 1))². Hence
P(B1 | A) = P(B1) P(A | B1) / [P(B1) P(A | B1) + P(B2) P(A | B2)]
= (p²/n) / [p²/n + ((n − 1)/n) ((1 − p)/(n − 1))²]
= p² / [p² + (1 − p)²/(n − 1)].
This is the required probability.
Miscellaneous problems
1 (a). Suppose the digits 1, 2, 3 are written in a random order. Find the probability that at
least one digit occupies its proper place.
Solution
There are 3! = 6 ways of arranging 3 digits (see below), out of which in 4
arrangements at least one digit occupies its proper place. Hence the probability is
4/3! = 4/6.
123 213 312
132 231 321
(Remark: An arrangement like 231, where no digit occupies its proper place, is
called a derangement.)
(b) Same as (a) but with 4 digits 1, 2, 3, 4. Ans: 15/24. (Try proving this.)
Solution
Let A1 be the event that the 1st digit occupies its proper place,
A2 the event that the 2nd digit occupies its proper place,
A3 the event that the 3rd digit occupies its proper place,
A4 the event that the 4th digit occupies its proper place.
P(at least one digit occupies its proper place)
= P(A1 ∪ A2 ∪ A3 ∪ A4)
= P(A1) + P(A2) + P(A3) + P(A4)
(there are 4C1 terms, each with the same probability)
− P(A1 ∩ A2) − P(A1 ∩ A3) − … − P(A3 ∩ A4)
(there are 4C2 terms, each with the same probability)
+ P(A1 ∩ A2 ∩ A3) + P(A1 ∩ A2 ∩ A4) + … + P(A2 ∩ A3 ∩ A4)
(there are 4C3 terms, each with the same probability)
− P(A1 ∩ A2 ∩ A3 ∩ A4)
= 4C1 (3!/4!) − 4C2 (2!/4!) + 4C3 (1!/4!) − 4C4 (0!/4!)
= 1 − 1/2 + 1/6 − 1/24
= (24 − 12 + 4 − 1)/24 = 15/24.
(c) Same as (a) but with n digits.
Solution
Let A1 be the event that the 1st digit occupies its proper place,
A2 the event that the 2nd digit occupies its proper place,
……………
An the event that the nth digit occupies its proper place.
P(at least one digit occupies its proper place)
= P(A1 ∪ A2 ∪ … ∪ An)
= nC1 (n−1)!/n! − nC2 (n−2)!/n! + nC3 (n−3)!/n! − … + (−1)^(n−1) (1/n!)
= 1 − 1/2! + 1/3! − 1/4! + … + (−1)^(n−1) (1/n!)
≈ 1 − e^(−1) (for n large).
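A short Python sketch (illustrative only) of the inclusion-exclusion series above, checking the n = 3 and n = 4 answers and the limit 1 − e⁻¹:

from math import factorial, e

def p_at_least_one_fixed(n):
    # 1 - 1/2! + 1/3! - ... + (-1)^(n-1)/n!
    return sum((-1) ** (k - 1) / factorial(k) for k in range(1, n + 1))

print(p_at_least_one_fixed(3))              # 4/6 = 0.666...
print(p_at_least_one_fixed(4))              # 15/24 = 0.625
print(p_at_least_one_fixed(50), 1 - 1 / e)  # both close to 0.63212 for large n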
2. In a party there are ‘n’ married couples. If each male chooses at random a
female for dancing, find the probability that no man chooses his wife.
Ans: 1 − [1 − 1/2! + 1/3! − 1/4! + … + (−1)^(n−1) (1/n!)].
3. A and B play the following game. They throw alternately a pair of dice.
Whosoever gets sum of the two numbers on the top as seven wins the game
and the game stops. Suppose A starts the game. Find the probability (a) A
wins the game (b) B wins the game.
Solution
A wins the game if he gets seven in the 1st throw, or in the 3rd throw, or in the
5th throw, or …. Hence
P(A wins) = 1/6 + (5/6)(5/6)(1/6) + (5/6)(5/6)(5/6)(5/6)(1/6) + …
= (1/6) / (1 − 25/36) = 6/11.
P(B wins) = complementary probability = 5/11.
4. Birthday Problem
There are n persons in a room. Assume that nobody is born on 29th Feb.
Assume that any one birthday is as likely as any other birthday. Find the
probability that no two persons will have the same birthday.
Solution
If n > 365, at least two will have the same birthday and hence the probability
that no two will have the same birthday is 0.
If n ≤ 365, the desired probability is
[365 × 364 × … × (365 − n + 1)] / (365)^n.
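The product above is easy to evaluate numerically; the following illustrative Python sketch does so (the famous consequence is that with only 23 persons a shared birthday is already more likely than not):

def p_all_distinct(n):
    if n > 365:
        return 0.0
    p = 1.0
    for k in range(n):          # multiply (365 - k)/365 for k = 0, 1, ..., n-1
        p *= (365 - k) / 365
    return p

print(p_all_distinct(23))       # about 0.4927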
5. A die is rolled until all the faces have appeared on top.
(a) What is the probability that exactly 6 throws are needed? Ans: 6!/6^6
(b) What is the probability that exactly n throws are needed? (n > 6)
6. Polya's urn problem
An urn contains g green balls and r red balls. A ball is chosen at random and
its colour is noted. Then the ball is returned to the urn and c more balls of the same
colour are added. Now a ball is drawn. Its colour is noted and the ball is
replaced. This process is repeated.
(a) Find the probability that the 1st ball drawn is green.
Ans: g/(g + r)
(b) Find the probability that the 2nd ball drawn is green.
Ans: [g/(g + r)] × [(g + c)/(g + r + c)] + [r/(g + r)] × [g/(g + r + c)] = g/(g + r)
(c) Find the probability that the nth ball drawn is green.
The surprising answer is g/(g + r).
7. There are n urns and each urn contains a white and b red balls. A ball is
chosen from Urn 1 and put into Urn 2. Now a ball is chosen at random from
Urn 2 and put into Urn 3, and this is continued. Finally a ball is drawn from Urn n.
Find the probability that it is white.
Solution
Let pr = probability that the ball drawn from Urn r is white. Then
pr = p(r−1) × (a + 1)/(a + b + 1) + (1 − p(r−1)) × a/(a + b + 1),  r = 2, 3, …, n.
This is a recurrence relation for pr. Noting that p1 = a/(a + b), we can find pn.
MATHEMATICAL EXPECTATION & DECISION MAKING
Suppose we roll a die n times. What is the average of the n numbers that appear on the
top?
Suppose 1 occurs on the top n1 times
Suppose 2 occurs on the top n2 times
Suppose 3 occurs on the top n3 times
Suppose 4 occurs on the top n4 times
Suppose 5 occurs on the top n5 times
Suppose 6 occurs on the top n6 times
Total of the n numbers on the top = 1 × n1 + 2 × n2 + … + 6 × n6.
∴ Average of the n numbers
= (1 × n1 + 2 × n2 + … + 6 × n6)/n
= 1 × (n1/n) + 2 × (n2/n) + … + 6 × (n6/n).
Here clearly n1, n2, …, n6 are unknown. But by the relative frequency definition of
probability, we may approximate n1/n by P(getting 1 on the top) = 1/6, n2/n by
P(getting 2 on the top) = 1/6, and so on. So we can 'expect' the average of the n
numbers to be 7/2 = 3.5. We call this the Mathematical Expectation of the number
on the top.
Definition
Let E be a random experiment with n outcomes a1, a2, …, an. Suppose P({a1}) = p1,
P({a2}) = p2, …, P({an}) = pn. Then we define the mathematical expectation as
a1 × p1 + a2 × p2 + … + an × pn.
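For instance, the following illustrative Python lines compute the expectation of the number on top of a fair die discussed above:

values = [1, 2, 3, 4, 5, 6]
probs  = [1 / 6] * 6
expectation = sum(a * p for a, p in zip(values, probs))
print(expectation)   # 3.5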
Problems
1. If a service club sells 4000 raffle tickets for a cash prize of $800, what is the
mathematical expectation of a person who buys one of these tickets?
Solution: 800 × (1/4000) + 0 × (3999/4000) = 1/5 = 0.2 (i.e., 20 cents).
2. A charitable organization raises funds by selling 2000 raffle tickets for a 1st prize
worth $5000 and a second prize worth $100. What is the mathematical expectation of a
person who buys one of the tickets?
Solution: 5000 × (1/2000) + 100 × (1/2000) + 0 × (1998/2000) = $2.55.
3. A game between 2 players is called fair if each player has the same mathematical
expectation. If someone gives us $5 whenever we roll a 1 or a 2 with a balanced
die, what must we pay him when we roll a 3, 4, 5 or 6 to make the game fair?
Solution: If we pay $x when we roll a 3, 4, 5, or 6, for the game to be fair,
x × (4/6) = 5 × (2/6), or x = 2.5. That is, we must pay $2.50.
4. Gambler's Ruin
A and B are betting on repeated flips of a balanced coin. At the beginning, A has
m dollars and B has n dollars. After each flip the loser pays the winner 1 dollar
and the game stops when one of them is ruined. Find the probability that A will win
B's n dollars before he loses his m dollars.
Solution:
Let p be the probability that A wins (so that 1 − p is the probability that B wins).
Since the game is fair, A's mathematical expectation = B's mathematical expectation.
Thus n × p + 0 × (1 − p) = m × (1 − p) + 0 × p, or p = m/(m + n).
5. An importer is offered a shipment of machines for $140,000. The probabilities that
he will sell them for $180,000, $170,000 or $150,000 are respectively 0.32,
0.55, and 0.13. What is his expected profit?
Solution: Expected profit
= 40,000 × 0.32 + 30,000 × 0.55 + 10,000 × 0.13
= $30,600.
6. The manufacturer of a new battery additive has to decide whether to sell her
product for $0.80 a can, or for $1.20 a can with a 'double your money back if not
satisfied' guarantee. How does she feel about the chances that a person will ask
for double his/her money back if
(a) she decides to sell the product for $0.80
(b) she decides to sell the product for $1.20
(c) she cannot make up her mind?
Solution: Let p be the probability that a person will ask for double his money back.
In the 1st case, she gets a fixed amount of $0.80 a can.
In the 2nd case, she expects to get for each can
(1.20)(1 − p) + (−1.20)(p) = 1.20 − 2.40 p.
(a) happens if 0.80 > 1.20 − 2.40 p, i.e. p > 1/6.
(b) happens if p < 1/6.
(c) happens if p = 1/6.
7. A manufacturer buys an item for $2.10 and sells it for $4.50. The probabilities for
a demand of 0, 1, 2, 3, 4, "5 or more" items are 0.05, 0.15, 0.30, 0.25, 0.15, 0.10
respectively. How many items must he stock to maximize his expected profit?
No. of items stocked   No. sold (with prob.)                      Expected profit
0                      0 (1)                                       0
1                      0 (0.05), 1 (0.95)                          0 × 0.05 + 4.5 × 0.95 − 2.1 = 2.175
2                      0 (0.05), 1 (0.15), 2 (0.80)                0 × 0.05 + 4.5 × 0.15 + 9 × 0.80 − 4.2 = 3.675
3                      0 (0.05), 1 (0.15), 2 (0.30), 3 (0.50)      0 × 0.05 + 4.5 × 0.15 + 9 × 0.30 + 13.5 × 0.50 − 6.3 = 3.825
4                                                                  2.85
5                                                                  0.525
6                                                                  0.45
Hence he must stock 3 items to maximize his expected profit.
8. A contractor has to choose between 2 jobs. The 1st job promises a profit of
$240,000 with probability 0.75 and a loss of $60,000 with probability 0.25. The
2nd job promises a profit of $360,000 with probability 0.5 and a loss of $90,000
with probability 0.5.
(a) Which job should the contractor choose to maximize his expected profit?
i. Expected profit for job 1 = 240,000 × 3/4 − 60,000 × 1/4 = $165,000
ii. Expected profit for job 2 = 360,000 × 1/2 − 90,000 × 1/2 = $135,000
Go in for job 1.
(b) What job would the contractor probably choose if her business is in bad
shape and she goes broke unless she makes a profit of $300,000 on her
next job?
Ans: She takes job 2, since only job 2 offers a chance of a profit of at least $300,000.
RANDOM VARIABLES
Let E be a random experiment. A random variable (r.v) X is a function that associates to
each outcome s, a unique real number X (s).
Example 1
Let E be the random experiment of tossing a fair coin 3 times. We see that there are
2³ = 8 outcomes TTT, HTT, THT, TTH, HHT, HTH, THH, HHH, all of which are
equally likely. Let X be the random variable that 'counts' the number of heads obtained.
Thus X can take only 4 values 0, 1, 2, 3. We note that
P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, P(X = 3) = 1/8. This is called the
probability distribution of the rv X. Thus the probability distribution of a rv X is the
listing of the probabilities with which X takes all its values.
Example 2
Let E be the random experiment of rolling a pair of balanced dice. There are 36 possible
equally likely outcomes, namely (1,1), (1,2), …, (6,6). Let X be the rv that gives the sum
of the two numbers on the top. Hence X takes 11 values, namely 2, 3, …, 12. We note that the
probability distribution of X is
P(X = 2) = P(X = 12) = 1/36,  P(X = 3) = P(X = 11) = 2/36,
P(X = 4) = P(X = 10) = 3/36,  P(X = 5) = P(X = 9) = 4/36,
P(X = 6) = P(X = 8) = 5/36,  P(X = 7) = 6/36 = 1/6.
Example 3
Let E be the random experiment of rolling a die till a 6 appears on the top. Let X be the
no of rolls needed to get the “first” six. Thus X can take values 1,2,3…… Here X takes
an infinite number of values. So it is not possible to list all the probabilities with which X
takes its values. But we can give a formula.
P(X = x) = (5/6)^(x−1) (1/6),  x = 1, 2, ….
(Justification: X = x means the first (x − 1) rolls gave a number other than 6 and
the xth roll gave the first 6. Hence
P(X = x) = (5/6) × (5/6) × … × (5/6) [(x − 1) times] × (1/6) = (5/6)^(x−1) (1/6).)
Discrete Random Variables
We say X is a discrete rv if it can take only a finite number of values (as in Examples 1, 2
above) or a 'countably' infinite number of values (as in Example 3).
On the other hand, the annual rainfall in a city, the life length of an electronic device, and the
diameter of washers produced by a factory are all continuous random variables in the
sense that they can take (theoretically at least) all values in an 'interval' of the x-axis. We
shall discuss continuous rvs a little later.
Probability distribution of a Discrete RV
Let X be a discrete rv with values x1, x2, ….
Let f(xi) = P(X = xi)  (i = 1, 2, …).
We say that {f(xi), i = 1, 2, …} is the probability distribution of the rv X.
Properties of the probability distribution
(i) f(xi) ≥ 0 for all i = 1, 2, …
(ii) Σ (over i) f(xi) = 1
The first condition follows from the fact that a probability is always ≥ 0. The second
condition follows from the fact that the probability of the certain event is 1.
Example 4
Determine whether the following can be the probability distribution of a rv which can
take only the 4 values 1, 2, 3 and 4.
(a) f(1) = f(2) = f(3) = f(4) = 0.26.
No, as the sum of all the "probabilities" is > 1.
(b) f(1) = 0.15, f(2) = 0.28, f(3) = 0.29, f(4) = 0.28.
Yes, as these are all ≥ 0 and add up to 1.
(c) f(x) = (x + 1)/16, x = 1, 2, 3, 4.
No, as the sum of all the probabilities is < 1.
Binomial Distribution
Let E be a random experiment having only 2 outcomes, say ‘success’ and ‘failure’.
Suppose that P(success) = p and so P(failure) = q (=1-p). Consider n independent
repetitions of E (This means the outcome in any one repetition is not dependent upon the
outcome in any other repetition). We also make the important assumption that P(success)
= p remains the same for all such independent repetitions of E. Let X be the rv that
’counts’ the number of successes obtained in n such independent repetitions of E. Clearly
X is a discrete rv that can take n + 1 values, namely 0, 1, 2, …, n. We note that there are
2^n outcomes, each of which is a 'string' of n letters, each an S or an F (if n = 3, they are
FFF, SFF, FSF, FFS, SSF, SFS, FSS, SSS). X = x means in any such outcome there are
x successes and (n − x) failures in some order. One such outcome is SS…S FF…F (x S's
followed by (n − x) F's). Since all the repetitions are independent, the probability of this
outcome is p^x q^(n−x). Exactly the same probability is associated with any other outcome
for which X = x. But x successes can occur out of n repetitions in nCx mutually exclusive
ways. Hence
P(X = x) = nCx p^x q^(n−x),  x = 0, 1, …, n.
We say X has a Binomial distribution with parameters n (the number of repetitions)
and p (the probability of success in any one repetition).
We denote P(X = x) by b(x; n, p) to show its dependence on x, n and p. The letter 'b'
stands for binomial.
Since all the above (n + 1) probabilities are the (n + 1) terms in the expansion of the
binomial (q + p)^n, X is said to have a binomial distribution. We at once see that the sum
of all the binomial probabilities is (q + p)^n = 1^n = 1.
The independent repetitions are usually referred to as "Bernoulli" trials. We note that
b(x; n, p) = b(n − x; n, q)
(LHS = probability of getting x successes in n Bernoulli trials = probability of getting n − x failures in
n Bernoulli trials = R.H.S.)
Cumulative Binomial Probabilities
Let X have a binomial distribution with parameters n and p.
P(X ≤ x) = P(X = 0) + P(X = 1) + … + P(X = x) = Σ (k = 0 to x) b(k; n, p)
is denoted by B(x; n, p) and is called the cumulative binomial distribution function. This
is tabulated in Table 1 of your text book. We note that
b(x; n, p) = P(X = x) = P(X ≤ x) − P(X ≤ x − 1) = B(x; n, p) − B(x − 1; n, p).
Thus b(9; 12, 0.60) = B(9; 12, 0.60) − B(8; 12, 0.60)
= 0.9166 − 0.7747 = 0.1419.
(You can verify this by directly calculating b(9; 12, 0.60).)
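For readers who want to check table values directly, here is an illustrative Python sketch of b(x; n, p) and B(x; n, p) using only the standard library; it reproduces b(9; 12, 0.60) ≈ 0.1419:

from math import comb

def b(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def B(x, n, p):
    return sum(b(k, n, p) for k in range(x + 1))

print(b(9, 12, 0.60))                   # about 0.1419
print(B(9, 12, 0.60) - B(8, 12, 0.60))  # same value, via the cumulative function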
Example 5 (Exercise 4.15 of your book)
During one stage in the manufacture of integrated circuit chips, a coating must be
applied. If 70% of the chips receive a thick enough coating find the probability that
among 15 chips.
(a) At least 12 will have thick enough coatings.
(b) At most 6 will have thick enough coatings.
(c) Exactly 10 will have thick enough coatings.
Solution
Among 15 chips, let X be the number of chips that will have thick enough coatings.
Hence X is a rv having Binomial distribution with parameters n =15 and p = 0.70.
(a) P(X ≥ 12) = 1 − P(X ≤ 11)
= 1 − B(11; 15, 0.70)
= 1 − 0.7031 = 0.2969
(b) P(X ≤ 6) = B(6; 15, 0.70)
= 0.0152
(c) P(X = 10) = B(10; 15, 0.70) − B(9; 15, 0.70)
= 0.4845 − 0.2784 = 0.2061
Example 6 (Exercise 4.19 of your text book)
A food processor claims that at most 10% of her jars of instant coffee contain less coffee
than printed on the label. To test this claim, 16 jars are randomly selected and contents
weighed. Her claim is accepted if fewer than 3 of the 16 jars contain less coffee (note that
10% of 16 = 1.6 and rounds to 2). Find the probability that the food processor’s claim
will be accepted if the actual percent of the jars containing less coffee is
(a) 5% (b) 10% (c) 15% (d) 20%
Solution:
Let X be the number of jars that contain less coffee (than printed on the label) (among the
16 jars randomly chosen. Thus X is a random variable having a Binomial distribution
with parameters n = 16 and p (the probability of "success", i.e. the probability that a jar chosen at
random will have less coffee).
(a) Here p = 5% = 0.05.
Hence P(claim is accepted) = P(X ≤ 2) = B(2; 16, 0.05) = 0.9571.
(b) Here p = 10% = 0.10.
Hence P(claim is accepted) = B(2; 16, 0.10) = 0.7892.
(c) Here p = 15% = 0.15.
Hence P(claim is accepted) = B(2; 16, 0.15) = 0.5614.
(d) Here p = 20% = 0.20.
Hence P(claim is accepted) = B(2; 16, 0.20) = 0.3518.
Binomial Distribution – Sampling with replacement
Suppose there is an urn containing 10 marbles of which 4 are white and the rest are black.
Suppose 5 marbles are chosen with replacement. Let X be the rv that counts the no of
white marbles drawn. Thus X = 0,1,2,3,4 or 5 (Remember that we replace each marble in
the urn before drawing the next one. Hence we can draw 5 white marbles)
P("Success") = P(drawing a white marble in any one of the 5 draws) = 4/10 (remember
we draw with replacement).
Thus X has a binomial distribution with parameters n = 5 and p = 4/10.
Hence P(X = x) = b(x; 5, 4/10).
Mode of a Binomial distribution
We say 0x is the mode of the Binomial distribution with parameters n and p if
( )0xXP = is the greatest. From the binomial tables given in the book we can easily see
that
When n = 10, p = 1/2, P(X = 5) is the greatest, i.e. 5 is the mode.
Fact
b(x + 1; n, p) / b(x; n, p) = [(n − x)/(x + 1)] × [p/(1 − p)],
which is > 1 if x < np − (1 − p), = 1 if x = np − (1 − p), and < 1 if x > np − (1 − p).
Thus so long as x < np − (1 − p) the binomial probabilities increase, and if x > np − (1 − p) they
decrease. Hence if np − (1 − p) = x0 is an integer, then the modes are x0 and x0 + 1. If np − (1 − p)
is not an integer and x0 = the smallest integer ≥ np − (1 − p), the mode is x0.
Hypergeometric Distribution (Sampling without replacement)
An urn contains 10 marbles of which 4 are white. 5 marbles are chosen at random
without replacement. Let X be the rv that counts the number of white marbles drawn.
Thus X can take 5 values, namely 0, 1, 2, 3, 4. What is P(X = x)? Now out of 10 marbles, 5
can be chosen in 10C5 equally likely ways, out of which there will be 4Cx × 6C(5−x) ways of
drawing x white marbles (and so 5 − x red marbles). (Reason: out of 4 white marbles, x can
be chosen in 4Cx ways and out of 6 red marbles, 5 − x can be chosen in 6C(5−x) ways.)
Hence P(X = x) = [4Cx × 6C(5−x)] / 10C5,  x = 0, 1, 2, 3, 4.
We generalize the above result.
A box contains N marbles out of which a are white. n marbles are chosen without
replacement. Let X be the random variable that counts the number of white marbles
drawn. X can take the values 0,1,2……. n.
P(X = x) = [aCx × (N−a)C(n−x)] / NCn,  x = 0, 1, 2, …, n.
(Note: x must be less than or equal to a, and n − x must be less than or equal to N − a.)
We say the rv X has a hypergeometric distribution with parameters n, a and N. We denote
P(X = x) by h(x; n, a, N).
Example 7 (Exercise 4.22 of your text book)
Among the 12 solar collectors on display, 9 are flat plate collectors and the other three
are concentrating collectors. If a person chooses at random 4 collectors, find the probability that
3 are flat plate ones.
Ans: h(3; 4, 9, 12) = [9C3 × 3C1] / 12C4
Example 8 (Exercise 4.24 of your text book)
If 6 of 18 new buildings in a city violate the building code, what is the probability that a
building inspector, who randomly selects 4 of the new buildings for inspection, will catch
(a) none of the new buildings that violate the building code?
Ans: h(0; 4, 6, 18) = [6C0 × 12C4] / 18C4
(b) one of the new buildings that violate the building code?
Ans: h(1; 4, 6, 18) = [6C1 × 12C3] / 18C4
(c) two of the new buildings that violate the building code?
Ans: h(2; 4, 6, 18) = [6C2 × 12C2] / 18C4
(d) at least three of the new buildings that violate the building code?
Ans: h(3; 4, 6, 18) + h(4; 4, 6, 18)
(Note: We choose 4 buildings out of 18 without replacement. Hence the hypergeometric
distribution is appropriate.)
Binomial distribution as an approximation to the Hypergeometric Distribution
We can show that h(x; n, a, N) → b(x; n, p) as N → ∞
(where p = a/N = probability of a "success"). Hence if N is large, the hypergeometric
probability h(x; n, a, N) can be approximated by the binomial probability
b(x; n, p) where p = a/N.
Example 9 (exercise 4.26 of your text)
A shipment of 120 burglar alarms contains 5 that are defective. If 3 of these alarms are
randomly selected and shipped to a customer, find the probability that the customer will
get one defective alarm.
(a) By using the hypergemetric distribution
(b) By approximating the hypergeometric probability by a binomial probability.
Solution
Here N = 120 (large!), a = 5, n = 3, x = 1.
(a) Required probability = h(1; 3, 5, 120) = [5C1 × 115C2] / 120C3 = (5 × 6555)/280840 = 0.1167
(b) h(1; 3, 5, 120) ≈ b(1; 3, 5/120) = 3C1 × (5/120) × (1 − 5/120)² = 0.1148
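An illustrative Python sketch of h(x; n, a, N), reproducing the two numbers of Example 9 (the exact hypergeometric value and its binomial approximation):

from math import comb

def h(x, n, a, N):
    return comb(a, x) * comb(N - a, n - x) / comb(N, n)

def b(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

print(h(1, 3, 5, 120))    # about 0.1167
print(b(1, 3, 5 / 120))   # binomial approximation, about 0.1148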
Example 10 (Exercise 4.27 of your text)
Among the 300 employees of a company, 240 are union members, while the others are
not. If 8 of the employees are chosen by lot to serve on the committee which
administrates the provident fund, find the prob that 5 of them will be union members
while the others are not.
(a) Using the hypergeometric distribution
(b) Using the binomial approximation
Solution
Here N = 300, a = 240, n = 8, x = 5.
(a) h(5; 8, 240, 300)
(b) ≈ b(5; 8, 240/300)
THE MEAN AND VARIANCE OF PROBABILITY DISTRIBUTIONS
We know that the equation of a line can be written as y = mx + c. Here m is the slope and
c is the y-intercept. Different m, c give different lines. Thus m and c characterize a line.
Similarly we define certain numbers that characterize a probability distribution.
The mean of a probability distribution is simply the mathematical expectation of the
corresponding r.v. If a rv X takes on the values x1, x2, … with probabilities
f(x1), f(x2), …, its mathematical expectation or expected value is
x1 f(x1) + x2 f(x2) + … = Σ (over i) xi P(X = xi) = Σ (value × probability).
We use the symbol μ to denote the mean of X.
Thus μ = E(X) = Σ xi P(X = xi)  (summation over all xi in the range of X).
Example 11
Suppose X is a rv having the probability distribution
X      1    2    3
Prob  1/2  1/3  1/6
Hence the mean μ of the probability distribution (of X) is
μ = 1 × 1/2 + 2 × 1/3 + 3 × 1/6 = 5/3.
Example 12
Let X be the rv having the distribution
X      0    1
Prob   q    p
where q = 1 − p. Thus μ = 0 × q + 1 × p = p.
The mean of a Binomial Distribution
Suppose X is a rv having Binomial distribution with parameters n and p. Then
Mean of X = μ = np.
(Read the proof on pages 107-108 of your text book.)
The mean of a Hypergeometric Distribution
If X is a rv having a hypergeometric distribution with parameters n, a, N, then μ = n(a/N).
Digression
The mean of a rv X gives the "average" of the values taken by the rv X. Thus "the
average marks in a test is 40" means the students would have got marks less than 40
and greater than 40, but it averages out to be 40. But we do not get an idea about the
spread (deviation from the mean) of the marks. This spread is measured by the
variance: informally speaking, the average of the squares of the deviations from the mean.
Variance of the probability distribution of X is defined as the expected value of (X − μ)²:
Variance of X = σ² = Σ (xi in the range of X) (xi − μ)² P(X = xi).
Note that the R.H.S. is always ≥ 0 (as it is the sum of non-negative numbers).
The positive square root σ of σ² is called the standard deviation of X and has the
same units as X and μ.
Example 13
For the rv X having the probability distribution given in Example 11, the variance is
σ² = (1 − 5/3)² × 1/2 + (2 − 5/3)² × 1/3 + (3 − 5/3)² × 1/6
   = (4/9) × 1/2 + (1/9) × 1/3 + (16/9) × 1/6 = 5/9.
We could also have used the equivalent formula
σ² = E(X²) − μ².
Here E(X²) = 1² × 1/2 + 2² × 1/3 + 3² × 1/6 = 10/3.
∴ σ² = 10/3 − 25/9 = 5/9.
Example 14
For the probability distribution of Example 12,
E(X²) = 0² × q + 1² × p = p
∴ σ² = p − p² = p(1 − p) = pq.
Variance of the Binomial Distribution
σ² = npq
Variance of the Hypergeometric Distribution
σ² = n (a/N) (1 − a/N) (N − n)/(N − 1).
CHEBYSHEV'S THEOREM
Suppose X is a rv with mean μ and variance σ². Chebyshev's theorem states that: if k
is a constant > 0,
P(|X − μ| ≥ kσ) ≤ 1/k².
In words, the probability of getting a value which deviates from its mean μ by at least kσ is at
most 1/k².
Note: Chebyshev's theorem gives us an upper bound on the probability of an event. Mostly it is
of theoretical interest.
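As a numerical illustration (not in the original text), the sketch below computes the exact binomial probability appearing in Example 15, which follows, and compares it with the Chebyshev bound 1/k² = 0.04; the bound is valid but very loose:

from math import comb, sqrt

n, p = 405, 1 / 6
mu, sigma = n * p, sqrt(n * p * (1 - p))          # 67.5 and 7.5
exact = sum(comb(n, x) * p ** x * (1 - p) ** (n - x)
            for x in range(n + 1) if abs(x - mu) >= 5 * sigma)
print(exact, "<=", 1 / 5 ** 2)                    # exact tail probability is far below the bound 0.04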
Example 15 (Exercise 4.44 of your text)
In one out of 6 cases, material for bullet proof vests fails to meet puncture standards. If
405 specimens are tested, what does Chebyshev's theorem tell us about the probability of getting
at most 30 or at least 105 cases that do not meet puncture standards?
Here μ = np = 405 × 1/6 = 135/2
σ² = npq = 405 × 1/6 × 5/6, ∴ σ = 15/2.
Let X = number of cases out of 405 that do not meet puncture standards.
Required: P(X ≤ 30 or X ≥ 105).
Now X ≤ 30 ⟹ X − μ ≤ −75/2
X ≥ 105 ⟹ X − μ ≥ 75/2.
Thus X ≤ 30 or X ≥ 105 ⟹ |X − μ| ≥ 75/2 = 5σ.
∴ P(X ≤ 30 or X ≥ 105) = P(|X − μ| ≥ 5σ) ≤ 1/5² = 1/25 = 0.04.
Example 16 (Exercise 4.46 of your text)
How many times do we have to flip a balanced coin to be able to assert with a probability of at
most 0.01 that the difference between the proportion of tails and 0.50 will be at least
0.04?
Solution:
Suppose we flip the coin n times and suppose X is the number of tails obtained. Thus the
proportion of tails = X/n (number of tails / total number of flips). We must find n so that
P(|X/n − 0.50| ≥ 0.04) ≤ 0.01.
Now X = number of tails among n flips of a balanced coin is a rv having a binomial distribution
with parameters n and 0.5.
Hence μ = E(X) = np = 0.50 n
σ = √(npq) = 0.50 √n  (as p = q = 0.50).
Now |X/n − 0.50| ≥ 0.04 is equivalent to |X − 0.50 n| ≥ 0.04 n.
We know P(|X − μ| ≥ kσ) ≤ 1/k².
Here kσ = 0.04 n, so k = 0.04 n / (0.50 √n) = 0.08 √n.
∴ P(|X/n − 0.50| ≥ 0.04) = P(|X − μ| ≥ kσ) ≤ 1/k² ≤ 0.01
if k² ≥ 1/0.01 = 100, i.e. if 0.08 √n ≥ 10, or
n ≥ (10/0.08)² = 15625.
Law of Large Numbers
Suppose a factory manufactures items. Suppose there is a constant probability p that an item is
defective. Suppose we choose n items at random and let X be the number of defectives found.
Then X is a rv having a binomial distribution with parameters n and p.
∴ mean μ = E(X) = np, variance σ² = npq.
Let ε be any number > 0. Now
P(|X/n − p| ≥ ε) = P(|X − np| ≥ nε) = P(|X − μ| ≥ kσ)  (where kσ = nε)
≤ 1/k²  (by Chebyshev's theorem)
= σ²/(n²ε²) = npq/(n²ε²) = pq/(nε²) → 0 as n → ∞.
Thus we can say that the probability that the proportion of defective items differs from the
actual probability p by at least any positive number ε tends to 0 as n → ∞. (This is called the Law of Large
Numbers.)
This means "most of the time" the proportion of defectives will be close to the actual
(unknown) probability p that an item is defective, for large n. So we can estimate p by X/n, the
(sample) proportion of defectives.
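A small simulation (illustrative; the seed and sample sizes are arbitrary) shows the law of large numbers at work: the sample proportion of defectives settles near the true p as n grows.

import random

random.seed(1)
p = 0.1                                      # true (usually unknown) probability of a defective
for n in (100, 10_000, 1_000_000):
    defectives = sum(random.random() < p for _ in range(n))
    print(n, defectives / n)                 # proportions settle near 0.1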
POISSON DISTRIBUTION
A random variable X is said to have a Poisson distribution with parameter λ > 0 if its
probability distribution is given by
P(X = x) = f(x; λ) = e^(−λ) λ^x / x!,  x = 0, 1, 2, ….
We can easily show: mean of X = μ = λ and variance of X = σ² = λ.
Also, P(X = x) is largest when x = λ − 1 and x = λ if λ is an integer, and when x = [λ] = the
greatest integer ≤ λ (when λ is not an integer). Also note that P(X = x) → 0 as x → ∞.
POISSON APPROXIMATION TO BINOMIAL DISTRIBUTION
Suppose X is a rv having a binomial distribution with parameters n and p. We can easily
show that b(x; n, p) = P(X = x) → f(x; λ) as n → ∞ in such a way that np remains a constant
λ.
Hence for n large and p small, the binomial probability b(x; n, p) can be approximated by the
Poisson probability f(x; λ) where λ = np.
Example 17
b(3; 100, 0.03) ≈ f(3; 3) = e^(−3) 3³ / 3!
Example 18 (Exercise 4.54 of your text)
If 0.8% of the fuses delivered to an arsenal are defective, use the Poisson approximation
to determine the probability that 4 fuses will be defective in a random sample of 400.
Solution
If X is the number of defectives in a sample of 400, X has the binomial distribution with
parameters n = 400 and p = 0.8% = 0.008.
Thus P(4 out of 400 are defective)
= b(4; 400, 0.008) ≈ f(4; λ), where λ = 400 × 0.008 = 3.2
= e^(−3.2) (3.2)⁴ / 4!
= 0.781 − 0.603 (from Table 2 at the end of the text)
= 0.178
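An illustrative comparison, in Python, of the exact binomial probability of Example 18 with its Poisson approximation:

from math import comb, exp, factorial

def b(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def f(x, lam):
    return exp(-lam) * lam ** x / factorial(x)

print(b(4, 400, 0.008))   # exact binomial, about 0.179
print(f(4, 3.2))          # Poisson approximation with lambda = np = 3.2, about 0.178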
Cumulative Poisson Distribution Function
If X is a rv having a Poisson distribution with parameter λ, the cumulative Poisson probability is
F(x; λ) = P(X ≤ x) = Σ (k = 0 to x) P(X = k) = Σ (k = 0 to x) f(k; λ).
For various x and λ, F(x; λ) has been tabulated in Table 2 (of your text book, pages 581
to 585). We use Table 2 as follows:
f(x; λ) = P(X = x) = P(X ≤ x) − P(X ≤ x − 1) = F(x; λ) − F(x − 1; λ).
Thus f(4; 3.2) = F(4; 3.2) − F(3; 3.2) = 0.781 − 0.603 = 0.178.
Poisson Process
There are many situations in which events occur randomly over time.
For example, in a time period t, let Xt be the number of accidents at a busy road junction
in New Delhi; or Xt the number of calls received at a telephone exchange; or Xt the
number of radioactive particles emitted by a radioactive source. In all such examples
we find Xt is a discrete rv which can take the non-negative integral values 0, 1, 2, …. The important
thing to note is that all such random variables have the "same" distribution except that the
parameter(s) depend on the time t.
The collection of random variables Xt (t > 0) is said to constitute a random process. If
each Xt has a Poisson distribution, we say (Xt) is a Poisson process. Now we show that
the rvs Xt which count the number of occurrences of a random phenomenon in a time
period t constitute a Poisson process under suitable assumptions. Suppose in a time
period t, a random phenomenon which we call "success" occurs. We let Xt = number of
successes in the time period t. We assume:
1. In a small time period Δt, either no success or one success occurs.
2. The probability of a success in a small time period Δt is proportional to Δt, i.e.
P(X_Δt = 1) = α Δt (α → constant of proportionality).
3. The probability of a success during any time period does not depend on what
happened prior to that period.
Divide the time period t into n small time periods each of length Δt. Hence by the
assumptions above, Xt = number of successes in time period t is a rv having a
binomial distribution with parameters n and p = α Δt. Hence
P(Xt = x) = b(x; n, α Δt) → f(x; λ) as n → ∞, where λ = n α Δt = α t.
So we can say that Xt = number of successes in time period t is a rv having a Poisson
distribution with parameter α t.
Meaning of the proportionality constant α
Since the mean of Xt is λ = α t, we find that α = mean number of successes in unit time.
(Note: For a more rigorous derivation of the distribution of Xt, you may see Meyer,
Introductory Probability and Statistical Applications, pages 165-169.)
Example 19 (Exercise 4.56 of your text)
Given that the switch board of a consultant’s office receives on the average 0.6 call per
minute, find the probability that
(a) In a given minute there will be at least one call.
(b) In a 4-minute interval, there will be at least 3 calls.
Solution
Xt = number of calls in a t-minute interval is a rv having a Poisson distribution with parameter
α t = 0.6 t.
(a) P(X1 ≥ 1) = 1 − P(X1 = 0) = 1 − e^(−0.6) = 1 − 0.549 = 0.451.
(b) P(X4 ≥ 3) = 1 − P(X4 ≤ 2) = 1 − F(2; 2.4) = 1 − 0.570 = 0.430.
Example 20
Suppose that Xt, the number of particles emitted in t hours from a radioactive source,
has a Poisson distribution with parameter 20t. What is the probability that exactly 5
particles are emitted during a 15-minute period?
Solution
15 minutes = 1/4 hour. Hence, if X(1/4) = number of particles emitted in 1/4 hour,
P(X(1/4) = 5) = e^(−(1/4)×20) ((1/4) × 20)⁵ / 5! = e^(−5) 5⁵ / 5!
= 0.616 − 0.440 = 0.176 (from Table 2).
THE GEOMETRIC DISTRIBUTION
Suppose there is a random experiment having only two possible outcomes, called
'success' and 'failure'. Assume that the probability of a success in any one 'trial' (repetition
of the experiment) is p and remains the same for all trials. Also assume the trials are
independent. The experiment is repeated till a success is obtained. Let X be the rv that counts
the number of trials needed to get the 1st success. Clearly X = x if the first (x − 1) trials
were failures and the xth trial gave the first success. Hence
P(X = x) = g(x; p) = (1 − p)^(x−1) p = q^(x−1) p  (x = 1, 2, …).
We say X has a geometric distribution with parameter p (as the respective probabilities
form a geometric progression with common ratio q).
We can show the mean of this distribution is μ = 1/p and the variance is σ² = q/p².
(For example, suppose a die is rolled till a 6 is obtained. It is reasonable to expect that on average
we will need 1/(1/6) = 6 rolls, as there are 6 numbers!)
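An illustrative Python sketch of g(x; p); the truncated sum below numerically confirms that the mean is close to 1/p = 6 for p = 1/6:

def g(x, p):
    return (1 - p) ** (x - 1) * p

p = 1 / 6
mean = sum(x * g(x, p) for x in range(1, 1000))   # truncated sum, very close to 1/p = 6
print(g(1, p), g(2, p), mean)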
Example 21 (Exercise 4.60 of your text)
An expert hits a target 95% of the time. What is the probability that the expert will miss
the target for the first time on the fifteenth shot?
Solution
Here 'success' means the expert misses the target. Hence p = P(success) = 5% = 0.05. If
X is the rv that counts the number of shots needed to get 'a success', we want
P(X = 15) = q^14 p = (0.95)^14 × 0.05.
Example 22
The probability of a successful rocket launching is 0.8. Launching attempts are made till
a successful launching has occurred. Find the probability that exactly 6 attempts will be
necessary.
Solution: (0.2)^5 × 0.8
Example 23
X has a geometric distribution with parameter p. Show that
(a) P(X ≥ r) = q^(r−1),  r = 1, 2, …
(b) P(X ≥ s + t | X > s) = P(X ≥ t)
Solution
(a) P(X ≥ r) = Σ (x = r to ∞) q^(x−1) p = p q^(r−1)/(1 − q) = q^(r−1).
(b) P(X ≥ s + t | X > s) = P(X ≥ s + t)/P(X > s) = q^(s+t−1)/q^s = q^(t−1) = P(X ≥ t).
Application to Queuing Systems
Application to Queuing Systems
[Diagram: customers arrive in a Poisson fashion at a service facility S and depart after service.]
There is a service facility. Customers arrive in a random fashion and get service if the
server is idle. Else they stand in a Queue and wait to get service.
Examples of Queuing systems
1. Cars arriving at a petrol pump to get petrol
2. Men arriving at a Barber’s shop to get hair cut.
3. Ships arriving at a port to deliver goods.
Questions that one can ask are :
1. At any point of time on an average how many customers are in the system
(getting service and waiting to get service)?
2. What is the mean time a customer waits in the system?
3. What proportion of time a server is idle? And so on.
We shall consider only the simplest queueing system where there is only one server. We
assume that the population of customers is infinite and that there is no limit on the
number of customers that can wait in the queue.
We also assume that the customers arrive in a 'Poisson fashion' at a mean rate of α.
This means that Xt, the number of customers that arrive in a time period t, is a rv having a
Poisson distribution with parameter α t. We also assume that, so long as the service
station is not empty, customers depart in a Poisson fashion at a mean rate of β. This
means that, when there is at least one customer, Yt, the number of customers that depart
(after getting service) in a time period t, is a r.v. having a Poisson distribution with
parameter β t (where β > α).
Further assumptions are: in a small time interval Δt, there will be a single arrival or a
single departure but not both. (Note that by the assumptions of a Poisson process, in a small
time interval Δt there can be at most one arrival and at most one departure.) Let Nt be the
number of customers in the system at time t. Let pn(t) = P(Nt = n). We make another
assumption:
pn(t) → πn as t → ∞. πn is known as the steady-state probability distribution of the
number of customers in the system. It can be shown that
πn = (1 − α/β)(α/β)^n,  n = 0, 1, 2, ….
Thus L = mean number of customers in the system (getting service and waiting to get
service)
= Σ (n = 0 to ∞) n πn = α/(β − α).
Lq = mean number of customers in the queue (waiting to get service)
= Σ (n = 1 to ∞) (n − 1) πn = α²/(β(β − α)) = L − α/β.
W = mean time a customer spends in the system
= 1/(β − α) = L/α.
Wq = mean time a customer spends in the queue
= α/(β(β − α)) = W − 1/β = Lq/α.
(For a derivation of these results, see Operations Research Vol. 3 by Dr. S.
Venkateswaran and Dr. B. Singh, EDD Notes of BITS, Pilani.)
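The formulas above are easy to package as a small function; the following sketch is illustrative (the name mm1 is my own) and reproduces the numbers of Example 24 below:

def mm1(arrival_rate, service_rate):
    a, b = arrival_rate, service_rate      # alpha and beta, with b > a
    L  = a / (b - a)                       # mean number in the system
    Lq = a * a / (b * (b - a))             # mean number in the queue
    W  = 1 / (b - a)                       # mean time in the system
    Wq = a / (b * (b - a))                 # mean time in the queue
    return L, Lq, W, Wq

print(mm1(2, 3))   # Example 24 below: (2.0, 1.333..., 1.0, 0.666...)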
Example 24 (Exercise 4.64 of your text)
Trucks arrive at a receiving dock in a Poisson fashion at a mean rate of 2 per hour. The
trucks can be unloaded at a mean rate of 3 per hour in a Poisson fashion (so long as the
receiving dock is not empty).
(a) What is the average number of trucks being unloaded and waiting to get
unloaded?
(b) What is the mean no of trucks in the queue?
(c) What is the mean time a truck spends waiting in the queue?
(d) What is the prob that there are no trucks waiting to be unloaded?
(e) What is the prob that an arriving truck need not wait to get unloaded?
Solution
Here α = arrival rate = 2 per hour,
β = departure (unloading) rate = 3 per hour.
Thus
(a) L = α/(β − α) = 2/(3 − 2) = 2
(b) Lq = α²/(β(β − α)) = 2²/(3 × 1) = 4/3
(c) Wq = α/(β(β − α)) = 2/3 hr
(d) P(no trucks are waiting to be unloaded)
= P(number of trucks in the dock is 0 or 1)
= π0 + π1 = (1 − α/β) + (1 − α/β)(α/β) = (1 − 2/3) + (1 − 2/3)(2/3)
= 1/3 + 2/9 = 5/9
(e) P(an arriving truck need not wait)
= P(dock is empty)
= π0 = 1/3
Example 25
With reference to example 24, suppose that the cost of keeping a truck in the system is
Rs. 15/hour. If it were possible to increase the mean loading rate to 3.5 trucks per hour at
a cost of Rs. 12 per hour, would this be worth while?
Solution
In the old scheme, α = 2, β = 3, L = 2.
∴ Mean cost per hour to the dock = 2 × 15 = Rs. 30/hr.
In the new scheme, α = 2, β = 3.5, L = 4/3 (verify!).
∴ Net cost per hour to the dock = (4/3) × 15 + 12 = Rs. 32/hr.
Hence it is not worthwhile to go in for the new scheme.
MULTINOMIAL DISTRIBUTION
Consider a random experiment E and suppose it has k possible outcomes A1, A2, …, Ak.
Suppose P(Ai) = pi for all i and that pi remains the same for all independent repetitions
of E. Consider n independent repetitions of E. Suppose A1 occurs X1 times, A2 occurs X2
times, …, Ak occurs Xk times. Then
P(X1 = x1, X2 = x2, …, Xk = xk) = [n!/(x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk
for all non-negative integers x1, x2, …, xk with x1 + x2 + … + xk = n.
Proof. The probability of getting A1 x1 times, A2 x2 times, …, Ak xk times in any one order
is p1^x1 p2^x2 … pk^xk, as all the repetitions are independent. Now among the n repetitions,
A1 can occur x1 times in
nCx1 = n!/(x1!(n − x1)!) ways.
From the remaining n − x1 repetitions, A2 can occur x2 times in
(n − x1)Cx2 = (n − x1)!/(x2!(n − x1 − x2)!) ways, and so on.
Hence the total number of ways of getting A1 x1 times, A2 x2 times, …, Ak xk times is
[n!/(x1!(n − x1)!)] × [(n − x1)!/(x2!(n − x1 − x2)!)] × … × [(n − x1 − … − x(k−1))!/(xk! 0!)]
= n!/(x1! x2! … xk!),  as x1 + x2 + … + xk = n and 0! = 1.
Hence P(X1 = x1, X2 = x2, …, Xk = xk) = [n!/(x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk.
Example 26
A die is rolled 30 times. Find the probability of getting '1' 2 times, '2' 3 times, '3' 4 times,
'4' 6 times, '5' 7 times and '6' 8 times.
Ans: [30!/(2! 3! 4! 6! 7! 8!)] × (1/6)² (1/6)³ (1/6)⁴ (1/6)⁶ (1/6)⁷ (1/6)⁸
Example 27 (See exercise 4.72 of your text)
The probabilities are, respectively, 0.40, 0.40, and 0.20 that in city driving a certain type
of imported car will average less than 10 kms per litre, anywhere between 10 and 15 kms
per litre, or more than 15 kms per litre. Find the probability that among 12 such cars
tested, 4 will average less than 10 kms per litre, 6 will average anywhere from 10 to 15
kms per litre and 2 will average more than 15 kms per litre.
Solution
[12!/(4! 6! 2!)] (0.40)^4 (0.40)^6 (0.20)^2
Remark
1. Note that the different probabilities are the various terms in the expansion of the
multinomial (p_1 + p_2 + ... + p_k)^n.
Hence the name multinomial distribution.
2. The binomial distribution is a special case got by taking k = 2.
3. For any fixed i (1 ≤ i ≤ k), X_i (the number of times A_i occurs) is a random
variable having a binomial distribution with parameters n and p_i. Thus
E(X_i) = n p_i and V(X_i) = n p_i (1 − p_i), i = 1, 2, ..., k.
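As an illustration (this sketch is not part of the original notes), the multinomial probability above can be evaluated directly in Python; multinomial_prob is simply an assumed helper name.

# Illustrative sketch (not from the notes): P(X1=x1,...,Xk=xk) = n!/(x1!...xk!) p1^x1 ... pk^xk.
from math import factorial

def multinomial_prob(counts, probs):
    n = sum(counts)
    coeff = factorial(n)
    for x in counts:
        coeff //= factorial(x)        # build n!/(x1! x2! ... xk!)
    p = 1.0
    for x, pr in zip(counts, probs):
        p *= pr ** x
    return coeff * p

# Example 26: a die rolled 30 times
print(multinomial_prob([2, 3, 4, 6, 7, 8], [1/6] * 6))
# Example 27: the imported-car problem
print(multinomial_prob([4, 6, 2], [0.40, 0.40, 0.20]))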
SIMULATION
Nowadays simulation techniques are being applied to many problems in Science and
Engineering. If the processes being simulated involve an element of chance, these
techniques are referred to as Monte Carlo methods. For example to study the distribution
of number of calls arriving at a telephone exchange, we can use simulation techniques.
Random Numbers: In simulation problems one uses tables of random numbers to
“generate” random deviates (values assumed by a random variable). A table of random
numbers consists of many pages on which the digits 0, 1, 2, ..., 9 are distributed in such a
way that the probability of any one digit appearing is the same, namely 1/10 = 0.1.
Use of random numbers to generate ‘heads’ and ‘tails’: For example, choose the 4th
column of the fourth page of Table 7, start at the top and go down the page. Thus we get
6,2,7,5,5,0,1,8,6,3….. Now we can interpret this as H,H,T,T,T,H,T,H,H,T, because the
prob of getting an odd no. = the prob of getting an even number = 0.5. Thus we associate
a head to the occurrence of an even number and a tail to that of an odd no. We can also
associate a head if we get 5, 6, 7, 8 or 9 and a tail otherwise. Then we can say we got
H,T,H,H,H,T,T,H,H,T….. In problems on simulation we shall adopt the second scheme,
as it is easy to use and is easily ‘extendable’ for more than two outcomes. Suppose, for
example, we have an experiment having 4 outcomes with prob. 0.1, 0.2, 0.3 and 0.4
respectively.
Thus to simulate the above experiment, we have to allot one of the 10 digits 0,1….9 to
the first outcome, two of them to the second outcome, three of them to the third outcome
and the remaining four to the fourth outcome. Though this can be done in a variety of
ways, we choose the simplest way as follows:
Associate the first digit 0 to the 1st outcome O_1,
associate the next 2 digits 1, 2 to the 2nd outcome O_2,
associate the next 3 digits 3, 4, 5 to the 3rd outcome O_3,
and associate the last 4 digits 6, 7, 8, 9 to the 4th outcome O_4.
Hence the above sequence 6,2,7,5,5,0,1,8,6,3… of random numbers would correspond to
the sequence of outcomes O_4, O_2, O_4, O_3, O_3, O_1, O_2, O_4, O_4, O_3, …
Using Two- and Higher-Digit Random Numbers in Simulation
Suppose we have a random experiment with three outcomes with probabilities 0.80, 0.15
and 0.05 respectively. How can we now use the table of random numbers to simulate this
experiment? We now read 2 digits at a time: say (starting from page 593, row 12,
column 4) 84, 71, 14, 24, 20, 31, 78, 03, … Since P(any one digit) = 1/10, P(any two
digits) = (1/10) × (1/10) = 0.01. Thus each 2-digit random number occurs with prob 0.01.
Note that there will be 100 2-digit random numbers: 00, 01, …, 10, 11, …, 20, 21, …,
98, 99. Thus we associate the first 80 numbers (00, 01, …, 79) to the first outcome, the next
15 numbers (80, 81, …, 94) to the second outcome and the last 5 numbers (95, 96, …, 99)
to the 3rd outcome. Thus the above sequence of 2-digit random numbers would simulate
the outcomes:
O_2, O_1, O_1, O_1, O_1, O_1, O_1, O_1, …
We describe the above scheme in a diagram as follows:
Outcome    Probability    Cumulative Probability*    Random Numbers**
O1         0.80           0.80                       00-79
O2         0.15           0.95                       80-94
O3         0.05           1.00                       95-99
* The cumulative prob is got by adding all the probabilities at that position and above; thus the cumulative
prob at O2 = Prob of O1 + Prob of O2 = 0.80 + 0.15 = 0.95.
** Observe that the beginning random number is 00 for the 1st outcome; for each remaining
outcome, it is one more than the ending random number of the immediately preceding outcome.
Also, the ending random number for each outcome is “one less than the cumulative probability” (read as a two-digit number).
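As an illustration (this sketch is not part of the original notes), the scheme in the table above can be coded directly: each two-digit random number is compared against the cumulative limits; simulate is just an assumed name.

# Illustrative sketch (not from the notes): mapping two-digit random numbers to outcomes.
def simulate(two_digit_numbers, probs):
    cum, limits = 0.0, []
    for p in probs:
        cum += p
        limits.append(int(round(cum * 100)) - 1)   # last two-digit number allotted to this outcome
    outcomes = []
    for r in two_digit_numbers:
        for i, hi in enumerate(limits):
            if r <= hi:
                outcomes.append(i + 1)             # outcome O_{i+1}
                break
    return outcomes

print(simulate([84, 71, 14, 24, 20, 31, 78, 3], [0.80, 0.15, 0.05]))
# -> [2, 1, 1, 1, 1, 1, 1, 1], i.e. O2, O1, O1, ...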
Similarly three digit random numbers are used if the prob of an outcome has 3 decimal
places. Read the example on page 133 of your text book.
Exercise 4.97 on page 136
Here the number of polluting species has the following distribution, which fixes the allocation of 4-digit random numbers:
No. of polluting species    Probability    Cumulative Probability    Random Numbers
0                           0.2466         0.2466                    0000-2465
1                           0.3452         0.5918                    2466-5917
2                           0.2417         0.8335                    5918-8334
3                           0.1128         0.9463                    8335-9462
4                           0.0395         0.9858                    9463-9857
5                           0.0111         0.9969                    9858-9968
6                           0.0026         0.9995                    9969-9994
7                           0.0005         1.0000                    9995-9999
Starting with page 592, Row 14, Column 7, we read off the 4-digit random numbers and the corresponding simulated counts:
R. No.    Polluting species    R. No.    Polluting species
5095      1                    2631      1
0150      0                    3033      1
8043      2                    9167      3
9079      3                    4998      1
6440      2                    7036      2
CONTINUOUS RANDOM VARIABLES
In many situations, we come across random variables that take all values lying in a
certain interval of the x axis.
Example
(1) The life length X of a bulb is a continuous random variable that can take all non-negative
real values.
(2) The time between two consecutive arrivals in a queuing system is a random
variable that can take all non-negative real values.
(3) The distance R of the point (where a dart hits) (from the centre) is a
continuous random variable that can take all values in the interval (0,a) where
a is the radius of the board.
It is clear that in all such cases, the probability that the random variable takes any one
particular value is meaningless. For example, when you buy a bulb, you ask the question:
what are the chances that it will work for at least 500 hours?
Probability Density function (pdf)
If X is a continuous random variable, the questions about the probability that X takes
values in an interval (a,b) are answered by defining a probability density function.
Def Let X be a continuous rv. A real function f(x) is called the prob density function of X
if
(1) f(x) ≥ 0 for all x
(2) ∫_{−∞}^{∞} f(x) dx = 1
(3) P(a ≤ X ≤ b) = ∫_a^b f(x) dx.
Condition (1) is needed as probability is always ≥ 0.
Condition (2) says that the probability of the certain event is 1.
Condition (3) says to get the prob that X takes a value between a and b, integrate the
function f(x) between a and b. (This is similar to finding the mass of a rod by integrating
its density function).
Remarks
1. P(X = a) = P(a ≤ X ≤ a) = ∫_a^a f(x) dx = 0
2. Hence P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b).
Please note that, unlike the discrete case, it is immaterial whether we include or
exclude one or both of the end points.
3. P(x ≤ X ≤ x + ∆x) ≈ f(x) ∆x
This is proved using Mean value theorem.
Definition (Cumulative Distribution Function)
If X is a continuous rv and if f(x) is its density,
P(X ≤ x) = P(−∞ < X ≤ x) = ∫_{−∞}^{x} f(t) dt
We denote the above by F(x) and call it the cumulative distribution function (cdf) of X.
Properties of cdf
1. 0 ≤ F(x) ≤ 1 for all x.
2. x_1 < x_2 ⇒ F(x_1) ≤ F(x_2), i.e., F(x) is a non-decreasing function of x.
3. F(−∞) = lim_{x→−∞} F(x) = 0; F(+∞) = lim_{x→∞} F(x) = 1.
4. (d/dx) F(x) = (d/dx) ∫_{−∞}^{x} f(t) dt = f(x)
(Thus we can get the density function f(x) by differentiating the distribution function F(x)).
Example 1 (Exercise 5.2 of your book)
If the prob density of a rv is given by f(x) = kx², 0 < x < 1 (and 0 elsewhere), find the
value of k and the probability that the rv takes on a value
(a) between 1/4 and 3/4
(b) greater than 2/3.
Find the distribution function F(x) and hence answer the above questions.
Solution
∫_{−∞}^{∞} f(x) dx = 1 gives
∫_0^1 kx² dx = 1 (as f(x) = 0 if x < 0 or x > 1),
i.e. k/3 = 1, or k = 3.
Thus f(x) = 3x², 0 ≤ x ≤ 1, and 0 otherwise.
P(1/4 < X < 3/4) = ∫_{1/4}^{3/4} 3x² dx = (3/4)³ − (1/4)³ = 27/64 − 1/64 = 26/64 = 13/32
P(X > 2/3) = ∫_{2/3}^{1} 3x² dx = 1 − (2/3)³ = 1 − 8/27 = 19/27
Distribution function: F(x) = ∫_{−∞}^{x} f(t) dt
Case (i) x ≤ 0. In this case f(t) = 0 between −∞ and x, ∴ F(x) = 0.
Case (ii) 0 < x ≤ 1. In this case f(t) = 3t² between 0 and x, and 0 for t < 0.
∴ F(x) = ∫_{−∞}^{x} f(t) dt = ∫_0^x 3t² dt = x³.
Case (iii) x > 1. Now f(t) = 0 for t > 1.
∴ F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{1} f(t) dt = 1 (by case (ii)).
Hence the distribution function is
F(x) = 0 for x ≤ 0
     = x³ for 0 < x ≤ 1
     = 1 for x > 1
Now P(1/4 < X < 3/4) = P(X ≤ 3/4) − P(X ≤ 1/4)
= F(3/4) − F(1/4) = (3/4)³ − (1/4)³ = 13/32
P(X > 2/3) = 1 − P(X ≤ 2/3) = 1 − F(2/3) = 1 − (2/3)³ = 1 − 8/27 = 19/27
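As a quick numerical check of Example 1 (this sketch is not part of the original notes), the cdf F(x) = x³ can be evaluated in Python and the two probabilities recovered from it.

# Illustrative check (not from the notes): with f(x) = 3x^2 on (0,1), F(x) = x^3.
F = lambda x: 0.0 if x <= 0 else (x**3 if x <= 1 else 1.0)

print(F(0.75) - F(0.25))    # P(1/4 < X < 3/4) = 13/32 = 0.40625
print(1 - F(2/3))           # P(X > 2/3) = 19/27 ≈ 0.7037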
Example 2 (Exercise 5.4 of your book)
The prob density of a rv X is given by
f(x) = x for 0 < x < 1
     = 2 − x for 1 ≤ x < 2
     = 0 elsewhere
Find the prob that the rv takes a value
(a) between 0.2 and 0.8
(b) between 0.6 and 1.2
Find the distribution function and answer the same questions.
Solution
(a) P(0.2 < X < 0.8) = ∫_{0.2}^{0.8} f(x) dx = ∫_{0.2}^{0.8} x dx = (0.8)²/2 − (0.2)²/2 = 0.3
(b) P(0.6 < X < 1.2) = ∫_{0.6}^{1.2} f(x) dx = ∫_{0.6}^{1} f(x) dx + ∫_{1}^{1.2} f(x) dx (why?)
= ∫_{0.6}^{1} x dx + ∫_{1}^{1.2} (2 − x) dx = [x²/2]_{0.6}^{1} + [−(2 − x)²/2]_{1}^{1.2}
= 0.32 + [1/2 − (0.8)²/2] = 0.32 + 0.18 = 0.5
To find the distribution function F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt:
Case (i) x ≤ 0. In this case f(t) = 0 for t ≤ x,
∴ F(x) = ∫_{−∞}^{x} f(t) dt = 0.
Case (ii) 0 < x ≤ 1. In this case f(t) = 0 for t ≤ 0 and f(t) = t for 0 < t ≤ x.
Hence F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{0} f(t) dt + ∫_{0}^{x} f(t) dt = 0 + ∫_0^x t dt = x²/2
Case (iii) 1 < x ≤ 2. In this case f(t) = 0 for t ≤ 0, f(t) = t for 0 < t ≤ 1 and f(t) = 2 − t for 1 < t ≤ x.
∴ F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{1} f(t) dt + ∫_{1}^{x} f(t) dt
= 1/2 (by case (ii)) + ∫_1^x (2 − t) dt
= 1/2 + [1/2 − (2 − x)²/2] = 1 − (2 − x)²/2
Case (iv) x > 2. In this case f(t) = 0 for 2 < t < x.
∴ F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{2} f(t) dt + ∫_{2}^{x} f(t) dt = 1 + 0 = 1 (by case (iii)).
Thus
F(x) = 0 for x ≤ 0
     = x²/2 for 0 < x ≤ 1
     = 1 − (2 − x)²/2 for 1 < x ≤ 2
     = 1 for x > 2
∴ P(0.6 < X < 1.2) = P(X ≤ 1.2) − P(X ≤ 0.6)
= F(1.2) − F(0.6) = [1 − (0.8)²/2] − (0.6)²/2 = 0.68 − 0.18 = 0.5
P(X > 1.8) = 1 − P(X ≤ 1.8) = 1 − F(1.8) = 1 − [1 − (0.2)²/2] = 0.02
The Mean and Variance of a Continuous r.v.
Let X be a continuous rv with density f(x).
We define its mean as
μ = E(X) = ∫_{−∞}^{∞} x f(x) dx
We define its variance σ² as
σ² = E(X − μ)² = ∫_{−∞}^{∞} (x − μ)² f(x) dx = E(X²) − μ²
Here E(X²) = ∫_{−∞}^{∞} x² f(x) dx.
Example 3 The density of a rv X is
f(x) = 3x², 0 < x < 1 (and 0 elsewhere).
Its mean μ = E(X) = ∫ x f(x) dx = ∫_0^1 x . 3x² dx = 3/4.
E(X²) = ∫ x² f(x) dx = ∫_0^1 x² . 3x² dx = 3/5
Hence σ² = 3/5 − (3/4)² = 0.0375
Hence its sd is σ = 0.1936.
Example 4 The density of a rv X is
f(x) = (1/20) e^{−x/20} for x > 0, and 0 elsewhere.
μ = E(X) = ∫ x f(x) dx = ∫_0^∞ x (1/20) e^{−x/20} dx
Integrating by parts we get
μ = [−x e^{−x/20} − 20 e^{−x/20}]_0^∞ = 20.
E(X²) = ∫ x² f(x) dx = (1/20) ∫_0^∞ x² e^{−x/20} dx
On integrating by parts we get
E(X²) = (1/20)[−20 x² e^{−x/20} − 800 x e^{−x/20} − 16000 e^{−x/20}]_0^∞ = 16000/20 = 800
∴ σ² = E(X²) − μ² = 800 − 400 = 400, ∴ σ = 20.
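A quick Monte Carlo check of Example 4 can be run in Python (this sketch is not part of the original notes); random.expovariate generates values from exactly this kind of density, so the sample mean and sample s.d. should both come out near 20.

# Illustrative check (not from the notes): for f(x) = (1/20)e^{-x/20}, mean = s.d. = 20.
import random

random.seed(1)
sample = [random.expovariate(1 / 20) for _ in range(100_000)]   # mean 20
m = sum(sample) / len(sample)
var = sum((x - m) ** 2 for x in sample) / len(sample)
print(round(m, 2), round(var ** 0.5, 2))   # both close to 20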
NORMAL DISTRIBUTION
A random variable X is said to have the normal distribution (or Gaussian Distribution) if
its density is
f(x; μ, σ) = [1/(σ√(2π))] e^{−(x−μ)²/(2σ²)},  −∞ < x < ∞.
Here μ, σ are fixed (called parameters) and σ > 0. The graph of the normal density is
a bell shaped curve:
Figure
It is symmetrical about the line x = μ and has points of inflection at x = μ ± σ.
One can use integration and show that ∫_{−∞}^{∞} f(x) dx = 1. We also see that E(X) = μ and
the variance of X is E(X − μ)² = σ².
If μ = 0, σ = 1, we say that X has the standard normal distribution. We usually use the
symbol Z to denote the variable having the standard normal distribution. Thus when Z is
standard normal, its density is f(z) = [1/√(2π)] e^{−z²/2}, −∞ < z < ∞.
The cumulative distribution function of Z is
F(z) = P(Z ≤ z) = [1/√(2π)] ∫_{−∞}^{z} e^{−t²/2} dt
and represents the area under the density up to z. It is the shaded portion in the figure.
Figure
We at once see from the symmetry of the graph that F(0) = 1/2 = 0.5 and
F(−z) = 1 − F(z).
F(z) for various positive z has been tabulated in Table 3 (at the end of your book).
We thus see from Table 3 that
F(0.37) = 0.6443, F(1.645) = 0.95,
F(2.33) = 0.99, and F(z) ≈ 1 for z ≥ 3.
Hence F(−0.37) = 1 − 0.6443 = 0.3557,
F(−1.645) = 1 − 0.95 = 0.05, etc.
Definition of z_α
If Z is standard normal, we define z_α to be that number such that
P(Z > z_α) = α, or F(z_α) = 1 − α.
Since F(1.645) = 0.95 = 1 − 0.05, we see that z_{0.05} = 1.645.
Similarly z_{0.01} = 2.33.
We also note z_{1−α} = −z_α.
Thus z_{0.95} = −z_{0.05} = −1.645,
z_{0.99} = −z_{0.01} = −2.33.
Important
If X is normal with mean μ and variance σ², it can be shown that the standardized r.v.
Z = (X − μ)/σ has the standard normal distribution. Thus questions about the prob that X
assumes a value between say a and b can be translated into the prob that Z assumes
values in a corresponding range. Specifically:
P(a < X < b) = P((a − μ)/σ < (X − μ)/σ < (b − μ)/σ)
= P((a − μ)/σ < Z < (b − μ)/σ)
= F((b − μ)/σ) − F((a − μ)/σ)
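As an illustration (this sketch is not part of the original notes), the standard normal cdf F(z) can be written with the error function available in Python's math module, and the standardization above then gives P(a < X < b); the function name prob_between is only an assumed name.

# Illustrative sketch (not from the notes): F(z) via erf, and P(a < X < b) by standardizing.
from math import erf, sqrt

def F(z):                      # standard normal cdf
    return 0.5 * (1 + erf(z / sqrt(2)))

def prob_between(a, b, mu, sigma):
    return F((b - mu) / sigma) - F((a - mu) / sigma)

print(round(F(1.645), 4))                               # about 0.95, as in Table 3
print(round(prob_between(13.6, 18.8, 16.2, 1.25), 4))   # part (c) of the next example, about 0.9624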
Example 1 (See Exercise 5.24 on page 152)
Given that X has a normal distribution with mean μ = 16.2 and variance σ² = 1.5625,
find the prob that it will take on a value
(a) > 16.8
(b) < 14.9
(c) between 13.6 and 18.8
(d) between 16.5 and 16.7
Here σ = √1.5625 = 1.25.
(a) P(X > 16.8) = P((X − μ)/σ > (16.8 − 16.2)/1.25) = P(Z > 0.6/1.25) = P(Z > 0.48)
= 1 − P(Z ≤ 0.48) = 1 − F(0.48) = 1 − 0.6844 = 0.3156
(b) P(X < 14.9) = P((X − μ)/σ < (14.9 − 16.2)/1.25) = P(Z < −1.3/1.25) = P(Z < −1.04)
= F(−1.04) = 1 − F(1.04) = 1 − 0.8508 = 0.1492
(c) P(13.6 < X < 18.8) = P((13.6 − 16.2)/1.25 < (X − μ)/σ < (18.8 − 16.2)/1.25)
= P(−2.6/1.25 < Z < 2.6/1.25) = P(−2.08 < Z < 2.08)
= F(2.08) − F(−2.08) = F(2.08) − [1 − F(2.08)] = 2F(2.08) − 1
= 2 × 0.9812 − 1 = 0.9624
(Note that P(−c < Z < c) = 2F(c) − 1 for c > 0.)
(d) P(16.5 < X < 16.7) = P((16.5 − 16.2)/1.25 < Z < (16.7 − 16.2)/1.25)
= P(0.24 < Z < 0.4) = F(0.4) − F(0.24) = 0.6554 − 0.5948 = 0.0606
Example 2
A rv X has a normal distribution with σ = 10. If the prob is 0.8212 that it will take on a
value < 82.5, what is the prob that it will take on a value > 58.3?
Solution
Let the mean (unknown) be μ.
Given P(X < 82.5) = 0.8212.
Thus P((X − μ)/σ < (82.5 − μ)/10) = 0.8212,
or P(Z < (82.5 − μ)/10) = 0.8212,
i.e. F((82.5 − μ)/10) = 0.8212.
From Table 3, (82.5 − μ)/10 = 0.92,
or μ = 82.5 − 9.2 = 73.3.
Hence P(X > 58.3)
= P((X − μ)/σ > (58.3 − 73.3)/10) = P(Z > −1.5)
= 1 − P(Z ≤ −1.5) = 1 − F(−1.5) = 1 − [1 − F(1.5)] = F(1.5) = 0.9332
Example 3 (See Exercise 5.33 on page 152)
In a Photographic process the developing time of prints may be looked upon as a r.v. X
having normal distribution with 28.16=µ seconds and s.d. of 0.12 second. For which
value is the prob 0.95 that it will be exceeded by the time it takes to develop one of the
prints.
Solution
That is, we have to find a number c so that P(X > c) = 0.95,
i.e. P((X − μ)/σ > (c − 16.28)/0.12) = 0.95,
i.e. P(Z > (c − 16.28)/0.12) = 0.95.
Hence P(Z ≤ (c − 16.28)/0.12) = 0.05
∴ (c − 16.28)/0.12 = −z_{0.05} = −1.645
∴ c = 16.28 − 1.645 × 0.12 = 16.0826 seconds.
NORMAL APPROXIMATION TO BINOMIAL DISTRIBUTION
Suppose X is a r.v. having a Binomial distribution with parameters n and p. Then it can be
shown that P((X − np)/√(npq) ≤ z) → P(Z ≤ z) = F(z) as n → ∞, i.e. in words, the standardized
binomial tends to standard normal.
Thus when n is large, the binomial probabilities can be approximated using normal
distribution function.
Example 4 (See Exercise 5.36 on page 153)
A manufacturer knows that on the average 2% of the electric toasters that he makes will
require repairs within 90 days after they are sold. Use normal approximation to the
binomial distribution to determine the prob that among 1200 of these toasters at least 30
will require repairs within the first 90 days after they are sold?
Solution
Let X = No. of toasters (among 1200) that require repairs within the first 90 days after
they are sold. Hence X is a rv having Binomial Distribution with parameters n = 1200
and p = 2/100 = 0.02. Here np = 24 and √(npq) = √23.52 = 4.85.
Required P(X ≥ 30) = P((X − np)/√(npq) ≥ (30 − 24)/4.85)
≈ P(Z ≥ 1.24) = 1 − P(Z < 1.24) = 1 − F(1.24) = 1 − 0.8925 = 0.1075
Correction for Continuity
Since for continuous rvs P(Z ≥ c) = P(Z > c) (which is not true for discrete rvs), when we
approximate a binomial prob by a normal prob, we must ensure that we do not ‘lose’ the end
point. This is achieved by what we call the continuity correction: in the previous example,
P(X ≥ 30) also = P(X ≥ 29.5) (read the justification given in your book on page 150,
lines 1 to 7).
P(X ≥ 29.5) = P((X − np)/√(npq) ≥ (29.5 − 24)/4.85) = P(Z ≥ 5.5/4.85)
≈ P(Z ≥ 1.13) = 1 − P(Z ≤ 1.13) = 1 − F(1.13) = 1 − 0.8708 = 0.1292
(probably a better answer).
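As an illustration (this sketch is not part of the original notes), the exact binomial tail can be compared in Python with the continuity-corrected normal approximation used above.

# Illustrative comparison (not from the notes): exact binomial tail vs normal approximation
# with continuity correction, for n = 1200, p = 0.02.
from math import comb, erf, sqrt

n, p = 1200, 0.02
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(30, n + 1))

F = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
mu, sd = n * p, sqrt(n * p * (1 - p))
approx = 1 - F((29.5 - mu) / sd)     # continuity-corrected

print(round(exact, 4), round(approx, 4))   # both roughly 0.13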
Example 5 (See Exercise 5.38 on page 153)
A safety engineer feels that 30% of all industrial accidents in her plant are caused by
failure of employees to follow instructions. Find approximately the prob that among 84
industrial accidents anywhere from 20 to 30 (inclusive) will be due to failure of
employees to follow instructions.
Solution
Let X = no. of accidents (among 84) due to failure of employees to follow instructions.
Thus X is a rv having Binomial distribution with parameters n = 84 and p = 0.3.
Thus np = 25.2 and √(npq) = 4.2.
Required P(20 ≤ X ≤ 30)
= P(19.5 ≤ X ≤ 30.5) (continuity correction)
= P((19.5 − 25.2)/4.2 ≤ (X − np)/√(npq) ≤ (30.5 − 25.2)/4.2)
≈ P(−1.36 ≤ Z ≤ 1.26)
= F(1.26) − F(−1.36) = F(1.26) − [1 − F(1.36)]
= 0.8962 + 0.9131 − 1 = 0.8093
OTHER PROBABILITY DENSITIES
The Uniform Distribution
A r.v. X is said to have a uniform distribution over the interval (α, β) if its density is given
by
f(x) = 1/(β − α) for α < x < β, and 0 elsewhere.
Thus the graph of the density is a constant over the interval (α, β).
If α < c < d < β,
P(c < X < d) = ∫_c^d dx/(β − α) = (d − c)/(β − α)
and thus is proportional to the length of the interval (c, d).
You may verify that:
The mean of X is μ = E(X) = (α + β)/2 (the mid point of the interval (α, β)).
The variance of X is σ² = (β − α)²/12. The cumulative distribution function is
F(x) = 0 for x ≤ α
     = (x − α)/(β − α) for α < x ≤ β
     = 1 for x > β
Example 6 (See page 165, Exercise 5.46)
In certain experiments, the error X made in determining the solubility of a substance is a
rv having the uniform density with α = −0.025 and β = 0.025. What is the prob that such an
error will be
(a) between 0.010 and 0.015?
(b) between −0.012 and 0.012?
Solution
(a) P(0.010 < X < 0.015) = (0.015 − 0.010)/(0.025 − (−0.025)) = 0.005/0.050 = 0.1
(b) P(−0.012 < X < 0.012) = (0.012 − (−0.012))/(0.025 − (−0.025)) = 0.024/0.050 = 12/25 = 0.48
Example 7 (See exercise 5.47 on page 165)
From experience, Mr. Harris has found that the low bid on a construction job can be
regarded as a rv X having uniform density
f(x) = 3/(4C) for 2C/3 < x < 2C, and 0 elsewhere,
where C is his own estimate of the cost of the job. What percentage should Mr. Harris
add to his cost estimate when submitting bids to maximize his expected profit?
Solution
Suppose Mr. Harris adds k% of C when submitting his bid. Thus Mr. Harris gets a profit of
kC/100 if he gets the contract, which happens if the lowest bid (by others) ≥ C + kC/100, and
he gets no profit if the lowest bid < C + kC/100. Thus the prob that he gets the contract is
P(C + kC/100 < X < 2C) = [3/(4C)] × [2C − C − kC/100] = (3/4)(1 − k/100).
Thus the expected profit of Mr. Harris is
(kC/100) × (3/4)(1 − k/100) + 0 × [1 − (3/4)(1 − k/100)] = C(3/400)(k − k²/100),
which is maximum (by using calculus: the derivative (3C/400)(1 − k/50) vanishes) when k = 50.
Thus Mr. Harris's expected profit is a maximum when he adds 50% of C to C when
submitting bids.
Gamma Function
This is one of the most useful functions in Mathematics. If x > 0, it can be shown that the
improper integral ∫_0^∞ e^{−t} t^{x−1} dt converges to a finite real number, which we denote by Γ(x)
(capital gamma of x). Thus for all real x > 0, we define
Γ(x) = ∫_0^∞ e^{−t} t^{x−1} dt.
Properties of the Gamma Function
1. Γ(x + 1) = x Γ(x), x > 0
2. Γ(1) = 1
3. Γ(2) = 1 · Γ(1) = 1 = 1!, Γ(3) = 2 · Γ(2) = 2 = 2!
More generally Γ(n + 1) = n! whenever n is a +ve integer or zero.
4. Γ(1/2) = √π.
5. Γ(x) decreases in the interval (0,1), increases in the interval (2, ∞), and has a
minimum somewhere between 1 and 2.
THE GAMMA DISTRIBUTION
Let α, β be 2 +ve real numbers. A r.v. X is said to have a Gamma Distribution with
parameters α, β if its density is
f(x) = [1/(β^α Γ(α))] x^{α−1} e^{−x/β} for x > 0, and 0 elsewhere.
It can be shown that:
The mean of X is μ = E(X) = αβ.
(See the working on page 159 of your text book.)
The variance of X is σ² = αβ².
Exponential Distribution
If α = 1, we say X has an exponential distribution. Thus X has an exponential distribution
(with parameter β > 0) if its density is
f(x) = (1/β) e^{−x/β} for x > 0, and 0 elsewhere.
We also see easily that:
1. The mean of X is E(X) = β.
2. The variance of X is σ² = β².
3. The cumulative distribution function of X is
F(x) = 1 − e^{−x/β} for x > 0, and 0 elsewhere.
4. X has the memoryless property:
P(X > s + t | X > s) = P(X > t), s, t > 0.
Proof of (4): P(X > s) = 1 − P(X ≤ s) = 1 − F(s) = e^{−s/β} (by (3)).
P(X > s + t | X > s) = P((X > s + t) ∩ (X > s)) / P(X > s)
= P(X > s + t) / P(X > s) = e^{−(s+t)/β} / e^{−s/β} = e^{−t/β} = P(X > t). QED
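A small numeric check of the memoryless property can be done in Python (this sketch is not part of the original notes), using only the survival function P(X > x) = e^{−x/β}; the values of β, s, t below are assumed for illustration.

# Illustrative check (not from the notes): memoryless property of the exponential distribution.
from math import exp

beta = 50.0
P_gt = lambda x: exp(-x / beta)          # P(X > x)

s, t = 30.0, 20.0
lhs = P_gt(s + t) / P_gt(s)              # P(X > s+t | X > s)
rhs = P_gt(t)                            # P(X > t)
print(lhs, rhs)                          # both e^{-20/50} ≈ 0.6703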
Example 8 (See exercise 5.54 on page 166)
In a certain city, the daily consumption of electric power (in millions of kw hours) can be
treated as a r.v. X having a Gamma distribution with α = 3, β = 2. If the power plant in
the city has a daily capacity of 12 million kw hrs, what is the prob. that the power supply
will be inadequate on any given day?
Solution
The power supply will be inadequate if demand exceeds the daily capacity.
Hence the prob that the power supply is inadequate
= P(X > 12) = ∫_{12}^{∞} f(x) dx
Now as α = 3, β = 2, f(x) = [1/(2³ Γ(3))] x² e^{−x/2} = (1/16) x² e^{−x/2}, x > 0.
Hence P(X > 12) = (1/16) ∫_{12}^{∞} x² e^{−x/2} dx
Integrating by parts, we get
= (1/16)[−2x² e^{−x/2} − 8x e^{−x/2} − 16 e^{−x/2}]_{12}^{∞}
= (1/16)[2 × 144 + 8 × 12 + 16] e^{−6} = (1/16)(400) e^{−6} = 25 e^{−6} = 0.062
Example 9 (see exercise 5.58 on Page 166)
The amount of time that a surveillance camera will run without having to be reset is a r.v.
X having an exponential distribution with β = 50 days. Find the prob that such a camera
(a) will have to be reset in less than 20 days.
(b) will not have to be reset in at least 60 days.
Solution
The density of X is
f(x) = (1/50) e^{−x/50} for x > 0 (and 0 elsewhere).
(a) P(the camera has to be reset in < 20 days)
= P(the running time < 20)
= P(X < 20) = ∫_0^{20} (1/50) e^{−x/50} dx = [−e^{−x/50}]_0^{20} = 1 − e^{−2/5} = 0.3297
(b) P(the camera will not have to be reset in at least 60 days)
= P(X > 60) = ∫_{60}^{∞} (1/50) e^{−x/50} dx = [−e^{−x/50}]_{60}^{∞} = e^{−6/5} = 0.3012
Example 10 (See exercise 5.61 on page 166)
Given a Poisson process with the average α arrivals per unit time, find the prob density
of the inter arrival time (i.e the time between two consecutive arrivals).
Solution
Let T be the time between two consecutive arrivals. Thus clearly T is a continuous r.v.
with values > 0. Now T > t if and only if there is no arrival in the time period t.
Thus P(T > t) = P(X_t = 0)
(X_t = number of arrivals in the time period t)
= e^{−αt} (as X_t has a Poisson distribution with parameter λ = αt).
Hence the distribution function of T is
F(t) = P(T ≤ t) = 1 − P(T > t) = 1 − e^{−αt}, t > 0
(clearly F(t) = 0 for all t ≤ 0).
Hence the density of T is f(t) = (d/dt) F(t)
= α e^{−αt} if t > 0, and 0 elsewhere.
Hence we would say the inter-arrival time (IAT) is a continuous rv with an exponential density with
parameter β = 1/α.
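This result can be illustrated by a small simulation in Python (not part of the original notes): approximate the Poisson process by a Bernoulli trial in each tiny slice of time dt (arrival with probability α·dt) and measure the gaps between successive arrivals; their mean should be close to 1/α. The values of α, dt and the number of slices below are assumed for illustration.

# Illustrative simulation (not from the notes): gaps between arrivals of an (approximate)
# Poisson process have mean close to 1/alpha, matching the exponential density above.
import random

random.seed(2)
alpha, dt, n_slices = 2.0, 0.001, 2_000_000      # assumed rate and discretization
arrivals = [i * dt for i in range(n_slices) if random.random() < alpha * dt]
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
print(sum(gaps) / len(gaps))                     # close to 1/alpha = 0.5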
The Beta Function
If x, y > 0, the beta function B(x, y) (read capital Beta of x, y) is defined by
B(x, y) = ∫_0^1 t^{x−1} (1 − t)^{y−1} dt
It is well known that B(x, y) = Γ(x)Γ(y)/Γ(x + y), x, y > 0.
BETA DISTRIBUTION
A r.v. X is said to have a Beta distribution with parameters α, β > 0 if its density is
f(x) = [1/B(α, β)] x^{α−1} (1 − x)^{β−1} for 0 < x < 1, and 0 elsewhere.
It is easily shown that
(1) μ = E(X) = α/(α + β)
(2) σ² = V(X) = αβ/[(α + β)²(α + β + 1)]
Example 11 (See Exercise 5.64)
If the annual proportion of erroneous income tax returns can be looked upon as a rv
having a Beta distribution with α = 2, β = 9, what is the prob that in any given year
there will be fewer than 10% erroneous returns?
Solution
Let X = annual proportion of erroneous income tax returns. Thus X has a Beta density
with α = 2, β = 9.
∴ P(X < 0.1) = ∫_0^{0.1} f(x) dx (note the proportion cannot be < 0)
= [1/B(2, 9)] ∫_0^{0.1} x (1 − x)^8 dx
B(2, 9) = Γ(2)Γ(9)/Γ(11) = (1 × 8!)/10! = 1/(9 × 10) = 1/90
∫_0^{0.1} x(1 − x)^8 dx = [−x(1 − x)^9/9 − (1 − x)^{10}/90]_0^{0.1}
= −(0.1)(0.9)^9/9 − (0.9)^{10}/90 + 1/90 = 0.00293
∴ P(X < 0.1) = 90 × 0.00293 ≈ 0.264
The Log –Normal Distribution
A r.v. X is said to have a log-normal distribution if its density is
f(x) = [1/(√(2π) β x)] e^{−(ln x − α)²/(2β²)} for x > 0 (β > 0), and 0 elsewhere.
It can be shown that if X has a log-normal distribution, Y = ln X has a normal distribution
with mean μ = α and s.d. σ = β.
Thus P(a < X < b) = P(ln a < ln X < ln b)
= P((ln a − α)/β < Z < (ln b − α)/β)
= F((ln b − α)/β) − F((ln a − α)/β)
where F(z) = cdf of the standard normal variable Z.
Lengthy calculations show that if X has a log-normal distribution, its mean is E(X) = e^{α + β²/2}
and its variance is e^{2α + β²}(e^{β²} − 1).
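As an illustration (this sketch is not part of the original notes), the probability P(a < X < b) for a log-normal r.v. can be computed exactly as above from the standard normal cdf; the parameter values used below are assumed for illustration only.

# Illustrative sketch (not from the notes): P(a < X < b) for a log-normal r.v. via ln X ~ N(alpha, beta).
from math import erf, log, sqrt

F = lambda z: 0.5 * (1 + erf(z / sqrt(2)))       # standard normal cdf

def lognormal_prob(a, b, alpha, beta):
    return F((log(b) - alpha) / beta) - F((log(a) - alpha) / beta)

print(round(lognormal_prob(1.0, 3.0, alpha=0.5, beta=0.4), 4))   # assumed illustrative parameters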
More problems on Normal Distribution
Example 12
Let X be normal with mean μ and s.d. σ. Determine c as a function of μ and σ such
that
P(X ≤ c) = 2 P(X ≥ c)
Solution
P(X ≤ c) = 2 P(X ≥ c)
implies P(X ≤ c) = 2[1 − P(X < c)].
Let P(X ≤ c) = p.
Thus 3p = 2, or p = 2/3.
Now P(X ≤ c) = P((X − μ)/σ ≤ (c − μ)/σ) = F((c − μ)/σ) = 2/3 = 0.6667
implies (c − μ)/σ = 0.43 (approx., from Table 3)
∴ c = μ + 0.43σ
Example 13
Suppose X is normal with mean 0 and sd 5. Find P(1 < X² < 4).
Solution
P(1 < X² < 4) = P(1 < |X| < 2) = P(1 < X < 2) + P(−2 < X < −1)
= 2 P(1 < X < 2) (by symmetry)
= 2 P(1/5 < Z < 2/5)
= 2[F(2/5) − F(1/5)] = 2[F(0.4) − F(0.2)]
= 2(0.6554 − 0.5793) (from Table 3)
= 2 × 0.0761 = 0.1522
Example 14
The annual rain fall in a certain locality is a r.v. X having normal distribution with mean
29.5” and sd 2.5”. How many inches of rain (annually) is exceeded about 5% of the time?
Solution
That is, we have to find a number C such that
P(X > C) = 0.05,
i.e. P((X − 29.5)/2.5 > (C − 29.5)/2.5) = 0.05.
Hence (C − 29.5)/2.5 = z_{0.05} = 1.645
∴ C = 29.5 + 2.5 × 1.645 = 33.6125
Example 15
A rocket fuel is to contain a certain percent (say X) of a particular compound. The
specification calls for X to lie between 30 and 35. The manufacturer will make a net
profit on the fuel per gallon which is the following function of X:
T(X) = $0.10 per gallon if 30 < X < 35
     = $0.05 per gallon if 25 ≤ X < 30 or 35 ≤ X < 40
     = −$0.10 per gallon elsewhere.
If X has a normal distribution with mean 33 and s.d. 3, find the prob distribution of T and
hence the expected profit per gallon.
Solution
T = 0.10 if 30 < X < 35.
∴ P(T = 0.10) = P(30 < X < 35) = P((30 − 33)/3 < (X − μ)/σ < (35 − 33)/3)
= P(−1 < Z < 2/3) = F(2/3) − F(−1) = F(2/3) − [1 − F(1)]
= 0.7486 + 0.8413 − 1 = 0.5899
P(T = 0.05) = P(25 ≤ X < 30) + P(35 ≤ X < 40)
= P((25 − 33)/3 ≤ (X − μ)/σ < (30 − 33)/3) + P((35 − 33)/3 ≤ (X − μ)/σ < (40 − 33)/3)
= P(−8/3 ≤ Z < −1) + P(2/3 ≤ Z < 7/3)
= [F(−1) − F(−8/3)] + [F(7/3) − F(2/3)]
= [F(8/3) − F(1)] + [F(7/3) − F(2/3)]
= 0.9961 − 0.8413 + 0.9901 − 0.7486 = 0.3963
Hence P(T = −0.10) = 1 − 0.5899 − 0.3963 = 0.0138
Hence the expected profit = E(T)
= 0.10 × 0.5899 + 0.05 × 0.3963 + (−0.10) × 0.0138
= $0.077425
JOINT DISTRIBUTIONS – Two and higher dimensional Random
Variables
Suppose X, Y are 2 discrete rvs and suppose X can take values x_1, x_2, ... and Y can take
values y_1, y_2, .... We refer to the function f(x, y) = P(X = x, Y = y) as the joint prob
distribution of X and Y. The ordered pair (X, Y) is sometimes referred to as a two-
dimensional discrete r.v.
Example 16
Two cards are drawn at random from a pack of 52 cards. Let X be the number of aces
drawn and Y be the number of Queens drawn.
Find the joint prob distribution of X and Y.
Solution
Clearly X can take any one of the three values 0,1,2 and Y one of the three values, 0,1,2.
The joint prob distribution of X, and Y is depicted in the following 3 x 3 table
             x = 0                     x = 1                     x = 2
y = 0   C(44,2)/C(52,2)         C(4,1)C(44,1)/C(52,2)      C(4,2)/C(52,2)
y = 1   C(4,1)C(44,1)/C(52,2)   C(4,1)C(4,1)/C(52,2)       0
y = 2   C(4,2)/C(52,2)          0                          0
(Here C(n, r) denotes the number of ways of choosing r objects out of n.)
Justification
P(X = 0, Y = 0)
= P(no aces and no queens in the 2 cards)
= C(44,2)/C(52,2)
P(X = 1, Y = 0) (the entry in the 2nd col and 1st row)
= P(one ace and one other card which is neither an ace nor a queen)
= C(4,1)C(44,1)/C(52,2), etc.
Can we write down the distribution of X? X can take any one of the 3 values 0, 1, 2.
What is P(X = 0)?
X = 0 means no ace is drawn, but we might draw 2 queens, or 1 queen and one non-queen,
or 2 cards which are neither aces nor queens.
Thus
P(X = 0) = P(X = 0, Y = 0) + P(X = 0, Y = 1) + P(X = 0, Y = 2)
= C(44,2)/C(52,2) + C(4,1)C(44,1)/C(52,2) + C(4,2)/C(52,2)
= C(48,2)/C(52,2) (Verify!)
= Sum of the probabilities in the 1st column.
Similarly P(X = 1) = P(X = 1, Y = 0) + P(X = 1, Y = 1) + P(X = 1, Y = 2)
= Sum of the 3 probabilities in the 2nd column
= C(4,1)C(44,1)/C(52,2) + C(4,1)C(4,1)/C(52,2) + 0 = C(4,1)C(48,1)/C(52,2) (Verify!)
P(X = 2) = P(X = 2, Y = 0) + P(X = 2, Y = 1) + P(X = 2, Y = 2)
= Sum of the 3 probabilities in the 3rd column
= C(4,2)/C(52,2) + 0 + 0 = C(4,2)/C(52,2)
The distribution of X derived from the joint distribution of X and Y is referred to as the
marginal distribution of X.
Similarly the marginal distribution of Y is given by the 3 row totals.
Example 17
The joint prob distribution of X and Y is given by
             x = −1    x = 0    x = 1    Row total
y = −1       1/8       1/8      1/8      3/8
y = 0        1/8       0        1/8      2/8
y = 1        1/8       1/8      1/8      3/8
Col. total   3/8       2/8      3/8
Write the marginal distributions of X and Y. To get the marginal distribution of X, we find
the column totals and write them in the (bottom) margin. Thus the (marginal) distribution
of X is
X      −1     0      1
Prob   3/8    2/8    3/8
(Do you see why we call it the marginal distribution?)
Similarly, to get the marginal distribution of Y, we find the 3 row totals and write them in
the (right) margin.
Thus the marginal distribution of Y is
Y      Prob
−1     3/8
0      2/8
1      3/8
Notation: If f(x, y) = P(X = x, Y = y) is the joint prob distribution of the 2-dimensional
discrete r.v. (X, Y), we denote by g(x) the marginal distribution of X and by h(y) the
marginal distribution of Y.
Thus g(x) = P(X = x) = Σ_{all y} P(X = x, Y = y) = Σ_{all y} f(x, y)
and h(y) = P(Y = y) = Σ_{all x} P(X = x, Y = y) = Σ_{all x} f(x, y)
Conditional Distribution
The conditional prob distribution of Y for a given X = x is defined as
h(y|x) = P(Y = y | X = x) (read: the prob of Y = y given X = x)
= P(X = x, Y = y)/P(X = x) = f(x, y)/g(x)
where g(x) is the marginal distribution of X.
Thus in the above Example 17,
h(0|1) = P(Y = 0 | X = 1) = P(X = 1, Y = 0)/P(X = 1) = (1/8)/(3/8) = 1/3
Similarly, the conditional prob distribution of X for a given Y = y is defined as
g(x|y) = P(X = x | Y = y) = P(X = x, Y = y)/P(Y = y) = f(x, y)/h(y)
where h(y) is the marginal distribution of Y.
In the above example,
g(0|0) = P(X = 0 | Y = 0) = P(X = 0, Y = 0)/P(Y = 0) = 0/(2/8) = 0
Independence
We say X, Y are independent if
P(X = x, Y = y) = P(X = x) P(Y = y) for all x, y.
Thus X, Y are independent if and only if
f(x, y) = g(x) h(y) for all x and y,
which is the same as saying g(x|y) = g(x) for all x and y, which is the same as saying
h(y|x) = h(y) for all x, y.
In the above example X, Y are not independent as P(X = 0, Y = 0) ≠ P(X = 0) P(Y = 0).
Example 18
The joint prob distribution of X and Y is given by
            x = 2    x = 0    x = 1
y = 2       0.1      0.2      0.1
y = 0       0.05     0.1      0.15
y = 1       0.1      0.1      0.1
(a) Find the marginal distribution of X.
Ans
X      2       0      1
Prob   0.25    0.4    0.35
(b) Find the marginal distribution of Y.
Ans
Y      Prob
2      0.4
0      0.3
1      0.3
(c) Find P(X + Y = 2).
Ans: X + Y = 2 if (X = 2, Y = 0) or (X = 1, Y = 1) or (X = 0, Y = 2).
Thus P(X + Y = 2) = 0.05 + 0.1 + 0.2 = 0.35
(d) Find P(X − Y = 0).
Ans: X − Y = 0 if (X = 2, Y = 2) or (X = 0, Y = 0) or (X = 1, Y = 1).
∴ P(X − Y = 0) = 0.1 + 0.1 + 0.1 = 0.3
(e) Find P(X ≥ 0). Ans: 1
(f) Find P(X − Y = 0 | X ≥ 0). Ans: 0.3/1 = 0.3
(g) Find P(X − Y = 0 | X ≥ 1). Ans: 0.2/0.6 = 1/3
(h) Are X, Y independent?
Ans: No! P(X = 1, Y = 1) ≠ P(X = 1) P(Y = 1).
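As an illustration (this sketch is not part of the original notes), the marginal distributions of Example 18 can be obtained as row and column totals in Python, and independence checked by comparing f(x, y) with g(x)h(y).

# Illustrative sketch (not from the notes): marginals and an independence check for Example 18.
joint = {(2, 2): 0.1, (0, 2): 0.2, (1, 2): 0.1,
         (2, 0): 0.05, (0, 0): 0.1, (1, 0): 0.15,
         (2, 1): 0.1, (0, 1): 0.1, (1, 1): 0.1}    # keys are (x, y)

g = {}   # marginal of X (column totals)
h = {}   # marginal of Y (row totals)
for (x, y), p in joint.items():
    g[x] = g.get(x, 0) + p
    h[y] = h.get(y, 0) + p

print(g)   # approximately {2: 0.25, 0: 0.4, 1: 0.35}
print(h)   # approximately {2: 0.4, 0: 0.3, 1: 0.3}
independent = all(abs(joint[(x, y)] - g[x] * h[y]) < 1e-9 for (x, y) in joint)
print(independent)   # False, as in part (h)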
Two-Dimensional Continuous Random Variables
Let (X,Y) be a continuous 2-dimensional r.v. This means (X,Y) can take all values in a
certain region of the X,Y plane. For example, suppose a dart is thrown at a circular board
of radius 2. Then the position where the dart hits the board (X,Y) is a continuous two
dimensional r.v. as it can take all values (x, y) such that x² + y² ≤ 4.
A function f(x, y) is said to be the joint prob density of (X, Y) if
(i) f(x, y) ≥ 0 for all x, y
(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
(iii) P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx.
Example 19(a)
Let the joint prob density of (X, Y) be
f(x, y) = 1/4 for 0 ≤ x ≤ 2, 0 ≤ y ≤ 2, and 0 elsewhere.
Find P(X + Y ≤ 1).
Ans: The region x + y ≤ 1 is given by the shaded portion of the square.
∴ P(X + Y ≤ 1) = ∫_0^1 ∫_0^{1−x} (1/4) dy dx = ∫_0^1 (1/4)(1 − x) dx = (1/4)(1 − 1/2) = 1/8
Example 19(b)
The joint prob density of (X, Y) is
f(x, y) = (1/8)(6 − x − y) for 0 < x < 2, 2 < y < 4, and 0 elsewhere.
Find P(X < 1, Y < 3).
Solution
P(X < 1, Y < 3) = ∫_0^1 ∫_2^3 (1/8)(6 − x − y) dy dx
= (1/8) ∫_0^1 [(6 − x)y − y²/2]_{y=2}^{3} dx
= (1/8) ∫_0^1 (6 − x − 5/2) dx
= (1/8)[6x − x²/2 − 5x/2]_0^1 = (1/8)(6 − 1/2 − 5/2) = 3/8
Marginal and Conditional Densities
If f(x, y) is the joint prob density of the 2-dimensional continuous rv (X, Y), we define
the marginal prob density of X as
g(x) = ∫_{−∞}^{∞} f(x, y) dy
That is, fix x and integrate f(x, y) w.r.t. y.
Similarly the marginal prob density of Y is
h(y) = ∫_{−∞}^{∞} f(x, y) dx
The conditional prob density of Y for a given x is
h(y|x) = f(x, y)/g(x) (defined only for those x for which g(x) ≠ 0).
The conditional prob density of X for a given y is
g(x|y) = f(x, y)/h(y) (defined only for those y for which h(y) ≠ 0).
Independence
We say X, Y are independent if and only if f(x, y) = g(x) h(y),
which is the same as saying g(x|y) = g(x) or h(y|x) = h(y).
Example 20
Consider the density of (X, Y) as given in Example 19(b).
The marginal density of X is
g(x) = ∫_{y=2}^{4} (1/8)(6 − x − y) dy = (1/8)[(6 − x)y − y²/2]_{y=2}^{4}
= (1/8)[2(6 − x) − 6] = (1/8)(6 − 2x) for 0 < x < 2, and 0 elsewhere.
We verify this is a valid density:
g(x) = (1/8)(6 − 2x) ≥ 0 for 0 < x < 2.
Secondly, ∫_0^2 g(x) dx = (1/8) ∫_0^2 (6 − 2x) dx = (1/8)[6x − x²]_0^2 = (1/8)(12 − 4) = 1.
The marginal density of Y is
h(y) = ∫_{x=0}^{2} (1/8)(6 − x − y) dx = (1/8)[(6 − y)x − x²/2]_{x=0}^{2}
= (1/8)[2(6 − y) − 2] = (1/8)(10 − 2y) for 2 < y < 4, and 0 elsewhere.
Again h(y) ≥ 0, and ∫_2^4 h(y) dy = (1/8) ∫_2^4 (10 − 2y) dy = (1/8)[10y − y²]_2^4 = (1/8)(24 − 16) = 1.
The conditional density of Y for X = 1 is
h(y|1) = f(1, y)/g(1) = (1/8)(6 − 1 − y)/[(1/8)(6 − 2)] = (5 − y)/4 for 2 < y < 4,
and 0 elsewhere.
Again this is a valid density, as h(y|1) ≥ 0 and
∫_2^4 h(y|1) dy = ∫_2^4 (5 − y)/4 dy = (1/4)[5y − y²/2]_2^4 = (1/4)[(20 − 8) − (10 − 2)] = 1.
Now
P(X < 1 | Y < 3) = P(X < 1, Y < 3)/P(Y < 3)
Numerator = 3/8 (from Example 19(b)).
Denominator = P(Y < 3) = ∫_2^3 h(y) dy = (1/8) ∫_2^3 (10 − 2y) dy = (1/8)[10y − y²]_2^3
= (1/8)[(30 − 9) − (20 − 4)] = 5/8
Hence P(X < 1 | Y < 3) = (3/8)/(5/8) = 3/5
The Cumulative Distribution Function
Let f(x, y) be the joint density of (X, Y). We define the cumulative distribution function as
F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du.
Example 21 (See Exercise 5.77 on page 180)
The joint prob density of X and Y is given by
f(x, y) = (6/5)(x + y²) for 0 < x < 1, 0 < y < 1, and 0 elsewhere.
Find the cumulative distribution function F(x, y).
Solution
Case (i) x < 0:
F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du = 0 (as f(u, v) = 0 for any u < 0).
Case (ii) y < 0:
Again F(x, y) = 0, whatever be x.
Case (iii) 0 < x < 1, 0 < y < 1:
F(x, y) = ∫_0^x ∫_0^y (6/5)(u + v²) dv du (as f(u, v) = 0 for u < 0 or v < 0)
= (6/5) ∫_0^x (uy + y³/3) du = (6/5)(x²y/2 + xy³/3).
Case (iv) 0 < x < 1, y ≥ 1:
F(x, y) = ∫_0^x ∫_0^1 (6/5)(u + v²) dv du = (6/5) ∫_0^x (u + 1/3) du = (6/5)(x²/2 + x/3).
Case (v) x ≥ 1, 0 < y < 1: as in case (iii) we can show
F(x, y) = (6/5)(y/2 + y³/3).
Case (vi) x ≥ 1, y ≥ 1:
F(x, y) = ∫_0^1 ∫_0^1 (6/5)(u + v²) dv du = (6/5) ∫_0^1 (u + 1/3) du = (6/5)(1/2 + 1/3) = 1.
(Did you anticipate this?)
Hence
P(0.2 < X < 0.5, 0.4 < Y < 0.6)
= F(0.5, 0.6) − F(0.2, 0.6) − F(0.5, 0.4) + F(0.2, 0.4) (Why?)
= (6/5)[(0.5² − 0.2²)(0.6 − 0.4)/2 + (0.5 − 0.2)((0.6)³ − (0.4)³)/3]
= (6/5)[(0.21)(0.2)/2 + (0.3)(0.152)/3]
= (6/5)(0.021 + 0.0152) = (6/5)(0.0362) = 0.04344
Example 22
The joint density of X and Y is
f(x, y) = (6/5)(x + y²) for 0 < x < 1, 0 < y < 1, and 0 elsewhere.
(a) Find the conditional prob density g(x|y).
(b) Find g(x | 1/2).
(c) Find the mean of the conditional density of X given that Y = 1/2.
Solution
g(x|y) = f(x, y)/h(y), where h(y) is the marginal density of Y.
Thus h(y) = ∫_0^1 f(x, y) dx = ∫_0^1 (6/5)(x + y²) dx = (6/5)(1/2 + y²), 0 < y < 1.
Hence
g(x|y) = (6/5)(x + y²)/[(6/5)(1/2 + y²)] = (x + y²)/(1/2 + y²) for 0 < x < 1, and 0 elsewhere.
∴ g(x | 1/2) = (x + 1/4)/(1/2 + 1/4) = (4/3)(x + 1/4), 0 < x < 1.
Hence E(X | Y = 1/2) = ∫_0^1 x g(x | 1/2) dx = ∫_0^1 x (4/3)(x + 1/4) dx
= (4/3)[x³/3 + x²/8]_0^1 = (4/3)(1/3 + 1/8) = 4/9 + 1/6 = 11/18.
Example 23
(X, Y) has a joint density which is uniform on the rhombus with vertices (1, 0), (0, 1), (−1, 0), (0, −1). Find
(a) the marginal density of X,
(b) the marginal density of Y,
(c) the conditional density of Y given X = 1/2.
Solution
(X, Y) has uniform density on the rhombus means
f(x, y) = 1/(Area of the rhombus) = 1/2 over the rhombus, and 0 elsewhere.
(a) Marginal density of X:
Case (i) 0 < x < 1:
g(x) = ∫_{y=−(1−x)}^{1−x} (1/2) dy = 1 − x
Case (ii) −1 < x < 0:
g(x) = ∫_{y=−(1+x)}^{1+x} (1/2) dy = 1 + x
Thus
g(x) = 1 + x for −1 < x < 0
     = 1 − x for 0 < x < 1
     = 0 elsewhere.
(b) By symmetry, the marginal density of Y is
h(y) = 1 + y for −1 < y < 0
     = 1 − y for 0 < y < 1
     = 0 elsewhere.
(c) For x = 1/2, y ranges from −1/2 to 1/2.
Thus the conditional density of Y for X = 1/2 is
h(y | 1/2) = f(1/2, y)/g(1/2) = (1/2)/(1/2) = 1 for −1/2 < y < 1/2, and 0 elsewhere.
Similarly, for x = 1/3, Y ranges from −2/3 to 2/3, and
h(y | 1/3) = (1/2)/g(1/3) = (1/2)/(2/3) = 3/4 for −2/3 < y < 2/3, and 0 elsewhere.
PROPERTIES OF EXPECTATIONS
Let X be a r.v. and a, b be constants.
Then
(a) E(aX + b) = a E(X) + b
(b) Var(aX + b) = a² Var(X)
If X_1, X_2, ..., X_n are any n rvs,
E(X_1 + X_2 + ... + X_n) = E(X_1) + E(X_2) + ... + E(X_n).
But if X_1, ..., X_n are n indep rvs, then
Var(X_1 + X_2 + ... + X_n) = Var(X_1) + Var(X_2) + ... + Var(X_n).
In particular, if X, Y are independent,
Var(X + Y) = Var(X − Y) = Var(X) + Var(Y).
Please note: whether we add X and Y or subtract Y from X, we always must add their
variances.
If X, Y are two rvs, we define their covariance
COV(X, Y) = E[(X − μ_1)(Y − μ_2)], where μ_1 = E(X), μ_2 = E(Y).
Th. If X, Y are indep, E(XY) = E(X)E(Y) and COV(X, Y) = 0.
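These facts can be illustrated by a small simulation in Python (this sketch is not part of the original notes): for two independently generated samples, the estimated covariance should be close to 0 and Var(X + Y) close to Var(X) + Var(Y). The distributions chosen below are assumptions made only for the illustration.

# Illustrative simulation (not from the notes): for independent X and Y,
# COV(X, Y) ≈ 0 and Var(X + Y) ≈ Var(X) + Var(Y).
import random

random.seed(3)
N = 200_000
xs = [random.uniform(0, 1) for _ in range(N)]
ys = [random.gauss(2, 3) for _ in range(N)]

mean = lambda v: sum(v) / len(v)
var = lambda v: mean([t * t for t in v]) - mean(v) ** 2

cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(round(cov, 3))                                        # close to 0
print(round(var([x + y for x, y in zip(xs, ys)]), 3),
      round(var(xs) + var(ys), 3))                          # nearly equal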
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 

R4 m.s. radhakrishnan, probability &amp; statistics, dlpd notes.

Example 4:
Let E be the random experiment: Count the life length of a bulb produced by a factory. Here S will be {t | t ≥ 0} = [0, ∞).

Events
An event is a subset of the sample space.

Example 5:
Suppose a balanced die is rolled and we observe the number on the top. Let A be the event: an even number occurs. Thus, in symbols, A = {2, 4, 6} ⊂ S = {1, 2, 3, 4, 5, 6}.
Two events are said to be mutually exclusive if they cannot occur together; that is, there is no element common to them. In the above example, if B is the event: an odd number occurs, i.e. B = {1, 3, 5}, then A and B are mutually exclusive.

Solved Examples

Example 1:
A manufacturer of small motors is concerned with three major types of defects. If A is the event that the shaft size is too large, B is the event that the windings are improper and C is the event that the electrical connections are unsatisfactory, express in words what events are represented by the following regions of the Venn diagram given below:
(a) region 2
(b) regions 1 and 3 together
(c) regions 3, 5, 6 and 8 together.
Solution:
(a) Since this region is contained in A and B but not in C, it represents the event that the shaft is too large and the windings improper but the electrical connections are satisfactory.
(b) Since this region is common to B and C, it represents the event that the windings are improper and the electrical connections are unsatisfactory.
(c) Since this is the entire region outside A, it represents the event that the shaft size is not too large.

[Venn diagram: three intersecting circles A, B and C, with the regions numbered 1 to 8.]

Example 2:
A carton of 12 rechargeable batteries contains one that is defective. In how many ways can the inspector choose three of the batteries and
(a) get the one that is defective
(b) not get the one that is defective?

Solution:
(a) The one defective battery can be chosen in one way and two good ones can be chosen in 11C2 = 55 ways. Hence one defective and two good ones can be chosen in 1 × 55 = 55 ways.
(b) Three good ones can be chosen in 11C3 = 165 ways.
AXIOMS OF PROBABILITY

Let E be a random experiment. Suppose to each event A we associate a real number P(A) satisfying the following axioms:
(i) 0 ≤ P(A) ≤ 1
(ii) P(S) = 1
(iii) If A and B are any two mutually exclusive events, then P(A ∪ B) = P(A) + P(B)
(iv) If {A1, A2, …, An, …} is a sequence of pair-wise mutually exclusive events, then
P(A1 ∪ A2 ∪ … ∪ An ∪ …) = P(A1) + P(A2) + … + P(An) + …
We call P(A) the probability of the event A. Axiom 1 says that the probability of an event is always a number between 0 and 1. Axiom 2 says that the probability of the certain event S is 1. Axiom 3 says that probability is an additive set function.

Some elementary consequences of the Axioms

1. P(φ) = 0
Proof: S = S ∪ φ. Now S and φ are disjoint. Hence P(S) = P(S) + P(φ), so P(φ) = 0. Q.E.D.

2. If A1, A2, …, An are any n pair-wise mutually exclusive events, then
P(A1 ∪ A2 ∪ … ∪ An) = Σ (i = 1 to n) P(Ai).
Proof: By induction on n.

Def.: If A is an event, A′ is the complementary event = S − A (the shaded portion in the figure below).

[Figure: the complement of A shaded inside the sample space S.]
3. P(A′) = 1 − P(A)
Proof: S = A ∪ A′. Now P(S) = P(A) + P(A′) as A and A′ are disjoint, i.e. 1 = P(A) + P(A′). Thus P(A′) = 1 − P(A). Q.E.D.

4. Probability is a subtractive set function; i.e. if A ⊂ B, then P(B − A) = P(B) − P(A).

5. Probability is a monotone set function; i.e. A ⊂ B implies P(A) ≤ P(B).
Proof: B = A ∪ (B − A), where A and B − A are disjoint. Thus P(B) = P(A) + P(B − A) ≥ P(A).

6. If A, B are any two events, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Proof: A ∪ B = A ∪ (A′ ∩ B), where A and A′ ∩ B are disjoint.
Hence P(A ∪ B) = P(A) + P(A′ ∩ B).
But B = (A ∩ B) ∪ (A′ ∩ B), a union of two disjoint sets, so
P(B) = P(A ∩ B) + P(A′ ∩ B), or P(A′ ∩ B) = P(B) − P(A ∩ B).
∴ P(A ∪ B) = P(A) + P(B) − P(A ∩ B). Q.E.D.

[Figure: Venn diagrams of A and B, with the region A′ ∩ B shaded.]

7. If A, B, C are any three events,
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(A ∩ C) + P(A ∩ B ∩ C).
Proof:
P(A ∪ B ∪ C) = P((A ∪ B) ∪ C) = P(A ∪ B) + P(C) − P((A ∪ B) ∩ C)
= P(A) + P(B) − P(A ∩ B) + P(C) − P((A ∩ C) ∪ (B ∩ C))
= P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

More generally,

8. If A1, A2, …, An are any n events,
P(A1 ∪ A2 ∪ … ∪ An) = Σ (1 ≤ i ≤ n) P(Ai) − Σ (1 ≤ i < j ≤ n) P(Ai ∩ Aj) + Σ (1 ≤ i < j < k ≤ n) P(Ai ∩ Aj ∩ Ak) − … + (−1)^(n−1) P(A1 ∩ A2 ∩ … ∩ An).

Finite Sample Space (in which all outcomes are equally likely)

Let E be a random experiment having only a finite number of outcomes, and let all the (finite number of) outcomes be equally likely. If S = {a1, a2, …, an} (a1, a2, …, an are equally likely outcomes), then S = {a1} ∪ {a2} ∪ … ∪ {an}, a union of mutually exclusive events. Hence
P(S) = P({a1}) + P({a2}) + … + P({an}).
But P({a1}) = P({a2}) = … = P({an}) = p (say). Hence 1 = p + p + … + p (n terms), or p = 1/n.
Hence if A is a subset consisting of k of these outcomes, A = {a1, a2, …, ak}, then
P(A) = k/n = (No. of favourable outcomes) / (Total no. of outcomes).
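The 'favourable over total' rule is easy to check mechanically by enumerating a finite sample space. The short Python sketch below is not part of the original notes; the event chosen (a sum of 7 with two dice) is only an illustration.

from fractions import Fraction

# Sample space for two balanced dice: all 36 equally likely ordered pairs.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

# Event A: the sum of the two numbers on top is 7.
A = [outcome for outcome in S if sum(outcome) == 7]

# P(A) = number of favourable outcomes / total number of outcomes.
prob = Fraction(len(A), len(S))
print(prob)          # 1/6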
Example 1:
If a card is drawn from a well-shuffled pack of 52 cards, find the probability of drawing
(a) a red king. Ans: 2/52
(b) a 3, 4, 5 or 6. Ans: 16/52
(c) a black card. Ans: 1/2
(d) a red ace or a black queen. Ans: 4/52

Example 2:
When a pair of balanced dice is thrown, find the probability of getting a sum equal to
(a) 7. Ans: 6/36 = 1/6 (the total number of equally likely outcomes is 36 and the number of favourable outcomes is 6, namely (1,6), (2,5), …, (6,1))
(b) 11. Ans: 2/36
(c) 7 or 11. Ans: 8/36
(d) 2, 3 or 12. Ans: 1/36 + 2/36 + 1/36 = 4/36.

Example 3:
10 persons in a room are wearing badges marked 1 through 10. 3 persons are chosen at random, asked to leave the room simultaneously, and their badge numbers are noted. Find the probability that
(a) the smallest badge number is 5.
(b) the largest badge number is 5.
Solution:
(a) 3 persons can be chosen in 10C3 equally likely ways. If the smallest badge number is to be 5, the badge numbers should be 5 and any two of the 5 numbers 6, 7, 8, 9, 10. Now 2 numbers out of 5 can be chosen in 5C2 ways. Hence the probability that the smallest badge number is 5 is 5C2 / 10C3.
(b) Ans: 4C2 / 10C3.

Example 4:
A lot consists of 10 good articles, 4 articles with minor defects and 2 with major defects. Two articles are chosen at random. Find the probability that
(a) both are good. Ans: 10C2 / 16C2
(b) both have major defects. Ans: 2C2 / 16C2
(c) at least one is good. Ans: 1 − P(none is good) = 1 − 6C2 / 16C2
(d) exactly one is good. Ans: (10C1 × 6C1) / 16C2
(e) at most one is good. Ans: P(none is good) + P(exactly one is good) = 6C2 / 16C2 + (10C1 × 6C1) / 16C2
(f) neither has major defects. Ans: 14C2 / 16C2
(g) neither is good. Ans: 6C2 / 16C2
Example 5:
From 6 positive and 8 negative integers, 4 integers are chosen at random and multiplied. Find the probability that their product is positive.

Solution:
The product is positive if all 4 integers are positive, or all of them are negative, or two of them are positive and the other two negative. Hence the probability is
6C4 / 14C4 + 8C4 / 14C4 + (6C2 × 8C2) / 14C4.

Example 6:
If A, B are mutually exclusive events and if P(A) = 0.29, P(B) = 0.43, then
(a) P(A′) = 1 − 0.29 = 0.71
(b) P(A ∪ B) = 0.29 + 0.43 = 0.72
(c) P(A ∩ B′) = P(A) = 0.29 (as A is a subset of B′, since A and B are mutually exclusive)
(d) P(A′ ∩ B′) = 1 − P(A ∪ B) = 1 − 0.72 = 0.28

Example 7:
P(A) = 0.35, P(B) = 0.73, P(A ∩ B) = 0.14. Find
(a) P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 0.94
(b) P(A′ ∩ B) = P(B) − P(A ∩ B) = 0.59
(c) P(A ∩ B′) = P(A) − P(A ∩ B) = 0.21
(d) P(A′ ∪ B′) = 1 − P(A ∩ B) = 1 − 0.14 = 0.86

Example 8:
A, B, C are 3 mutually exclusive events. Is this assignment of probabilities possible?
P(A) = 0.3, P(B) = 0.4, P(C) = 0.5
Ans: P(A ∪ B ∪ C) = P(A) + P(B) + P(C) > 1. NOT POSSIBLE.

Example 9:
Three newspapers A, B, C are published in a city. A recent survey of readers indicated the following:
20% read A, 16% read B, 14% read C, 8% read A and B, 5% read A and C, 4% read B and C, 2% read all three.
Find the probability that an adult chosen at random
(a) reads none of the papers.
Ans: 1 − P(A ∪ B ∪ C) = 1 − (20 + 16 + 14 − 8 − 5 − 4 + 2)/100 = 0.65
(b) reads exactly one paper.
Ans: P(reading exactly one paper) = (9 + 6 + 7)/100 = 0.22
(c) reads at least A and B, given that he reads at least one of the papers.
Ans: P(A ∩ B | A ∪ B ∪ C) = P(A ∩ B) / P(A ∪ B ∪ C) = 8/35

[Venn diagram for A, B, C showing the percentages in the regions: 9, 6, 7 reading exactly one paper; 6, 3, 2 reading exactly two; 2 reading all three.]
CONDITIONAL PROBABILITY

Let A, B be two events. Suppose P(B) ≠ 0. The conditional probability of A occurring given that B has occurred is defined as
P(A | B) = probability of A given B = P(A ∩ B) / P(B).
Similarly, we define P(B | A) = P(A ∩ B) / P(A), if P(A) ≠ 0.
Hence we get the multiplication theorem
P(A ∩ B) = P(A) P(B | A)  (if P(A) ≠ 0)
         = P(B) P(A | B)  (if P(B) ≠ 0).

Example 10:
A bag contains 4 red balls and 6 black balls. 2 balls are chosen at random one by one without replacement. Find the probability that both are red.

Solution:
Let A be the event that the first ball drawn is red, and B the event that the second ball drawn is red. Hence the probability that both balls drawn are red is
P(A ∩ B) = P(A) × P(B | A) = (4/10) × (3/9) = 2/15.

Independent events

Definition: We say two events A, B are independent if P(A ∩ B) = P(A) P(B).
Equivalently, A and B are independent if P(B | A) = P(B) or P(A | B) = P(A).

Theorem
If A, B are independent, then
(a) A′, B are independent
(b) A, B′ are independent
(c) A′, B′ are independent.
Proof:
B = (A ∩ B) ∪ (A′ ∩ B), where the two events on the right are mutually exclusive.
So P(B) = P(A ∩ B) + P(A′ ∩ B), i.e.
P(A′ ∩ B) = P(B) − P(A ∩ B) = P(B) − P(A) P(B) = P(B) [1 − P(A)] = P(B) P(A′).
∴ A′ and B are independent. By the same reasoning, A and B′ are independent, and applying the result once more, A′ and B′ are independent.

[Figure: Venn diagram of A and B showing the regions A ∩ B and A′ ∩ B.]

Example 11:
Find the probability of getting 8 heads in a row in 8 tosses of a fair coin.

Solution:
If Ai is the event of getting a head in the i-th toss, then A1, A2, …, A8 are independent and P(Ai) = 1/2 for all i. Hence
P(getting all heads) = P(A1) P(A2) … P(A8) = (1/2)^8.

Example 12:
It is found that in manufacturing a certain article, defects of one type occur with probability 0.1 and defects of another type occur with probability 0.05. Assume independence between the two types of defects. Find the probability that an article chosen at random has exactly one type of defect given that it is defective.
Solution:
Let A be the event that the article has exactly one type of defect, and let B be the event that the article is defective. Required: P(A | B) = P(A ∩ B) / P(B).
Let D be the event that it has the type-one defect and E the event that it has the type-two defect. Then
P(B) = P(D ∪ E) = P(D) + P(E) − P(D ∩ E) = 0.1 + 0.05 − (0.1)(0.05) = 0.145.
P(A ∩ B) = P(article has exactly one type of defect) = P(D) + P(E) − 2 P(D ∩ E) = 0.1 + 0.05 − 2(0.1)(0.05) = 0.14.
∴ The required probability = 0.14 / 0.145.
[Note: If A and B are two events, the probability that exactly one of them occurs is P(A) + P(B) − 2 P(A ∩ B).]

Example 13:
An electronic system has 2 subsystems A and B. It is known that
P(A fails) = 0.2
P(B fails alone) = 0.15
P(A and B fail) = 0.15
Find (a) P(A fails | B has failed) (b) P(A fails alone).
Solution:
(a) P(A fails | B has failed) = P(A and B fail) / P(B has failed) = 0.15 / 0.30 = 1/2
(since P(B has failed) = P(B fails alone) + P(A and B fail) = 0.15 + 0.15 = 0.30).
(b) P(A fails alone) = P(A fails) − P(A and B fail) = 0.20 − 0.15 = 0.05.

Example 14:
A binary number is a number having digits 0 and 1. Suppose a binary number is made up of n digits and that the probability of forming an incorrect binary digit is p. Assume independence between errors. What is the probability of forming an incorrect binary number?
Ans: 1 − P(forming a correct number) = 1 − (1 − p)^n.

Example 15:
A question paper consists of 5 multiple choice questions, each of which has 4 choices (of which only one is correct). If a student answers all five questions randomly, find the probability that he answers all questions correctly.
Ans: (1/4)^5.

Theorem on Total Probability
Let B1, B2, …, Bn be n mutually exclusive events of which one must occur. If A is any other event, then
P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bn) = Σ (i = 1 to n) P(Bi) P(A | Bi).
(For a proof, see your text book.)

Example 16:
There are 2 urns. The first one has 4 red balls and 6 black balls. The second has 5 red balls and 4 black balls. A ball is chosen at random from the 1st urn and put in the 2nd. Now a ball is drawn at random from the 2nd urn. Find the probability that it is red.
Solution:
Let B1 be the event that the first ball drawn (from urn 1) is red and B2 the event that it is black. Let A be the event that the second ball drawn is red. By the theorem on total probability,
P(A) = P(B1) P(A | B1) + P(B2) P(A | B2)
     = (4/10) × (6/10) + (6/10) × (5/10) = 54/100 = 0.54.

Example 17:
A consulting firm rents cars from three agencies D, E, F. 20% of the cars are rented from D, 20% from E and the remaining 60% from F. If 10% of the cars rented from D, 12% of the cars rented from E and 4% of the cars rented from F have bad tires, find the probability that a car rented by the consulting firm will have bad tires.
Ans: (0.2)(0.1) + (0.2)(0.12) + (0.6)(0.04)

Example 18:
A bolt factory has three divisions B1, B2, B3 that manufacture bolts. 25% of the output is from B1, 35% from B2 and 40% from B3. 5% of the bolts manufactured by B1 are defective, 4% of the bolts manufactured by B2 are defective and 2% of the bolts manufactured by B3 are defective. Find the probability that a bolt chosen at random from the factory is defective.
Ans: (25/100)(5/100) + (35/100)(4/100) + (40/100)(2/100)
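As a quick numerical check of the theorem on total probability, the following Python sketch recomputes Example 16 (the two-urn problem); the variable names are mine, not from the notes.

from fractions import Fraction

# P(B1): first ball drawn from urn 1 is red; P(B2): it is black.
p_b1 = Fraction(4, 10)
p_b2 = Fraction(6, 10)

# Composition of urn 2 after the transfer fixes P(A | Bi),
# where A is the event that the ball drawn from urn 2 is red.
p_a_given_b1 = Fraction(6, 10)   # urn 2 now has 6 red out of 10
p_a_given_b2 = Fraction(5, 10)   # urn 2 still has 5 red out of 10

p_a = p_b1 * p_a_given_b1 + p_b2 * p_a_given_b2
print(p_a)   # 27/50 = 0.54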
BAYES' THEOREM

Let B1, B2, …, Bn be n mutually exclusive events of which one must occur. If A is any event, then
P(Bk | A) = P(A ∩ Bk) / P(A) = P(Bk) P(A | Bk) / Σ (i = 1 to n) P(Bi) P(A | Bi).

Example 19:
Miss X is fond of seeing films. The probability that she sees a film on the day before a test is 0.7. Miss X is in any case good at studies. The probability that she maxes the test is 0.3 if she sees a film on the day before the test, and the corresponding probability is 0.8 if she does not. If Miss X maxed the test, find the probability that she saw a film on the day before the test.

Solution:
Let B1 be the event that Miss X saw a film before the test and let B2 be the complementary event. Let A be the event that she maxed the test. Required:
P(B1 | A) = P(B1) P(A | B1) / [P(B1) P(A | B1) + P(B2) P(A | B2)]
          = (0.7 × 0.3) / (0.7 × 0.3 + 0.3 × 0.8).

Example 20:
At an electronics firm, it is known from past experience that the probability that a new worker who attended the company's training program meets the production quota is 0.86. The corresponding probability for a new worker who did not attend the training program is 0.35. It is also known that 80% of all new workers attend the company's training
program. Find the probability that a new worker who met the production quota attended the company's training programme.

Solution:
Let B1 be the event that a new worker attended the company's training programme, and B2 the complementary event, namely that a new worker did not attend the training programme. Let A be the event that a new worker met the production quota. Then we want
P(B1 | A) = (0.8 × 0.86) / (0.8 × 0.86 + 0.2 × 0.35).

Example 21:
A printing machine can print any one of n letters L1, L2, …, Ln. It is operated by electrical impulses, each letter being produced by a different impulse. Assume that there is a constant probability p that any impulse prints the letter it is meant to print. Also assume independence. One of the impulses is chosen at random and fed into the machine twice. Both times, the letter L1 was printed. Find the probability that the impulse chosen was meant to print the letter L1.

Solution:
Let B1 be the event that the impulse chosen was meant to print the letter L1, and let B2 be the complementary event. Let A be the event that both times the letter L1 was printed.
P(B1) = 1/n and P(A | B1) = p².
Now the probability that an impulse prints a wrong letter is (1 − p). Since there are n − 1 ways of printing a wrong letter, the probability that a wrong impulse prints L1 on any one feed is (1 − p)/(n − 1), and hence P(A | B2) = [(1 − p)/(n − 1)]². Therefore
P(B1 | A) = P(B1) P(A | B1) / [P(B1) P(A | B1) + P(B2) P(A | B2)]
          = (1/n) p² / [ (1/n) p² + ((n − 1)/n) ((1 − p)/(n − 1))² ].
This is the required probability.
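Bayes' theorem is just the multiplication rule divided by the total probability, so it is mechanical to compute. A small hedged sketch for Example 20 follows; the helper function and names are mine, not from the text.

def bayes(priors, likelihoods, k):
    """P(B_k | A) for mutually exclusive, exhaustive events B_1, ..., B_n."""
    total = sum(p * l for p, l in zip(priors, likelihoods))   # P(A), by total probability
    return priors[k] * likelihoods[k] / total

# Example 20: B1 = attended training (0.8), B2 = did not (0.2);
# A = the worker meets the production quota.
priors = [0.8, 0.2]
likelihoods = [0.86, 0.35]
print(bayes(priors, likelihoods, 0))   # probability the worker attended training, about 0.908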
Miscellaneous problems

1 (a). Suppose the digits 1, 2, 3 are written in a random order. Find the probability that at least one digit occupies its proper place.

Solution:
There are 3! = 6 ways of arranging the 3 digits:
123  213  312
132  231  321
Out of these, in 4 arrangements at least one digit occupies its proper place. Hence the probability is 4/3! = 4/6.
(Remark: An arrangement like 231, where no digit occupies its proper place, is called a derangement.)

(b) Same as (a) but with the 4 digits 1, 2, 3, 4. Ans: 15/24. (Try proving this.)

Solution:
Let A1 be the event that the 1st digit occupies its proper place, A2 the event that the 2nd digit occupies its proper place, A3 the event that the 3rd digit occupies its proper place, and A4 the event that the 4th digit occupies its proper place. Then
P(at least one digit occupies its proper place)
= P(A1 ∪ A2 ∪ A3 ∪ A4)
= P(A1) + P(A2) + P(A3) + P(A4)   (there are 4C1 terms, each with the same probability)
− P(A1 ∩ A2) − P(A1 ∩ A3) − … − P(A3 ∩ A4)   (there are 4C2 terms, each with the same probability)
+ P(A1 ∩ A2 ∩ A3) + P(A1 ∩ A2 ∩ A4) + … + P(A2 ∩ A3 ∩ A4)   (there are 4C3 terms, each with the same probability)
− P(A1 ∩ A2 ∩ A3 ∩ A4)
= 4C1 (3!/4!) − 4C2 (2!/4!) + 4C3 (1!/4!) − 4C4 (0!/4!)
= 1 − 1/2 + 1/6 − 1/24 = (24 − 12 + 4 − 1)/24 = 15/24.

(c) Same as (a) but with n digits.

Solution:
Let A1 be the event that the 1st digit occupies its proper place, A2 the event that the 2nd digit occupies its proper place, …, An the event that the nth digit occupies its proper place. Then
P(at least one digit occupies its proper place)
= P(A1 ∪ A2 ∪ … ∪ An)
= nC1 (n − 1)!/n! − nC2 (n − 2)!/n! + nC3 (n − 3)!/n! − … + (−1)^(n−1) (1/n!)
= 1 − 1/2! + 1/3! − 1/4! + … + (−1)^(n−1) (1/n!)
≈ 1 − e^(−1)  (for n large).

2. In a party there are n married couples. If each male chooses at random a female for dancing, find the probability that no man chooses his wife.
Ans: 1 − [1 − 1/2! + 1/3! − 1/4! + … + (−1)^(n−1) (1/n!)].

3. A and B play the following game. They alternately throw a pair of dice. Whoever first gets the sum of the two numbers on the top equal to seven wins the game and the game stops. Suppose A starts the game. Find the probability that
(a) A wins the game
(b) B wins the game.
Solution:
A wins the game if he gets seven in the 1st throw or in the 3rd throw or in the 5th throw or ….
Hence P(A wins) = 1/6 + (5/6)(5/6)(1/6) + (5/6)(5/6)(5/6)(5/6)(1/6) + …
= (1/6) / (1 − (5/6)²) = (1/6) × 36/(36 − 25) = 6/11.
P(B wins) = complementary probability = 5/11.

4. Birthday Problem
There are n persons in a room. Assume that nobody is born on 29th February and that any one birthday is as likely as any other birthday. Find the probability that no two persons have the same birthday.

Solution:
If n > 365, at least two will have the same birthday and hence the probability that no two have the same birthday is 0. If n ≤ 365, the desired probability is
[365 × 364 × … × (365 − n + 1)] / (365)^n.

5. A die is rolled until all the faces have appeared on top.
(a) What is the probability that exactly 6 throws are needed? Ans: 6!/6^6
(b) What is the probability that exactly n throws are needed? (n > 6)
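The birthday probability in Problem 4 is a simple falling product, so it is easy to tabulate for several n. A minimal sketch follows (the particular values of n are my own choice, not from the notes).

def p_all_distinct(n):
    """Probability that n people all have different birthdays (365 equally likely days)."""
    if n > 365:
        return 0.0
    p = 1.0
    for i in range(n):
        p *= (365 - i) / 365    # the (i+1)-th person avoids the first i birthdays
    return p

for n in (10, 23, 30, 50):
    print(n, round(p_all_distinct(n), 4))
# For n = 23 the probability of no common birthday already drops below 0.5.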
6. Polya's urn problem
An urn contains g green balls and r red balls. A ball is chosen at random and its colour is noted. Then the ball is returned to the urn and c more balls of the same colour are added. Now a ball is drawn, its colour is noted, and the ball is replaced (again with c more balls of the same colour). This process is repeated.
(a) Find the probability that the 1st ball drawn is green. Ans: g/(g + r)
(b) Find the probability that the 2nd ball drawn is green.
Ans: [g/(g + r)] × [(g + c)/(g + r + c)] + [r/(g + r)] × [g/(g + r + c)] = g/(g + r)
(c) Find the probability that the nth ball drawn is green. The surprising answer is g/(g + r).

7. There are n urns and each urn contains a white and b red balls. A ball is chosen from urn 1 and put into urn 2. Now a ball is chosen at random from urn 2 and put into urn 3, and this is continued. Finally a ball is drawn from urn n. Find the probability that it is white.

Solution:
Let pr = probability that the ball drawn from urn r is white. Then
pr = p(r−1) × (a + 1)/(a + b + 1) + (1 − p(r−1)) × a/(a + b + 1),  r = 2, 3, …, n.
This is a recurrence relation for pr. Noting that p1 = a/(a + b), we can find pn.
MATHEMATICAL EXPECTATION & DECISION MAKING

Suppose we roll a die n times. What is the average of the n numbers that appear on the top?
Suppose 1 occurs on the top n1 times, 2 occurs n2 times, 3 occurs n3 times, 4 occurs n4 times, 5 occurs n5 times and 6 occurs n6 times. Then the total of the n numbers on the top is
1 × n1 + 2 × n2 + … + 6 × n6.
∴ The average of the n numbers is
(1 × n1 + 2 × n2 + … + 6 × n6)/n = 1 × (n1/n) + 2 × (n2/n) + … + 6 × (n6/n).
Here n1, n2, …, n6 are clearly unknown. But by the relative frequency definition of probability, we may approximate n1/n by P(getting 1 on the top) = 1/6, n2/n by P(getting 2 on the top) = 1/6, and so on. So we can 'expect' the average of the n numbers to be
1 × (1/6) + 2 × (1/6) + … + 6 × (1/6) = 7/2 = 3.5.
We call this the mathematical expectation of the number on the top.

Definition
Let E be a random experiment with n outcomes a1, a2, …, an. Suppose P({a1}) = p1, P({a2}) = p2, …, P({an}) = pn. Then we define the mathematical expectation as
a1 × p1 + a2 × p2 + … + an × pn.
Problems

1. If a service club sells 4000 raffle tickets for a cash prize of $800, what is the mathematical expectation of a person who buys one of these tickets?
Solution: 800 × (1/4000) + 0 × (3999/4000) = 1/5 = $0.20.

2. A charitable organization raises funds by selling 2000 raffle tickets for a 1st prize worth $5000 and a second prize worth $100. What is the mathematical expectation of a person who buys one of the tickets?
Solution: 5000 × (1/2000) + 100 × (1/2000) + 0 × (1998/2000).

3. A game between 2 players is called fair if each player has the same mathematical expectation. If someone gives us $5 whenever we roll a 1 or a 2 with a balanced die, what must we pay him when we roll a 3, 4, 5 or 6 to make the game fair?
Solution: If we pay $x when we roll a 3, 4, 5 or 6, then for the game to be fair,
x × (4/6) = 5 × (2/6), or x = 2.5. That is, we must pay $2.50.

4. Gambler's Ruin
A and B are betting on repeated flips of a balanced coin. At the beginning, A has m dollars and B has n dollars. After each flip the loser pays the winner 1 dollar, and the game stops when one of them is ruined. Find the probability that A will win B's n dollars before he loses his m dollars.
Solution: Let p be the probability that A wins (so that 1 − p is the probability that B wins). Since the game is fair, A's mathematical expectation = B's mathematical expectation. Thus
n × p + 0 × (1 − p) = m × (1 − p) + 0 × p, or p = m/(m + n).
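The gambler's-ruin answer p = m/(m + n) can also be checked by simulating the fair coin-flipping game. A rough Monte Carlo sketch follows; the stake sizes and number of trials are my own choices, not part of the notes.

import random

def a_wins(m, n):
    """Simulate one game; return True if A wins B's n dollars before losing his m."""
    capital = m
    while 0 < capital < m + n:
        capital += 1 if random.random() < 0.5 else -1
    return capital == m + n

m, n, trials = 3, 7, 20000
wins = sum(a_wins(m, n) for _ in range(trials))
print(wins / trials, "vs theoretical", m / (m + n))   # both should be close to 0.3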
5. An importer is offered a shipment of machines for $140,000. The probabilities that he will sell them for $180,000, $170,000 or $150,000 are respectively 0.32, 0.55 and 0.13. What is his expected profit?
Solution: Expected profit = 40,000 × 0.32 + 30,000 × 0.55 + 10,000 × 0.13 = $30,600.

6. The manufacturer of a new battery additive has to decide whether to sell her product for $0.80 a can, or for $1.20 a can with a 'double your money back if not satisfied' guarantee. How does she feel about the chances that a person will ask for double his/her money back if
(a) she decides to sell the product for $0.80
(b) she decides to sell the product for $1.20
(c) she cannot make up her mind?

Solution: Let p be the probability that a person will ask for double his money back.
In the 1st case, she gets a fixed amount of $0.80 a can.
In the 2nd case, she expects to get for each can (1.20)(1 − p) + (−1.20)(p) = 1.20 − 2.40p.
(a) happens if 0.80 > 1.20 − 2.40p, i.e. p > 1/6.
(b) happens if p < 1/6.
(c) happens if p = 1/6.
7. A manufacturer buys an item for $1.20 and sells it for $4.50. The probabilities for a demand of 0, 1, 2, 3, 4, '5 or more' items are 0.05, 0.15, 0.30, 0.25, 0.15, 0.10 respectively. How many items should he stock to maximize his expected profit?

No. of items stocked   No. sold (with probability)                                  Expected profit
0                      0 (1.00)                                                     0
1                      0 (0.05), 1 (0.95)                                           2.175
2                      0 (0.05), 1 (0.15), 2 (0.80)                                 3.675
3                      0 (0.05), 1 (0.15), 2 (0.30), 3 (0.50)                       (largest)
4                      0 (0.05), 1 (0.15), 2 (0.30), 3 (0.25), 4 (0.25)             2.85
5                      0 (0.05), 1 (0.15), 2 (0.30), 3 (0.25), 4 (0.15), 5 (0.10)   0.525

Hence he must stock 3 items to maximize his expected profit.

8. A contractor has to choose between 2 jobs. The 1st job promises a profit of $240,000 with probability 0.75 and a loss of $60,000 with probability 0.25. The 2nd job promises a profit of $360,000 with probability 0.5 and a loss of $90,000 with probability 0.5.
(a) Which job should the contractor choose to maximize his expected profit?
Expected profit for job 1 = 240,000 × 3/4 − 60,000 × 1/4 = $165,000.
Expected profit for job 2 = 360,000 × 1/2 − 90,000 × 1/2 = $135,000.
Go in for job 1.
(b) Which job would the contractor probably choose if her business is in bad shape and she will go broke unless she makes a profit of $300,000 on her next job?
Ans: She takes job 2, as it is the only one that can give her a profit of at least $300,000.
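Decision problems like Problem 7 reduce to computing one expectation per alternative and picking the largest. The sketch below recomputes the expected profit for each stock level from the demand distribution and the prices in the problem statement, treating a demand of '5 or more' as a demand of exactly 5 and unsold items as worthless (both assumptions of mine); it again points to stocking 3 items.

cost, price = 1.20, 4.50
demand_probs = {0: 0.05, 1: 0.15, 2: 0.30, 3: 0.25, 4: 0.15, 5: 0.10}  # 5 stands for "5 or more"

def expected_profit(stock):
    # Revenue is price * min(demand, stock); the whole stock is paid for up front.
    revenue = sum(p * price * min(d, stock) for d, p in demand_probs.items())
    return revenue - cost * stock

for s in range(6):
    print(s, round(expected_profit(s), 3))
best = max(range(6), key=expected_profit)
print("stock", best, "items")   # 3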
RANDOM VARIABLES

Let E be a random experiment. A random variable (rv) X is a function that associates to each outcome s a unique real number X(s).

Example 1:
Let E be the random experiment of tossing a fair coin 3 times. There are 2³ = 8 outcomes TTT, HTT, THT, TTH, HHT, HTH, THH, HHH, all of which are equally likely. Let X be the random variable that 'counts' the number of heads obtained. Thus X can take only the 4 values 0, 1, 2, 3. We note that
P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, P(X = 3) = 1/8.
This is called the probability distribution of the rv X. Thus the probability distribution of a rv X is the listing of the probabilities with which X takes all its values.

Example 2:
Let E be the random experiment of rolling a pair of balanced dice. There are 36 possible equally likely outcomes, namely (1,1), (1,2), …, (6,6). Let X be the rv that gives the sum of the two numbers on the top. Hence X takes 11 values, namely 2, 3, …, 12. The probability distribution of X is
P(X = 2) = P(X = 12) = 1/36, P(X = 3) = P(X = 11) = 2/36,
P(X = 4) = P(X = 10) = 3/36, P(X = 5) = P(X = 9) = 4/36,
P(X = 6) = P(X = 8) = 5/36, P(X = 7) = 6/36 = 1/6.

Example 3:
Let E be the random experiment of rolling a die till a 6 appears on the top. Let X be the number of rolls needed to get the 'first' six. Thus X can take the values 1, 2, 3, …. Here X takes an infinite number of values, so it is not possible to list all the probabilities with which X takes its values. But we can give a formula.
P(X = x) = (5/6)^(x−1) (1/6),  x = 1, 2, ….
(Justification: X = x means the first (x − 1) rolls gave a number other than 6 and the xth roll gave the first 6. Hence
P(X = x) = (5/6) × (5/6) × … × (5/6) [(x − 1) times] × (1/6) = (5/6)^(x−1) (1/6).)

Discrete Random Variables
We say X is a discrete rv if it can take only a finite number of values (as in Examples 1 and 2 above) or a 'countably' infinite number of values (as in Example 3). On the other hand, the annual rainfall in a city, the life length of an electronic device, and the diameter of washers produced by a factory are all continuous random variables, in the sense that they can take (theoretically at least) all values in an 'interval' of the x-axis. We shall discuss continuous rvs a little later.

Probability distribution of a Discrete RV
Let X be a discrete rv with values x1, x2, …. Let f(xi) = P(X = xi), i = 1, 2, …. We say that {f(xi)}, i = 1, 2, … is the probability distribution of the rv X.

Properties of the probability distribution
(i) f(xi) ≥ 0 for all i = 1, 2, …
(ii) Σi f(xi) = 1
The first condition follows from the fact that probability is always ≥ 0. The second condition follows from the fact that the probability of the certain event is 1.
  • 33. 28 Example 4 Determine whether the following can be the probability distribution of a rv which can take only 4 values 1,2,3 and 4. (a) ( ) ( ) ( ) ( ) 26.0426.0326.0226.01 ==== ffff . No as the sum of all the “probabilities” > 1. (b) ( ) ( ) ( ) ( ) 28.0429.03,28.0215.01 ==== ffff . Yes as these are all 0≥ and add up to 1. (c) ( ) 4,3,2,1 16 1 = + = x x xf . No as the sum of all the probabilities < 1. Binomial Distribution Let E be a random experiment having only 2 outcomes, say ‘success’ and ‘failure’. Suppose that P(success) = p and so P(failure) = q (=1-p). Consider n independent repetitions of E (This means the outcome in any one repetition is not dependent upon the outcome in any other repetition). We also make the important assumption that P(success) = p remains the same for all such independent repetitions of E. Let X be the rv that ’counts’ the number of successes obtained in n such independent repetitions of E. Clearly X is a discrete rv that can take n+1 values namely 0,1,2,….n. We note that there are n 2 outcomes each of which is a ‘string’ of n letters each of which is an S or F (if n =3, it will be FFF, SFF, FSF, FFS, SSF, SFS, FSS, SSS). X = x means in any such outcome there are x successes and (n-x) failures in some order. One such will be xnx FFFFSSSS − .... . Since all the repetitions are independent prob of this outcome will be xnx qp − . Exactly the same prob would be associated with any other outcome for which X = x. But x successes can occur out of n repetitions in x n mutually exclusive ways. Hence ( ) ( )....n1,0,xqp x n xXP xnx === −
We say X has a binomial distribution with parameters n (the number of repetitions) and p (the probability of success in any one repetition). We denote P(X = x) by b(x; n, p) to show its dependence on x, n and p. The letter 'b' stands for binomial. Since the (n + 1) probabilities above are the (n + 1) terms in the expansion of the binomial (q + p)^n, X is said to have a binomial distribution. We at once see that the sum of all the binomial probabilities is (q + p)^n = 1^n = 1. The independent repetitions are usually referred to as 'Bernoulli' trials.
We note that b(x; n, p) = b(n − x; n, q).
(LHS = probability of getting x successes in n Bernoulli trials = probability of getting n − x failures in n Bernoulli trials = RHS.)

Cumulative Binomial Probabilities
Let X have a binomial distribution with parameters n and p. Then
P(X ≤ x) = P(X = 0) + P(X = 1) + … + P(X = x) = Σ (k = 0 to x) b(k; n, p)
is denoted by B(x; n, p) and is called the cumulative binomial distribution function. This is tabulated in Table 1 of your text book. We note that
b(x; n, p) = P(X = x) = P(X ≤ x) − P(X ≤ x − 1) = B(x; n, p) − B(x − 1; n, p).
Thus b(9; 12, 0.60) = B(9; 12, 0.60) − B(8; 12, 0.60) = 0.9166 − 0.7747 = 0.1419.
(You can verify this by directly calculating b(9; 12, 0.60).)
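The binomial probabilities and their cumulative sums are easy to generate directly, which is handy when the value you need is not in Table 1. A small sketch using only the Python standard library (the function names are mine):

from math import comb

def b(x, n, p):
    """Binomial probability b(x; n, p) = nCx p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def B(x, n, p):
    """Cumulative binomial probability B(x; n, p) = P(X <= x)."""
    return sum(b(k, n, p) for k in range(x + 1))

# Reproduces the worked value b(9; 12, 0.60) = B(9) - B(8), approximately 0.1419.
print(round(b(9, 12, 0.60), 4))
print(round(B(9, 12, 0.60) - B(8, 12, 0.60), 4))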
Example 5 (Exercise 4.15 of your book):
During one stage in the manufacture of integrated circuit chips, a coating must be applied. If 70% of the chips receive a thick enough coating, find the probability that among 15 chips
(a) at least 12 will have thick enough coatings.
(b) at most 6 will have thick enough coatings.
(c) exactly 10 will have thick enough coatings.

Solution:
Among the 15 chips, let X be the number of chips that have thick enough coatings. Then X is a rv having a binomial distribution with parameters n = 15 and p = 0.70.
(a) P(X ≥ 12) = 1 − P(X ≤ 11) = 1 − B(11; 15, 0.70) = 1 − 0.7031 = 0.2969
(b) P(X ≤ 6) = B(6; 15, 0.70) = 0.0152
(c) P(X = 10) = B(10; 15, 0.70) − B(9; 15, 0.70) = 0.4845 − 0.2784 = 0.2061

Example 6 (Exercise 4.19 of your text book):
A food processor claims that at most 10% of her jars of instant coffee contain less coffee than printed on the label. To test this claim, 16 jars are randomly selected and the contents weighed. Her claim is accepted if fewer than 3 of the 16 jars contain less coffee (note that 10% of 16 = 1.6, which rounds to 2). Find the probability that the food processor's claim will be accepted if the actual percent of the jars containing less coffee is
(a) 5%  (b) 10%  (c) 15%  (d) 20%.

Solution:
Let X be the number of jars (among the 16 randomly chosen) that contain less coffee than printed on the label. Thus X is a random variable having a binomial distribution
with parameters n = 16 and p (the probability of 'success', i.e. the probability that a jar chosen at random has less coffee).
(a) Here p = 5% = 0.05. Hence P(claim is accepted) = P(X ≤ 2) = B(2; 16, 0.05) = 0.9571.
(b) Here p = 10% = 0.10. Hence P(claim is accepted) = B(2; 16, 0.10) = 0.7892.
(c) Here p = 15% = 0.15. Hence P(claim is accepted) = B(2; 16, 0.15) = 0.5614.
(d) Here p = 20% = 0.20. Hence P(claim is accepted) = B(2; 16, 0.20) = 0.3518.

Binomial Distribution – Sampling with replacement
Suppose there is an urn containing 10 marbles of which 4 are white and the rest are black. Suppose 5 marbles are chosen with replacement. Let X be the rv that counts the number of white marbles drawn. Thus X = 0, 1, 2, 3, 4 or 5. (Remember that we replace each marble in the urn before drawing the next one; hence we can draw 5 white marbles.)
P('success') = P(drawing a white marble in any one of the 5 draws) = 4/10 (remember we draw with replacement). Thus X has a binomial distribution with parameters n = 5 and p = 4/10. Hence
P(X = x) = b(x; 5, 4/10).

Mode of a Binomial distribution
We say x0 is the mode of the binomial distribution with parameters n and p if P(X = x0) is the greatest. From the binomial tables given in the book we can easily see that
when n = 10, p = 1/2, P(X = 5) is the greatest, i.e. 5 is the mode.

Fact:
b(x + 1; n, p) / b(x; n, p) = [(n − x)/(x + 1)] × p/(1 − p), which is
> 1 if x < np − (1 − p),
= 1 if x = np − (1 − p),
< 1 if x > np − (1 − p).
Thus so long as x < np − (1 − p) the binomial probabilities increase, and if x > np − (1 − p) they decrease. Hence if np − (1 − p) = x0 is an integer, then the modes are x0 and x0 + 1. If np − (1 − p) is not an integer and x0 = the smallest integer ≥ np − (1 − p), the mode is x0.

Hypergeometric Distribution (Sampling without replacement)
An urn contains 10 marbles of which 4 are white. 5 marbles are chosen at random without replacement. Let X be the rv that counts the number of white marbles drawn. Thus X can take the 5 values 0, 1, 2, 3, 4. What is P(X = x)?
Now out of 10 marbles, 5 can be chosen in 10C5 equally likely ways, out of which there are 4Cx × 6C(5−x) ways of drawing x white marbles (and so 5 − x of the others). (Reason: out of the 4 white marbles, x can be chosen in 4Cx ways, and out of the other 6 marbles, 5 − x can be chosen in 6C(5−x) ways.) Hence
P(X = x) = [4Cx × 6C(5−x)] / 10C5,  x = 0, 1, 2, 3, 4.

We generalize the above result. A box contains N marbles out of which a are white. n marbles are chosen without replacement. Let X be the random variable that counts the number of white marbles drawn. X can take the values 0, 1, 2, …, n.
P(X = x) = [aCx × (N − a)C(n − x)] / NCn,  x = 0, 1, 2, ….
(Note: x must be less than or equal to a, and n − x must be less than or equal to N − a.)
We say the rv X has a hypergeometric distribution with parameters n, a and N. We denote P(X = x) by h(x; n, a, N).

Example 7 (Exercise 4.22 of your text book):
Among the 12 solar collectors on display, 9 are flat plate collectors and the other three are concentrating collectors. If a person chooses 4 collectors at random, find the probability that 3 are flat plate ones.
Ans: h(3; 4, 9, 12) = [9C3 × 3C1] / 12C4

Example 8 (Exercise 4.24 of your text book):
If 6 of 18 new buildings in a city violate the building code, what is the probability that a building inspector, who randomly selects 4 of the new buildings for inspection, will catch
(a) none of the new buildings that violate the building code?
Ans: h(0; 4, 6, 18) = 12C4 / 18C4
(b) one of the new buildings that violate the building code?
Ans: h(1; 4, 6, 18) = [6C1 × 12C3] / 18C4
(c) two of the new buildings that violate the building code?
Ans: h(2; 4, 6, 18) = [6C2 × 12C2] / 18C4
(d) at least three of the new buildings that violate the building code?
Ans: h(3; 4, 6, 18) + h(4; 4, 6, 18)
(Note: We choose 4 buildings out of 18 without replacement. Hence the hypergeometric distribution is appropriate.)

Binomial distribution as an approximation to the Hypergeometric Distribution
We can show that h(x; n, a, N) → b(x; n, p) as N → ∞ (where p = a/N = probability of a 'success'). Hence if N is large, the hypergeometric probability h(x; n, a, N) can be approximated by the binomial probability b(x; n, p), where p = a/N.

Example 9 (Exercise 4.26 of your text):
A shipment of 120 burglar alarms contains 5 that are defective. If 3 of these alarms are randomly selected and shipped to a customer, find the probability that the customer will get one defective alarm
(a) by using the hypergeometric distribution;
(b) by approximating the hypergeometric probability by a binomial probability.
Solution:
Here N = 120 (large!), a = 5, n = 3, x = 1.
(a) Required probability = h(1; 3, 5, 120) = [5C1 × 115C2] / 120C3 = (5 × 6555)/280840 = 0.1167.
(b) h(1; 3, 5, 120) ≈ b(1; 3, 5/120) = 3C1 × (5/120) × (1 − 5/120)² = 0.1148.

Example 10 (Exercise 4.27 of your text):
Among the 300 employees of a company, 240 are union members, while the others are not. If 8 of the employees are chosen by lot to serve on the committee which administers the provident fund, find the probability that 5 of them will be union members while the others are not,
(a) using the hypergeometric distribution;
(b) using the binomial approximation.

Solution:
Here N = 300, a = 240, n = 8, x = 5.
(a) h(5; 8, 240, 300) = [240C5 × 60C3] / 300C8
(b) h(5; 8, 240, 300) ≈ b(5; 8, 240/300)
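The burglar-alarm example is a convenient place to see how good the binomial approximation to the hypergeometric distribution is. A short sketch (the function names are mine, not from the notes):

from math import comb

def h(x, n, a, N):
    """Hypergeometric probability h(x; n, a, N)."""
    return comb(a, x) * comb(N - a, n - x) / comb(N, n)

def b(x, n, p):
    """Binomial probability b(x; n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example 9: N = 120 alarms, a = 5 defective, n = 3 selected, x = 1 defective received.
print(round(h(1, 3, 5, 120), 4))       # 0.1167  (exact hypergeometric value)
print(round(b(1, 3, 5 / 120), 4))      # 0.1148  (binomial approximation)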
THE MEAN AND VARIANCE OF PROBABILITY DISTRIBUTIONS

We know that the equation of a line can be written as y = mx + c. Here m is the slope and c is the y-intercept. Different m, c give different lines; thus m and c characterize a line. Similarly we define certain numbers that characterize a probability distribution.
The mean of a probability distribution is simply the mathematical expectation of the corresponding rv. If a rv X takes on the values x1, x2, … with probabilities f(x1), f(x2), …, its mathematical expectation or expected value is
x1 f(x1) + x2 f(x2) + … = Σ xi P(X = xi) = Σ (value × probability).
We use the symbol µ to denote the mean of X. Thus
µ = E(X) = Σ xi P(X = xi)  (summation over all xi in the range of X).

Example 11:
Suppose X is a rv having the probability distribution
X      1    2    3
Prob  1/2  1/3  1/6
Hence the mean µ of the probability distribution (of X) is
µ = 1 × 1/2 + 2 × 1/3 + 3 × 1/6 = 5/3.

Example 12:
Let X be the rv having the distribution
X      0  1
Prob   q  p
where q = 1 − p. Thus µ = 0 × q + 1 × p = p.

The mean of a Binomial Distribution
Suppose X is a rv having a binomial distribution with parameters n and p. Then
mean of X = µ = np.
(Read the proof on pages 107-108 of your text book.)

The mean of a Hypergeometric Distribution
If X is a rv having a hypergeometric distribution with parameters n, a and N, then µ = n (a/N).

Digression
The mean of a rv X gives the 'average' of the values taken by the rv X. Thus 'the average marks in a test is 40' means the students would have got marks less than 40 and greater than 40, but the average works out to be 40. However, the mean gives no idea of the spread (deviation from the mean) of the marks. This spread is measured by the variance: informally speaking, the average of the squares of the deviations from the mean.

The variance of the probability distribution of X is defined as the expected value of (X − µ)²:
variance of X = σ² = Σ (xi − µ)² P(X = xi)  (summation over all xi in the range of X).
Note that the RHS is always ≥ 0 (as it is a sum of non-negative numbers). The positive square root σ of σ² is called the standard deviation of X and has the same units as X and µ.
Example 13:
For the rv X having the probability distribution given in Example 11, the variance is
σ² = (1 − 5/3)² × 1/2 + (2 − 5/3)² × 1/3 + (3 − 5/3)² × 1/6
   = (4/9) × 1/2 + (1/9) × 1/3 + (16/9) × 1/6 = 5/9.
We could also have used the equivalent formula σ² = E(X²) − µ².
Here E(X²) = 1² × 1/2 + 2² × 1/3 + 3² × 1/6 = 1/2 + 4/3 + 9/6 = 10/3.
∴ σ² = E(X²) − µ² = 10/3 − 25/9 = 5/9.

Example 14:
For the probability distribution of Example 12,
E(X²) = 0² × q + 1² × p = p.
∴ σ² = p − p² = p(1 − p) = pq.

Variance of the Binomial Distribution: σ² = npq.

Variance of the Hypergeometric Distribution:
σ² = n (a/N) (1 − a/N) (N − n)/(N − 1).
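For a small discrete distribution, the mean and variance can be checked directly from their definitions. A minimal sketch that redoes Examples 11 and 13 (not part of the original notes):

from fractions import Fraction as F

values = [1, 2, 3]
probs  = [F(1, 2), F(1, 3), F(1, 6)]

mean = sum(x * p for x, p in zip(values, probs))
var  = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
var_alt = sum(x * x * p for x, p in zip(values, probs)) - mean ** 2   # E(X^2) - mu^2

print(mean, var, var_alt)   # 5/3, 5/9, 5/9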
CHEBYSHEV'S THEOREM

Suppose X is a rv with mean µ and variance σ². Chebyshev's theorem states that if k is a constant > 0,
P(|X − µ| ≥ kσ) ≤ 1/k².
In words, the probability of getting a value which deviates from the mean µ by at least kσ is at most 1/k².
Note: Chebyshev's theorem gives us an upper bound on the probability of an event. Mostly it is of theoretical interest.

Example 15 (Exercise 4.44 of your text):
In one out of 6 cases, material for bullet proof vests fails to meet puncture standards. If 405 specimens are tested, what does Chebyshev's theorem tell us about the probability of getting at most 30 or at least 105 cases that do not meet puncture standards?
Here µ = np = 405 × 1/6 = 135/2 = 67.5 and σ² = npq = 405 × (1/6) × (5/6), so σ = 15/2 = 7.5.
Let X = the number of cases out of 405 that do not meet puncture standards.
Required: P(X ≤ 30 or X ≥ 105).
Now X ≤ 30 implies X − µ ≤ −75/2, and X ≥ 105 implies X − µ ≥ 75/2.
Thus X ≤ 30 or X ≥ 105 implies |X − µ| ≥ 75/2 = 5σ.
∴ P(X ≤ 30 or X ≥ 105) = P(|X − µ| ≥ 5σ) ≤ 1/5² = 1/25 = 0.04.

Example 16 (Exercise 4.46 of your text):
How many times do we have to flip a balanced coin to be able to assert with a probability of at most 0.01 that the difference between the proportion of tails and 0.50 will be at least 0.04?

Solution:
Suppose we flip the coin n times and suppose X is the number of tails obtained. Thus the proportion of tails = X/n (number of tails / total number of flips). We must find n so that
P(|X/n − 0.50| ≥ 0.04) ≤ 0.01.
Now X, the number of tails among n flips of a balanced coin, is a rv having a binomial distribution with parameters n and 0.5. Hence
µ = E(X) = np = 0.50 n and σ = √(npq) = 0.50 √n  (as p = q = 0.50).
Now |X/n − 0.50| ≥ 0.04 is equivalent to |X − 0.50 n| ≥ 0.04 n.
We know P(|X − µ| ≥ kσ) ≤ 1/k². Here kσ = 0.04 n, so
k = 0.04 n / (0.50 √n) = 0.08 √n.
∴ P(|X/n − 0.50| ≥ 0.04) = P(|X − µ| ≥ kσ) ≤ 1/k².
This is ≤ 0.01 if 1/k² ≤ 0.01, i.e. if k² = (0.08 √n)² ≥ 100, i.e. if n ≥ 100/(0.08)² = 15625.

Law of Large Numbers
Suppose a factory manufactures items and suppose there is a constant probability p that an item is defective. Suppose we choose n items at random and let X be the number of defectives found. Then X is a rv having a binomial distribution with parameters n and p.
∴ mean µ = E(X) = np and variance σ² = npq.
Let ε be any number > 0. Now
P(|X/n − p| ≥ ε) = P(|X − np| ≥ nε) = P(|X − µ| ≥ kσ)  (where kσ = nε)
≤ 1/k²  (by Chebyshev's theorem)
= σ²/(n²ε²) = npq/(n²ε²) = pq/(nε²) → 0 as n → ∞.
Thus the probability that the proportion of defective items differs from the actual probability p by at least any positive number ε tends to 0 as n → ∞. (This is called the Law of Large Numbers.) This means that 'most of the time' the proportion of defectives will be close to the actual (unknown) probability p that an item is defective, for large n. So we can estimate p by X/n, the (sample) proportion of defectives.
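The law of large numbers can be 'seen' by simulating a defect rate and watching the sample proportion settle near p as n grows. A rough simulation sketch (the defect probability, seed and sample sizes are my own choices):

import random

p = 0.1                      # probability, unknown in practice, that an item is defective
random.seed(1)

for n in (100, 1000, 10000, 100000):
    defectives = sum(random.random() < p for _ in range(n))
    print(n, defectives / n)   # the proportion X/n approaches p as n grows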
POISSON DISTRIBUTION

A random variable X is said to have a Poisson distribution with parameter λ > 0 if its probability distribution is given by
P(X = x) = f(x; λ) = e^(−λ) λ^x / x!,  x = 0, 1, 2, ….
We can easily show: mean of X = µ = λ and variance of X = σ² = λ. Also, P(X = x) is largest when x = λ − 1 and x = λ if λ is an integer, and when x = [λ] = the greatest integer ≤ λ (when λ is not an integer). Also note that P(X = x) → 0 as x → ∞.

POISSON APPROXIMATION TO BINOMIAL DISTRIBUTION
Suppose X is a rv having a binomial distribution with parameters n and p. We can easily show that
b(x; n, p) = P(X = x) → f(x; λ) as n → ∞
in such a way that np remains a constant λ. Hence for n large and p small, the binomial probability b(x; n, p) can be approximated by the Poisson probability f(x; λ), where λ = np.

Example 17:
b(3; 100, 0.03) ≈ f(3; 3) = e^(−3) 3³ / 3!

Example 18 (Exercise 4.54 of your text):
If 0.8% of the fuses delivered to an arsenal are defective, use the Poisson approximation to determine the probability that 4 fuses will be defective in a random sample of 400.

Solution:
If X is the number of defectives in a sample of 400, X has the binomial distribution with parameters n = 400 and p = 0.8% = 0.008.
Thus P(4 out of 400 are defective) = b(4; 400, 0.008) ≈ f(4; λ), where λ = 400 × 0.008 = 3.2.
f(4; 3.2) = e^(−3.2) (3.2)^4 / 4! = F(4; 3.2) − F(3; 3.2) = 0.781 − 0.603 = 0.178 (from Table 2 at the end of the text).

Cumulative Poisson Distribution Function
If X is a rv having a Poisson distribution with parameter λ, the cumulative Poisson probability is
F(x; λ) = P(X ≤ x) = Σ (k = 0 to x) P(X = k) = Σ (k = 0 to x) f(k; λ).
For various x and λ, F(x; λ) has been tabulated in Table 2 (of your text book, pages 581 to 585). We use Table 2 as follows:
f(x; λ) = P(X = x) = P(X ≤ x) − P(X ≤ x − 1) = F(x; λ) − F(x − 1; λ).
Thus f(4; 3.2) = F(4; 3.2) − F(3; 3.2) = 0.781 − 0.603 = 0.178.

Poisson Process
There are many situations in which events occur randomly over intervals of time. For example, in a time period t, let Xt be the number of accidents at a busy road junction in New Delhi; or let Xt be the number of calls received at a telephone exchange; or let Xt be the number of radioactive particles emitted by a radioactive source. In all such examples Xt is a discrete rv which can take the non-negative integral values 0, 1, 2, …. The important thing to note is that all such random variables have the 'same' distribution except that the parameter(s) depend on the time t. The collection of random variables {Xt : t > 0} is said to constitute a random process. If each Xt has a Poisson distribution, we say {Xt} is a Poisson process. Now we show that the rvs Xt which count the number of occurrences of a random phenomenon in a time
period t constitute a Poisson process under suitable assumptions. Suppose that in a time period t a random phenomenon, which we call a 'success', can occur. We let Xt = the number of successes in the time period t. We assume:
1. In a small time period Δt, either no success or one success occurs.
2. The probability of a success in a small time period Δt is proportional to Δt, i.e. P(X_Δt = 1) = α Δt (α is the constant of proportionality).
3. The probability of a success during any time period does not depend on what happened prior to that period.
Divide the time period t into n small time periods, each of length Δt. By the assumptions above, Xt = the number of successes in the time period t is a rv having a binomial distribution with parameters n and p = α Δt. Hence
P(Xt = x) = b(x; n, α Δt) → f(x; λ) as n → ∞, where λ = α n Δt = α t.
So we can say that Xt = the number of successes in the time period t is a rv having a Poisson distribution with parameter α t.

Meaning of the proportionality constant α
Since the mean of Xt is λ = α t, we find that α = the mean number of successes in unit time.
(Note: For a more rigorous derivation of the distribution of Xt, you may see Meyer, Introductory Probability and Statistical Applications, pages 165-169.)

Example 19 (Exercise 4.56 of your text):
Given that the switchboard of a consultant's office receives on the average 0.6 calls per minute, find the probability that
(a) in a given minute there will be at least one call;
(b) in a 4-minute interval there will be at least 3 calls.
Solution:
Xt = the number of calls in a t-minute interval is a rv having a Poisson distribution with parameter αt = 0.6 t.
(a) P(X1 ≥ 1) = 1 − P(X1 = 0) = 1 − e^(−0.6) = 1 − 0.549 = 0.451.
(b) Here λ = 0.6 × 4 = 2.4. P(X4 ≥ 3) = 1 − P(X4 ≤ 2) = 1 − F(2; 2.4) = 1 − 0.570 = 0.430.

Example 20:
Suppose that Xt, the number of particles emitted in t hours from a radioactive source, has a Poisson distribution with parameter 20t. What is the probability that exactly 5 particles are emitted during a 15-minute period?

Solution:
15 minutes = 1/4 hour. Hence, with X(1/4) = the number of particles emitted in 1/4 hour,
P(X(1/4) = 5) = e^(−20 × 1/4) (20 × 1/4)^5 / 5! = e^(−5) 5^5 / 5!
= F(5; 5) − F(4; 5) = 0.616 − 0.440 = 0.176 (from Table 2).
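Poisson probabilities, their cumulative sums and the Poisson-process calculations above are one-liners in code, which is useful when Table 2 does not list the λ you need. A small sketch (the function names are mine):

from math import exp, factorial

def f(x, lam):
    """Poisson probability f(x; lambda)."""
    return exp(-lam) * lam**x / factorial(x)

def F(x, lam):
    """Cumulative Poisson probability F(x; lambda) = P(X <= x)."""
    return sum(f(k, lam) for k in range(x + 1))

print(round(f(4, 3.2), 3))        # Example 18: about 0.178
print(round(1 - F(2, 2.4), 3))    # Example 19(b): about 0.430
print(round(f(5, 5.0), 3))        # Example 20: about 0.175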
THE GEOMETRIC DISTRIBUTION

Suppose there is a random experiment having only two possible outcomes, called 'success' and 'failure'. Assume that the probability of a success in any one 'trial' (repetition of the experiment) is p and remains the same for all trials. Also assume that the trials are independent. The experiment is repeated till a success is obtained. Let X be the rv that counts the number of trials needed to get the 1st success. Clearly X = x if the first (x − 1) trials were failures and the xth trial gave the first success. Hence
P(X = x) = g(x; p) = (1 − p)^(x−1) p = q^(x−1) p,  x = 1, 2, ….
We say X has a geometric distribution with parameter p (as the respective probabilities form a geometric progression with common ratio q). We can show that the mean of this distribution is µ = 1/p and the variance is σ² = q/p².
(For example, suppose a die is rolled till a 6 is obtained. It is reasonable to expect that on an average we will need 1/(1/6) = 6 rolls, as there are 6 numbers!)

Example 21 (Exercise 4.60 of your text):
An expert hits a target 95% of the time. What is the probability that the expert will miss the target for the first time on the fifteenth shot?

Solution:
Here 'success' means the expert misses the target. Hence p = P(success) = 5% = 0.05. If X is the rv that counts the number of shots needed to get 'a success', we want
P(X = 15) = q^14 p = (0.95)^14 × 0.05.
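The geometric probabilities and the mean 1/p are simple to verify numerically. A brief sketch that recomputes Example 21 and checks the mean by simulation (the seed and number of repetitions are my own choices):

import random

p = 0.05                      # probability of a 'success' (the expert misses)
print((1 - p) ** 14 * p)      # Example 21: P(first miss occurs on the 15th shot)

random.seed(2)
def trials_to_first_success(p):
    n = 1
    while random.random() >= p:   # keep going until a success occurs
        n += 1
    return n

sims = [trials_to_first_success(p) for _ in range(20000)]
print(sum(sims) / len(sims), "vs theoretical mean", 1 / p)   # both near 20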
Example 22:
The probability of a successful rocket launching is 0.8. Launching attempts are made till a successful launching has occurred. Find the probability that exactly 6 attempts will be necessary.
Solution: (0.2)^5 × 0.8.

Example 23:
X has a geometric distribution with parameter p. Show that
(a) P(X ≥ r) = q^(r−1),  r = 1, 2, …
(b) P(X ≥ s + t | X > s) = P(X ≥ t).

Solution:
(a) P(X ≥ r) = Σ (x = r to ∞) p q^(x−1) = p q^(r−1)/(1 − q) = q^(r−1).
(b) P(X ≥ s + t | X > s) = P(X ≥ s + t)/P(X > s) = q^(s+t−1)/q^s = q^(t−1) = P(X ≥ t).

Application to Queuing Systems

[Diagram: customers arrive in a Poisson fashion at a service facility S and depart after service.]

There is a service facility. Customers arrive in a random fashion and get service if the server is idle. Otherwise they stand in a queue and wait to get service.

Examples of queuing systems:
1. Cars arriving at a petrol pump to get petrol.
2. Men arriving at a barber's shop to get a hair cut.
3. Ships arriving at a port to deliver goods.
Questions that one can ask are:
1. At any point of time, on an average how many customers are in the system (getting service and waiting to get service)?
2. What is the mean time a customer waits in the system?
3. What proportion of the time is the server idle? And so on.

We shall consider only the simplest queuing system, in which there is only one server. We assume that the population of customers is infinite and that there is no limit on the number of customers that can wait in the queue. We also assume that the customers arrive in a 'Poisson fashion' at the mean rate of α. This means that Xt, the number of customers that arrive in a time period t, is a rv having a Poisson distribution with parameter αt. We also assume that, so long as the service station is not empty, customers depart in a Poisson fashion at a mean rate of β. This means that, when there is at least one customer, Yt, the number of customers that depart (after getting service) in a time period t, is a rv having a Poisson distribution with parameter βt (where β > α). A further assumption is: in a small time interval Δt there will be a single arrival or a single departure but not both. (Note that by the assumptions of a Poisson process, in a small time interval Δt there can be at most one arrival and at most one departure.)

Let Nt be the number of customers in the system at time t, and let pn(t) = P(Nt = n). We make another assumption: pn(t) → πn as t → ∞. {πn} is known as the steady state probability distribution of the number of customers in the system. It can be shown that
π0 = 1 − α/β,  πn = (α/β)^n (1 − α/β),  n = 0, 1, 2, ….
Thus L = mean number of customers in the system (getting service and waiting to get service)
   = Σ_{n=0 to ∞} n π_n = α / (β - α)

Lq = mean no. of customers in the queue (waiting to get service)

   = Σ_{n=1 to ∞} (n - 1) π_n = α² / (β(β - α)) = L - α/β

W = mean time a customer spends in the system = L/α = 1 / (β - α)

Wq = mean time a customer spends in the queue = Lq/α = α / (β(β - α)) = W - 1/β.

(For a derivation of these results, see Operations Research Vol. 3 by Dr. S. Venkateswaran and Dr. B. Singh, EDD Notes of BITS, Pilani.)

Example 24 (Exercise 4.64 of your text)
Trucks arrive at a receiving dock in a Poisson fashion at a mean rate of 2 per hour. The trucks can be unloaded at a mean rate of 3 per hour in a Poisson fashion (so long as the receiving dock is not empty).
(a) What is the average number of trucks being unloaded and waiting to get unloaded?
(b) What is the mean no. of trucks in the queue?
(c) What is the mean time a truck spends waiting in the queue?
(d) What is the prob that there are no trucks waiting to be unloaded?
(e) What is the prob that an arriving truck need not wait to get unloaded?
Solution
Here α = arrival rate = 2 per hour and β = departure rate = 3 per hour. Thus

(a) L = α / (β - α) = 2 / (3 - 2) = 2

(b) Lq = α² / (β(β - α)) = 4 / (3 × 1) = 4/3

(c) Wq = α / (β(β - α)) = 2/3 hr

(d) P(no trucks are waiting to be unloaded) = P(no. of trucks in the dock is 0 or 1)
      = π_0 + π_1 = (1 - α/β) + (α/β)(1 - α/β) = (1 - 2/3) + (2/3)(1 - 2/3) = 1/3 + 2/9 = 5/9

(e) P(an arriving truck need not wait) = P(dock is empty) = π_0 = 1/3

Example 25
With reference to Example 24, suppose that the cost of keeping a truck in the system is Rs. 15/hour. If it were possible to increase the mean unloading rate to 3.5 trucks per hour at a cost of Rs. 12 per hour, would this be worthwhile?
Solution
In the old scheme, α = 2, β = 3, L = 2.
∴ Mean cost per hour to the dock = 2 × 15 = Rs. 30/hr.

In the new scheme, α = 2, β = 3.5, L = 2 / (3.5 - 2) = 4/3 (verify!).
∴ Net cost per hour to the dock = 15 × 4/3 + 12 = Rs. 32/hr.

Hence it is not worthwhile to go in for the new scheme.
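The formulas for L, Lq, W, Wq and π_0 can be bundled into one small routine. The sketch below (illustrative Python; the function name mm1_metrics is ours, not from the notes) recomputes Examples 24 and 25:

    def mm1_metrics(alpha, beta):
        """Single-server queue with Poisson arrivals (rate alpha) and
        Poisson service (rate beta), assuming beta > alpha."""
        L   = alpha / (beta - alpha)                # mean number in system
        Lq  = alpha**2 / (beta * (beta - alpha))    # mean number in queue
        W   = 1 / (beta - alpha)                    # mean time in system
        Wq  = alpha / (beta * (beta - alpha))       # mean wait in queue
        pi0 = 1 - alpha / beta                      # P(system is empty)
        return L, Lq, W, Wq, pi0

    # Examples 24 and 25: trucks arrive at 2/hr, unloaded at 3/hr (or 3.5/hr)
    L_old, Lq_old, W_old, Wq_old, pi0_old = mm1_metrics(2, 3)
    L_new, *_ = mm1_metrics(2, 3.5)
    print(L_old, Lq_old, Wq_old, pi0_old)   # 2, 1.333..., 0.666... hr, 1/3
    print(15 * L_old)                       # old cost: 15 * 2   = 30 per hour
    print(15 * L_new + 12)                  # new cost: 15 * 4/3 + 12 = 32 per hour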
MULTINOMIAL DISTRIBUTION

Consider a random experiment E and suppose it has k possible outcomes A_1, A_2, ..., A_k. Suppose P(A_i) = p_i for all i and that p_i remains the same for all independent repetitions of E. Consider n independent repetitions of E. Suppose A_1 occurs X_1 times, A_2 occurs X_2 times, ..., A_k occurs X_k times. Then

   P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) = [n! / (x_1! x_2! ... x_k!)] p_1^x_1 p_2^x_2 ... p_k^x_k

for all non-negative integers x_1, x_2, ..., x_k with x_1 + x_2 + ... + x_k = n.

Proof. The probability of getting A_1 x_1 times, A_2 x_2 times, ..., A_k x_k times in any one particular order is p_1^x_1 p_2^x_2 ... p_k^x_k, as all the repetitions are independent. Now among the n repetitions, A_1 can occur x_1 times in C(n, x_1) = n! / (x_1!(n - x_1)!) ways. From the remaining n - x_1 repetitions, A_2 can occur x_2 times in C(n - x_1, x_2) = (n - x_1)! / (x_2!(n - x_1 - x_2)!) ways, and so on. Hence the total number of ways of getting A_1 x_1 times, A_2 x_2 times, ..., A_k x_k times is

   [n! / (x_1!(n - x_1)!)] × [(n - x_1)! / (x_2!(n - x_1 - x_2)!)] × ... × [(n - x_1 - ... - x_{k-1})! / (x_k! 0!)]
   = n! / (x_1! x_2! ... x_k!)

(as x_1 + x_2 + ... + x_k = n and 0! = 1). Hence

   P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) = [n! / (x_1! x_2! ... x_k!)] p_1^x_1 p_2^x_2 ... p_k^x_k.
Example 26
A die is rolled 30 times. Find the probability of getting 1 two times, 2 three times, 3 four times, 4 six times, 5 seven times and 6 eight times.

Ans: [30! / (2! 3! 4! 6! 7! 8!)] (1/6)^2 (1/6)^3 (1/6)^4 (1/6)^6 (1/6)^7 (1/6)^8

Example 27 (See Exercise 4.72 of your text)
The probabilities are, respectively, 0.40, 0.40, and 0.20 that in city driving a certain type of imported car will average less than 10 kms per litre, anywhere between 10 and 15 kms per litre, or more than 15 kms per litre. Find the probability that among 12 such cars tested, 4 will average less than 10 kms per litre, 6 will average anywhere from 10 to 15 kms per litre and 2 will average more than 15 kms per litre.

Solution
   [12! / (4! 6! 2!)] (0.40)^4 (0.40)^6 (0.20)^2

Remarks
1. Note that the different probabilities are the various terms in the expansion of the multinomial (p_1 + p_2 + ... + p_k)^n. Hence the name multinomial distribution.
2. The binomial distribution is a special case got by taking k = 2.
3. For any fixed i (1 ≤ i ≤ k), X_i (the number of times A_i occurs) is a random variable having a binomial distribution with parameters n and p_i. Thus E(X_i) = n p_i and V(X_i) = n p_i (1 - p_i), i = 1, 2, ..., k.
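As a check on Example 27, the multinomial probability can be evaluated directly (an illustrative Python snippet, not part of the notes):

    from math import factorial

    def multinomial_pmf(counts, probs):
        # n! / (x1! x2! ... xk!) * p1^x1 * p2^x2 * ... * pk^xk
        n = sum(counts)
        coeff = factorial(n)
        for x in counts:
            coeff //= factorial(x)
        prob = 1.0
        for x, p in zip(counts, probs):
            prob *= p ** x
        return coeff * prob

    # Example 27: 12 cars split 4 / 6 / 2 with probabilities 0.40 / 0.40 / 0.20
    print(multinomial_pmf([4, 6, 2], [0.40, 0.40, 0.20]))   # ~0.0581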
SIMULATION

Nowadays simulation techniques are being applied to many problems in science and engineering. If the processes being simulated involve an element of chance, these techniques are referred to as Monte Carlo methods. For example, to study the distribution of the number of calls arriving at a telephone exchange, we can use simulation techniques.

Random Numbers: In simulation problems one uses tables of random numbers to "generate" random deviates (values assumed by a random variable). A table of random numbers consists of many pages on which the digits 0, 1, 2, ..., 9 are distributed in such a way that the probability of any one digit appearing is the same, namely 1/10 = 0.1.

Use of random numbers to generate 'heads' and 'tails'
For example, choose the 4th column of the fourth page of Table 7, start at the top and go down the page. Thus we get 6, 2, 7, 5, 5, 0, 1, 8, 6, 3, ... Now we can interpret this as H, H, T, T, T, H, T, H, H, T, because the prob of getting an odd number = the prob of getting an even number = 0.5. Thus we associate a head with the occurrence of an even number and a tail with that of an odd number.

We can also associate a head if we get 5, 6, 7, 8 or 9 and a tail otherwise. Then we would say we got H, T, H, H, H, T, T, H, H, T, ... In problems on simulation we shall adopt this second scheme, as it is easy to use and is easily 'extendable' to more than two outcomes.

Suppose, for example, we have an experiment having 4 outcomes O_1, O_2, O_3, O_4 with probabilities 0.1, 0.2, 0.3 and 0.4 respectively. To simulate this experiment, we have to allot one of the 10 digits 0, 1, ..., 9 to the first outcome, two of them to the second outcome, three of them to the third outcome and the remaining four to the fourth outcome. Though this can be done in a variety of ways, we choose the simplest way as follows:
Associate the first digit 0 with the 1st outcome O_1.
Associate the next 2 digits 1, 2 with the 2nd outcome O_2.
Associate the next 3 digits 3, 4, 5 with the 3rd outcome O_3.
Associate the last 4 digits 6, 7, 8, 9 with the 4th outcome O_4.

Hence the above sequence 6, 2, 7, 5, 5, 0, 1, 8, 6, 3, ... of random numbers would correspond to the sequence of outcomes O_4, O_2, O_4, O_3, O_3, O_1, O_2, O_4, O_4, O_3, ...

Using two and higher-digit random numbers in simulation
Suppose we have a random experiment with three outcomes with probabilities 0.80, 0.15 and 0.05 respectively. How can we now use the table of random numbers to simulate this experiment? We now read 2 digits at a time: say (starting from page 593, row 12, column 4) 84, 71, 14, 24, 20, 31, 78, 03, ...

Since P(any one digit) = 1/10, P(any two digits) = (1/10) × (1/10) = 0.01. Thus each 2-digit random number occurs with prob 0.01. Note that there are 100 two-digit random numbers: 00, 01, ..., 98, 99. Thus we associate the first 80 numbers 00, 01, ..., 79 with the first outcome, the next 15 numbers (80, 81, ..., 94) with the second outcome and the last 5 numbers (95, 96, ..., 99) with the 3rd outcome. Thus the above sequence of 2-digit random numbers would simulate the outcomes O_2, O_1, O_1, O_1, O_1, O_1, O_1, O_1, ...

We describe the above scheme in a table as follows:

Outcome   Probability   Cumulative Probability*   Random Numbers**
O_1       0.80          0.80                       00-79
O_2       0.15          0.95                       80-94
O_3       0.05          1.00                       95-99

* The cumulative prob is got by adding all the probabilities at that position and above; thus the cumulative prob at O_2 = prob of O_1 + prob of O_2 = 0.80 + 0.15 = 0.95.

** Observe that the beginning random number is 00 for the 1st outcome; for the remaining outcomes, it is one more than the ending random number of the immediately preceding outcome. Also, the ending random number for each outcome is "one less than the cumulative probability" read as a 2-digit number.

Similarly, three-digit random numbers are used if the prob of an outcome has 3 decimal places. Read the example on page 133 of your text book.
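The allocation rule ("start at the previous cumulative value, end one unit below the new one") is just a lookup against the cumulative probabilities. A small illustrative sketch (Python, not from the notes), applied to the two-digit numbers quoted above:

    from bisect import bisect_right
    from itertools import accumulate

    outcomes = ["O1", "O2", "O3"]
    probs    = [0.80, 0.15, 0.05]
    cum      = list(accumulate(probs))     # [0.80, 0.95, 1.00]

    def simulate(two_digit_number):
        # 00-79 -> O1, 80-94 -> O2, 95-99 -> O3
        u = two_digit_number / 100         # turn the table entry into a value in [0, 1)
        return outcomes[bisect_right(cum, u)]

    for r in [84, 71, 14, 24, 20, 31, 78, 3]:
        print(r, simulate(r))              # O2, O1, O1, O1, O1, O1, O1, O1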
Exercise 4.97 on page 136
Starting with page 592, Row 14, Column 7, we read off 4-digit random numbers and simulate the number of polluting species, using the following allocation:

No. of polluting species   Probability   Cumulative Probability   Random Numbers
0                          0.2466        0.2466                   0000-2465
1                          0.3452        0.5918                   2466-5917
2                          0.2417        0.8335                   5918-8334
3                          0.1128        0.9463                   8335-9462
4                          0.0395        0.9858                   9463-9857
5                          0.0111        0.9969                   9858-9968
6                          0.0026        0.9995                   9969-9994
7                          0.0005        1.0000                   9995-9999

R. No.   Polluting species      R. No.   Polluting species
5095     1                      2631     1
0150     0                      3033     1
8043     2                      9167     3
9079     3                      4998     1
6440     2                      7036     2

CONTINUOUS RANDOM VARIABLES

In many situations we come across random variables that take all values lying in a certain interval of the x-axis.

Examples
(1) The life length X of a bulb is a continuous random variable that can take all non-negative real values.
(2) The time between two consecutive arrivals in a queuing system is a random variable that can take all non-negative real values.
(3) The distance R of the point where a dart hits the board from the centre is a continuous random variable that can take all values in the interval (0, a), where a is the radius of the board.

It is clear that in all such cases, asking for the probability that the random variable takes any one particular value is not useful (that probability turns out to be zero). For example, when you buy a bulb, you rather ask: what are the chances that it will work for at least 500 hours?

Probability Density Function (pdf)

If X is a continuous random variable, questions about the probability that X takes values in an interval (a, b) are answered by defining a probability density function.

Def. Let X be a continuous rv. A real function f(x) is called the prob density function of X if
(1) f(x) ≥ 0 for all x
(2) ∫_{-∞}^{∞} f(x) dx = 1
(3) P(a ≤ X ≤ b) = ∫_a^b f(x) dx.

Condition (1) is needed as probability is always ≥ 0. Condition (2) says that the probability of the certain event is 1. Condition (3) says that to get the prob that X takes a value between a and b, we integrate the function f(x) between a and b. (This is similar to finding the mass of a rod by integrating its density function.)

Remarks
1. P(X = a) = P(a ≤ X ≤ a) = ∫_a^a f(x) dx = 0.
2. Hence P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b). Please note that, unlike the discrete case, it is immaterial whether we include or exclude one or both of the end points.
3. P(x ≤ X ≤ x + Δx) ≈ f(x) Δx.
Remark 3 is proved using the Mean Value Theorem.

Definition (Cumulative Distribution Function)
If X is a continuous rv and f(x) is its density,

   P(X ≤ x) = P(-∞ < X ≤ x) = ∫_{-∞}^x f(t) dt.

We denote the above by F(x) and call it the cumulative distribution function (cdf) of X.

Properties of the cdf
1. 0 ≤ F(x) ≤ 1 for all x.
2. x_1 < x_2 implies F(x_1) ≤ F(x_2), i.e., F(x) is a non-decreasing function of x.
3. F(-∞) = lim_{x→-∞} F(x) = 0;  F(+∞) = lim_{x→+∞} F(x) = 1.
4. d/dx F(x) = d/dx ∫_{-∞}^x f(t) dt = f(x). (Thus we can get the density function f(x) by differentiating the distribution function F(x).)

Example 1 (Exercise 5.2 of your book)
If the prob density of a rv is given by f(x) = k x² for 0 < x < 1 (and 0 elsewhere), find the value of k and the probability that the rv takes on a value
(a) between 1/4 and 3/4
(b) greater than 2/3.
Find the distribution function F(x) and hence answer the above questions.
Solution
∫_{-∞}^{∞} f(x) dx = 1 gives (as f(x) = 0 if x < 0 or x > 1)

   ∫_0^1 k x² dx = 1,  i.e. k/3 = 1, or k = 3.

Thus f(x) = 3x² for 0 ≤ x ≤ 1 and 0 otherwise.

   P(1/4 < X < 3/4) = ∫_{1/4}^{3/4} 3x² dx = (3/4)³ - (1/4)³ = 26/64 = 13/32

   P(X > 2/3) = ∫_{2/3}^1 3x² dx = 1 - (2/3)³ = 19/27

Distribution function: F(x) = ∫_{-∞}^x f(t) dt.

Case (i) x ≤ 0. In this case f(t) = 0 between -∞ and x, so F(x) = 0.

Case (ii) 0 < x < 1. In this case f(t) = 3t² between 0 and x and 0 for t < 0.
   ∴ F(x) = ∫_{-∞}^x f(t) dt = ∫_0^x 3t² dt = x³.

Case (iii) x > 1. Now f(t) = 0 for t > 1.
   ∴ F(x) = ∫_{-∞}^x f(t) dt = ∫_{-∞}^1 f(t) dt = 1 (by case (ii)).

Hence we can say the distribution function is

   F(x) = 0     for x ≤ 0
        = x³    for 0 < x ≤ 1
        = 1     for x > 1.

Now P(1/4 < X < 3/4) = P(X ≤ 3/4) - P(X ≤ 1/4) = F(3/4) - F(1/4) = (3/4)³ - (1/4)³ = 13/32

   P(X > 2/3) = 1 - P(X ≤ 2/3) = 1 - F(2/3) = 1 - (2/3)³ = 19/27

Example 2 (Exercise 5.4 of your book)
The prob density of a rv X is given by

   f(x) = x        for 0 < x < 1
        = 2 - x    for 1 ≤ x < 2
        = 0        elsewhere.

Find the prob that the rv takes a value (a) between 0.2 and 0.8, (b) between 0.6 and 1.2. Find the distribution function and answer the same questions.
Solution
(a) P(0.2 < X < 0.8) = ∫_{0.2}^{0.8} f(x) dx = ∫_{0.2}^{0.8} x dx = (0.8)²/2 - (0.2)²/2 = 0.3

(b) P(0.6 < X < 1.2) = ∫_{0.6}^{1.2} f(x) dx = ∫_{0.6}^{1} f(x) dx + ∫_{1}^{1.2} f(x) dx  (why?)
   = ∫_{0.6}^{1} x dx + ∫_{1}^{1.2} (2 - x) dx = [1/2 - (0.6)²/2] + [-(2 - x)²/2]_{1}^{1.2}
   = 0.32 + [1/2 - (0.8)²/2] = 0.32 + 0.18 = 0.5

To find the distribution function F(x) = P(X ≤ x) = ∫_{-∞}^x f(t) dt:

Case (i) x ≤ 0. In this case f(t) = 0 for t ≤ x, so F(x) = ∫_{-∞}^x f(t) dt = 0.

Case (ii) 0 < x ≤ 1. In this case f(t) = 0 for t ≤ 0 and f(t) = t for 0 < t ≤ x. Hence
   F(x) = ∫_{-∞}^x f(t) dt = ∫_{-∞}^0 f(t) dt + ∫_0^x f(t) dt = 0 + ∫_0^x t dt = x²/2.

Case (iii) 1 < x ≤ 2. In this case f(t) = 0 for t ≤ 0, f(t) = t for 0 < t ≤ 1, and f(t) = 2 - t for 1 < t ≤ x.
   ∴ F(x) = ∫_{-∞}^x f(t) dt = ∫_{-∞}^1 f(t) dt + ∫_1^x (2 - t) dt
          = 1/2 + [-(2 - t)²/2]_1^x   (by case (ii))
          = 1/2 + 1/2 - (2 - x)²/2 = 1 - (2 - x)²/2.

Case (iv) x > 2. In this case f(t) = 0 for 2 < t < x.
   ∴ F(x) = ∫_{-∞}^x f(t) dt = ∫_{-∞}^2 f(t) dt + ∫_2^x f(t) dt = 1 + 0 = 1   (by case (iii)).

Thus

   F(x) = 0                  for x ≤ 0
        = x²/2               for 0 < x ≤ 1
        = 1 - (2 - x)²/2     for 1 < x ≤ 2
        = 1                  for x > 2.

   ∴ P(0.6 < X < 1.2) = P(X ≤ 1.2) - P(X ≤ 0.6) = F(1.2) - F(0.6)
                       = [1 - (0.8)²/2] - (0.6)²/2 = 0.68 - 0.18 = 0.5
   P(X > 1.8) = 1 - P(X ≤ 1.8) = 1 - F(1.8) = 1 - [1 - (0.2)²/2] = 0.02

The Mean and Variance of a Continuous r.v.
Let X be a continuous rv with density f(x). We define its mean as

   μ = E(X) = ∫_{-∞}^{∞} x f(x) dx.

We define its variance σ² as

   σ² = E[(X - μ)²] = ∫_{-∞}^{∞} (x - μ)² f(x) dx = E(X²) - μ²,

where E(X²) = ∫_{-∞}^{∞} x² f(x) dx.

Example 3
The density of a rv X is f(x) = 3x² for 0 < x < 1 (and 0 elsewhere).
Its mean is μ = E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_0^1 x · 3x² dx = 3/4.
   E(X²) = ∫_{-∞}^{∞} x² f(x) dx = ∫_0^1 x² · 3x² dx = 3/5.
Hence σ² = 3/5 - (3/4)² = 0.0375, and its sd is σ = 0.1936.
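These one-dimensional integrals are simple to verify numerically. A hypothetical check in Python (not in the original notes; it uses a midpoint Riemann sum) for the density f(x) = 3x² of Example 3:

    from math import sqrt

    f = lambda x: 3 * x**2                          # density on (0, 1)
    n = 100000
    xs = [(i + 0.5) / n for i in range(n)]          # midpoints of (0, 1)
    mean = sum(x * f(x) for x in xs) / n            # approximates E(X)
    ex2  = sum(x**2 * f(x) for x in xs) / n         # approximates E(X^2)
    var  = ex2 - mean**2
    print(round(mean, 4), round(var, 4), round(sqrt(var), 4))   # 0.75, 0.0375, 0.1936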
Example 4
The density of a rv X is f(x) = (1/20) e^(-x/20) for x > 0 (and 0 elsewhere).

   μ = E(X) = ∫_{-∞}^{∞} x f(x) dx = (1/20) ∫_0^{∞} x e^(-x/20) dx.

Integrating by parts we get

   μ = [-x e^(-x/20) - 20 e^(-x/20)]_0^{∞} = 20.

   E(X²) = ∫_{-∞}^{∞} x² f(x) dx = (1/20) ∫_0^{∞} x² e^(-x/20) dx.

On integrating by parts we get

   E(X²) = [-x² e^(-x/20) - 40x e^(-x/20) - 800 e^(-x/20)]_0^{∞} = 800.

   ∴ σ² = E(X²) - μ² = 800 - 400 = 400, so σ = 20.

NORMAL DISTRIBUTION

A random variable X is said to have the normal distribution (or Gaussian distribution) if its density is

   f(x; μ, σ) = [1 / (σ√(2π))] e^(-(x-μ)² / (2σ²)),   -∞ < x < ∞.

Here μ and σ are fixed (called parameters) and σ > 0. The graph of the normal density is a bell-shaped curve:
[Figure: the bell-shaped normal density curve]

It is symmetrical about the line x = μ and has points of inflection at x = μ ± σ. One can use integration to show that ∫_{-∞}^{∞} f(x) dx = 1. We also see that E(X) = μ and that the variance of X, E[(X - μ)²], equals σ².

If μ = 0 and σ = 1, we say that X has the standard normal distribution. We usually use the symbol Z to denote a variable having the standard normal distribution. Thus when Z is standard normal, its density is

   f(z) = [1/√(2π)] e^(-z²/2),   -∞ < z < ∞.

The cumulative distribution function of Z is

   F(z) = P(Z ≤ z) = ∫_{-∞}^z [1/√(2π)] e^(-t²/2) dt

and represents the area under the density up to z (the shaded portion in the figure).

[Figure: standard normal density with the area to the left of z shaded]

We at once see from the symmetry of the graph that F(0) = 1/2 = 0.5 and F(-z) = 1 - F(z).
F(z) for various positive z has been tabulated in Table 3 (at the end of your book). We thus see from Table 3 that

   F(0.37) = 0.6443,  F(1.645) = 0.95,  F(2.33) = 0.99,  and F(z) ≈ 1 for z ≥ 3.

Hence F(-0.37) = 1 - 0.6443 = 0.3557, F(-1.645) = 1 - 0.95 = 0.05, etc.

Definition of z_α
If Z is standard normal, we define z_α to be that number such that

   P(Z > z_α) = α,  or  F(z_α) = 1 - α.

Since F(1.645) = 0.95 = 1 - 0.05, we see that z_{0.05} = 1.645. Similarly z_{0.01} = 2.33. We also note that z_{1-α} = -z_α. Thus z_{0.95} = -z_{0.05} = -1.645 and z_{0.99} = -z_{0.01} = -2.33.

Important
If X is normal with mean μ and variance σ², it can be shown that the standardized r.v. Z = (X - μ)/σ has the standard normal distribution. Thus questions about the prob that X assumes a value between, say, a and b can be translated into the prob that Z assumes values in a corresponding range. Specifically:

   P(a < X < b)
   = P((a - μ)/σ < Z < (b - μ)/σ) = F((b - μ)/σ) - F((a - μ)/σ).

Example 1 (See Exercise 5.24 on page 152)
Given that X has a normal distribution with mean μ = 16.2 and variance σ² = 1.5625, find the prob that it will take on a value
(a) greater than 16.8, (b) less than 14.9, (c) between 13.6 and 18.8, (d) between 16.5 and 16.7.

Here σ = √1.5625 = 1.25. Thus

(a) P(X > 16.8) = P((X - μ)/σ > (16.8 - 16.2)/1.25) = P(Z > 0.6/1.25) = P(Z > 0.48)
               = 1 - P(Z ≤ 0.48) = 1 - F(0.48) = 1 - 0.6844 = 0.3156

(b) P(X < 14.9) = P((X - μ)/σ < (14.9 - 16.2)/1.25) = P(Z < -1.3/1.25) = P(Z < -1.04)
               = F(-1.04) = 1 - F(1.04) = 1 - 0.8508 = 0.1492

(c) P(13.6 < X < 18.8) = P((13.6 - 16.2)/1.25 < (X - μ)/σ < (18.8 - 16.2)/1.25)
   = P(-2.08 < Z < 2.08) = F(2.08) - F(-2.08) = F(2.08) - [1 - F(2.08)]
   = 2F(2.08) - 1 = 2 × 0.9812 - 1 = 0.9624

(Note that P(-c < Z < c) = 2F(c) - 1 for c > 0.)

(d) P(16.5 < X < 16.7) = P((16.5 - 16.2)/1.25 < Z < (16.7 - 16.2)/1.25) = P(0.24 < Z < 0.4)
                       = F(0.4) - F(0.24) = 0.6554 - 0.5948 = 0.0606

Example 2
A rv X has a normal distribution with σ = 10. If the prob is 0.8212 that it will take on a value < 82.5, what is the prob that it will take on a value > 58.3?

Solution
Let the mean (unknown) be μ. Given P(X < 82.5) = 0.8212. Thus

   P((X - μ)/σ < (82.5 - μ)/10) = 0.8212,  or  P(Z < (82.5 - μ)/10) = 0.8212,  i.e.  F((82.5 - μ)/10) = 0.8212.

From Table 3, (82.5 - μ)/10 = 0.92, or μ = 82.5 - 9.2 = 73.3.

Hence P(X > 58.3)
   = P((X - μ)/σ > (58.3 - 73.3)/10) = P(Z > -1.5) = 1 - P(Z ≤ -1.5)
   = 1 - F(-1.5) = 1 - [1 - F(1.5)] = F(1.5) = 0.9332

Example 3 (See Exercise 5.33 on page 152)
In a photographic process the developing time of prints may be looked upon as a rv X having normal distribution with μ = 16.28 seconds and s.d. 0.12 second. Find the value which, with probability 0.95, will be exceeded by the time it takes to develop one of the prints.

Solution
That is, find a number c so that P(X > c) = 0.95, i.e.

   P((X - μ)/σ > (c - 16.28)/0.12) = 0.95,  i.e.  P(Z > (c - 16.28)/0.12) = 0.95.

Hence P(Z ≤ (c - 16.28)/0.12) = 0.05, so (c - 16.28)/0.12 = -z_{0.05} = -1.645.

   ∴ c = 16.28 - 1.645 × 0.12 = 16.083 seconds (approximately).

NORMAL APPROXIMATION TO BINOMIAL DISTRIBUTION

Suppose X is a rv having the binomial distribution with parameters n and p. Then it can be shown that

   P((X - np)/√(npq) ≤ z) → F(z) = P(Z ≤ z)   as n → ∞,

i.e., in words, the standardized binomial tends to the standard normal.
Thus when n is large, the binomial probabilities can be approximated using the normal distribution function.

Example 4 (See Exercise 5.36 on page 153)
A manufacturer knows that on the average 2% of the electric toasters that he makes will require repairs within 90 days after they are sold. Use the normal approximation to the binomial distribution to determine the prob that among 1200 of these toasters at least 30 will require repairs within the first 90 days after they are sold.

Solution
Let X = no. of toasters (among 1200) that require repairs within the first 90 days after they are sold. Hence X is a rv having a binomial distribution with parameters n = 1200 and p = 2/100 = 0.02. Here np = 24 and √(npq) = √23.52 = 4.85.

Required: P(X ≥ 30) = P((X - np)/√(npq) ≥ (30 - 24)/4.85)
   ≈ P(Z ≥ 1.24) = 1 - P(Z < 1.24) = 1 - F(1.24) = 1 - 0.8925 = 0.1075

Correction for Continuity
Since for continuous rvs P(Z ≥ c) = P(Z > c) (which is not true for discrete rvs), when we approximate a binomial prob by a normal prob we must ensure that we do not 'lose' the end point. This is achieved by what we call the continuity correction. In the previous example, P(X ≥ 30) also equals P(X ≥ 29.5) (read the justification given in your book on page 150, lines 1 to 7).

   P(X ≥ 30) = P(X ≥ 29.5) = P((X - np)/√(npq) ≥ (29.5 - 24)/4.85)
             ≈ P(Z ≥ 1.13) = 1 - F(1.13) = 1 - 0.8708 = 0.1292

(probably a better answer).
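The effect of the continuity correction can be seen by comparing the exact binomial tail with the two approximations (an illustrative Python check using the error function for the normal cdf; not part of the original notes):

    from math import comb, erf, sqrt

    def binom_tail(n, p, k):
        """Exact P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k, n + 1))

    def normal_cdf(z):
        return 0.5 * (1 + erf(z / sqrt(2)))

    n, p, k = 1200, 0.02, 30
    mu, sd = n * p, sqrt(n * p * (1 - p))
    print(round(binom_tail(n, p, k), 4))                   # exact binomial tail
    print(round(1 - normal_cdf((k - mu) / sd), 4))         # ~0.108, no correction
    print(round(1 - normal_cdf((k - 0.5 - mu) / sd), 4))   # ~0.128, with correction (close to 0.1292 above)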
Example 5 (See Exercise 5.38 on page 153)
A safety engineer feels that 30% of all industrial accidents in her plant are caused by failure of employees to follow instructions. Find approximately the prob that among 84 industrial accidents anywhere from 20 to 30 (inclusive) will be due to failure of employees to follow instructions.

Solution
Let X = no. of accidents (among 84) due to failure of employees to follow instructions. Thus X is a rv having a binomial distribution with parameters n = 84 and p = 0.3. Thus np = 25.2 and √(npq) = 4.2.

Required: P(20 ≤ X ≤ 30) = P(19.5 ≤ X ≤ 30.5)   (continuity correction)
   = P((19.5 - 25.2)/4.2 ≤ (X - np)/√(npq) ≤ (30.5 - 25.2)/4.2)
   ≈ P(-1.36 ≤ Z ≤ 1.26) = F(1.26) - F(-1.36) = F(1.26) + F(1.36) - 1
   = 0.8962 + 0.9131 - 1 = 0.8093

OTHER PROBABILITY DENSITIES

The Uniform Distribution
A rv X is said to have the uniform distribution over the interval (α, β) if its density is given by

   f(x) = 1/(β - α)   for α < x < β
        = 0           elsewhere.
Thus the graph of the density is a constant over the interval (α, β). If α < c < d < β,

   P(c < X < d) = ∫_c^d [1/(β - α)] dx = (d - c)/(β - α),

and thus is proportional to the length of the interval (c, d).

You may verify that the mean of X is μ = E(X) = (α + β)/2 (the midpoint of the interval (α, β)) and the variance of X is σ² = (β - α)²/12. The cumulative distribution function is

   F(x) = 0                  for x ≤ α
        = (x - α)/(β - α)    for α < x ≤ β
        = 1                  for x > β.

Example 6 (See Exercise 5.46 on page 165)
In certain experiments, the error X made in determining the solubility of a substance is a rv having the uniform density with α = -0.025 and β = 0.025. What is the prob that such an error will be
(a) between 0.010 and 0.015?
(b) between -0.012 and 0.012?

Solution
(a) P(0.010 < X < 0.015) = (0.015 - 0.010)/(0.025 - (-0.025)) = 0.005/0.050 = 0.1

(b) P(-0.012 < X < 0.012) = (0.012 - (-0.012))/(0.025 - (-0.025)) = 12/25 = 0.48
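A minimal check of the uniform-density formulas used above (illustrative Python, not from the notes):

    def uniform_prob(c, d, alpha, beta):
        """P(c < X < d) for X uniform on (alpha, beta), assuming alpha <= c <= d <= beta."""
        return (d - c) / (beta - alpha)

    alpha, beta = -0.025, 0.025
    print(uniform_prob(0.010, 0.015, alpha, beta))      # 0.1
    print(uniform_prob(-0.012, 0.012, alpha, beta))     # 0.48
    print((alpha + beta) / 2, (beta - alpha)**2 / 12)   # mean 0.0, variance ~2.08e-4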
Example 7 (See Exercise 5.47 on page 165)
From experience, Mr. Harris has found that the low bid on a construction job can be regarded as a rv X having the uniform density

   f(x) = 3/(4C)   for 2C/3 < x < 2C
        = 0        elsewhere,

where C is his own estimate of the cost of the job. What percentage should Mr. Harris add to his cost estimate when submitting bids to maximize his expected profit?

Solution
Suppose Mr. Harris adds k% of C when submitting his bid. Then Mr. Harris gets a profit of kC/100 if he gets the contract, which happens if the lowest bid (by others) is ≥ C + kC/100, and gets no profit if the lowest bid is < C + kC/100. Thus the prob that he gets the bid is

   P(C + kC/100 < X < 2C) = [3/(4C)] × (2C - C - kC/100) = (3/4)(1 - k/100).

Thus the expected profit of Mr. Harris is

   (kC/100) × (3/4)(1 - k/100) + 0 × (...) = (3C/4)(k/100 - k²/10000),

which is maximum (by using calculus) when k = 50. Thus Mr. Harris's expected profit is a maximum when he adds 50% of C to C when submitting bids.

Gamma Function
This is one of the most useful functions in mathematics. If x > 0, it can be shown that the improper integral ∫_0^∞ e^(-t) t^(x-1) dt converges to a finite real number, which we denote by Γ(x) (capital gamma of x). Thus for all real x > 0, we define

   Γ(x) = ∫_0^∞ e^(-t) t^(x-1) dt.
Properties of the Gamma Function
1. Γ(x + 1) = x Γ(x), x > 0.
2. Γ(1) = 1.
3. Γ(2) = 1 · Γ(1) = 1, Γ(3) = 2 Γ(2) = 2 × 1 = 2!. More generally, Γ(n + 1) = n! whenever n is a positive integer or zero.
4. Γ(1/2) = √π.
5. Γ(x) decreases in the interval (0, 1), increases in the interval (2, ∞), and has a minimum somewhere between 1 and 2.

THE GAMMA DISTRIBUTION

Let α and β be two positive real numbers. A rv X is said to have a Gamma distribution with parameters α and β if its density is

   f(x) = [1 / (β^α Γ(α))] x^(α-1) e^(-x/β)   for x > 0
        = 0                                    elsewhere.

It can be shown that the mean of X is μ = E(X) = αβ (see the working on page 159 of your text book) and the variance of X is σ² = αβ².

Exponential Distribution
If α = 1, we say X has the exponential distribution. Thus X has an exponential distribution (with parameter β > 0) if its density is

   f(x) = (1/β) e^(-x/β)   for x > 0
        = 0                elsewhere.
We also see easily that:
1. The mean of X is E(X) = β.
2. The variance of X is σ² = β².
3. The cumulative distribution function of X is F(x) = 1 - e^(-x/β) for x > 0, and 0 elsewhere.
4. X has the memoryless property: P(X > s + t | X > s) = P(X > t), s, t > 0.

Proof of (4): P(X > s) = 1 - P(X ≤ s) = 1 - F(s) = e^(-s/β) (by (3)). Hence

   P(X > s + t | X > s) = P(X > s + t and X > s) / P(X > s) = P(X > s + t) / P(X > s)
                        = e^(-(s+t)/β) / e^(-s/β) = e^(-t/β) = P(X > t).   (QED)

Example 8 (See Exercise 5.54 on page 166)
In a certain city, the daily consumption of electric power (in millions of kW hours) can be treated as a rv X having a Gamma distribution with α = 3, β = 2. If the power plant in the city has a daily capacity of 12 million kW hrs, what is the prob that the power supply will be inadequate on any given day?

Solution
The power supply will be inadequate if the demand exceeds the daily capacity. Hence the prob that the power supply is inadequate = P(X > 12) = ∫_{12}^∞ f(x) dx.
Now as α = 3, β = 2,

   f(x) = [1 / (2³ Γ(3))] x² e^(-x/2) = (1/16) x² e^(-x/2),   x > 0.

Hence P(X > 12) = (1/16) ∫_{12}^∞ x² e^(-x/2) dx. Integrating by parts, we get

   P(X > 12) = (1/16) [-2x² e^(-x/2) - 8x e^(-x/2) - 16 e^(-x/2)]_{12}^∞
             = (1/16) (2 × 144 + 8 × 12 + 16) e^(-6) = (400/16) e^(-6) = 25 e^(-6) = 0.062.

Example 9 (See Exercise 5.58 on page 166)
The amount of time that a surveillance camera will run without having to be reset is a rv X having the exponential distribution with β = 50 days. Find the prob that such a camera
(a) will have to be reset in less than 20 days,
(b) will not have to be reset in at least 60 days.

Solution
The density of X is f(x) = (1/50) e^(-x/50) for x > 0 (and 0 elsewhere).

(a) P(the camera has to be reset in < 20 days) = P(the running time is < 20)
   = P(X < 20) = ∫_0^{20} (1/50) e^(-x/50) dx = [-e^(-x/50)]_0^{20} = 1 - e^(-2/5) = 1 - e^(-0.4) = 0.3297

(b) P(the camera will not have to be reset in at least 60 days)
   = P(X > 60) = ∫_{60}^∞ (1/50) e^(-x/50) dx = [-e^(-x/50)]_{60}^∞ = e^(-6/5) = e^(-1.2) = 0.3012

Example 10 (See Exercise 5.61 on page 166)
Given a Poisson process with an average of α arrivals per unit time, find the prob density of the inter-arrival time (i.e. the time between two consecutive arrivals).

Solution
Let T be the time between two consecutive arrivals. Clearly T is a continuous rv with values > 0. Now T > t if and only if there is no arrival in the time period t. Thus

   P(T > t) = P(X_t = 0) = e^(-αt)

(X_t = the number of arrivals in the time period t, which has a Poisson distribution with parameter λ = αt). Hence the distribution function of T is

   F(t) = P(T ≤ t) = 1 - P(T > t) = 1 - e^(-αt),   t > 0

(and clearly F(t) = 0 for all t ≤ 0).
Hence the density of T is

   f(t) = d/dt F(t) = α e^(-αt)   for t > 0
        = 0                       elsewhere.

Hence we would say that the inter-arrival time is a continuous rv with an exponential density with parameter β = 1/α (i.e. mean inter-arrival time 1/α).

The Beta Function
If x, y > 0, the beta function B(x, y) (read capital beta of x, y) is defined by

   B(x, y) = ∫_0^1 t^(x-1) (1 - t)^(y-1) dt.

It is well known that

   B(x, y) = Γ(x) Γ(y) / Γ(x + y),   x, y > 0.

BETA DISTRIBUTION

A rv X is said to have a Beta distribution with parameters α, β > 0 if its density is

   f(x) = [1 / B(α, β)] x^(α-1) (1 - x)^(β-1)   for 0 < x < 1
        = 0                                      elsewhere.

It is easily shown that
(1) μ = E(X) = α / (α + β)
(2) σ² = V(X) = αβ / [(α + β)² (α + β + 1)].
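These moment formulas can be verified numerically with the gamma function from the standard library (a hypothetical sketch, not part of the notes; it uses a midpoint Riemann sum on (0, 1)):

    from math import gamma

    a, b = 2.0, 9.0
    B = gamma(a) * gamma(b) / gamma(a + b)      # B(2, 9) = Γ(2)Γ(9)/Γ(11) = 1/90
    pdf = lambda x: x**(a - 1) * (1 - x)**(b - 1) / B

    n = 100000
    xs = [(i + 0.5) / n for i in range(n)]       # midpoints of (0, 1)
    mean = sum(x * pdf(x) for x in xs) / n
    var  = sum(x * x * pdf(x) for x in xs) / n - mean**2
    print(round(mean, 4), round(a / (a + b), 4))                        # both ~0.1818
    print(round(var, 5), round(a * b / ((a + b)**2 * (a + b + 1)), 5))  # both ~0.0124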
Example 11 (See Exercise 5.64)
If the annual proportion of erroneous income tax returns can be looked upon as a rv having a Beta distribution with α = 2, β = 9, what is the prob that in any given year there will be fewer than 10% erroneous returns?

Solution
Let X = the annual proportion of erroneous income tax returns. Thus X has a Beta density with α = 2, β = 9.

   ∴ P(X < 0.1) = ∫_0^{0.1} f(x) dx   (note that the proportion cannot be < 0)
                = [1 / B(2, 9)] ∫_0^{0.1} x (1 - x)^8 dx

   B(2, 9) = Γ(2) Γ(9) / Γ(11) = (1! × 8!) / 10! = 1 / (9 × 10) = 1/90

   ∫_0^{0.1} x (1 - x)^8 dx = [-x (1 - x)^9 / 9]_0^{0.1} + (1/9) ∫_0^{0.1} (1 - x)^9 dx
      = -(0.1)(0.9)^9 / 9 + (1/90) [1 - (0.9)^10] = 0.00293

   ∴ P(X < 0.1) = 90 × 0.00293 = 0.264 (approximately).

The Log-Normal Distribution
A rv X is said to have a log-normal distribution if its density is

   f(x) = [1 / (√(2π) β x)] e^(-(ln x - α)² / (2β²))   for x > 0, β > 0
        = 0                                             elsewhere.
It can be shown that if X has a log-normal distribution, then Y = ln X has a normal distribution with mean μ = α and s.d. σ = β. Thus

   P(a < X < b) = P(ln a < ln X < ln b) = P((ln a - α)/β < Z < (ln b - α)/β)
                = F((ln b - α)/β) - F((ln a - α)/β),

where F(z) is the cdf of the standard normal variable Z. Lengthy calculations show that if X has a log-normal distribution, its mean is E(X) = e^(α + β²/2) and its variance is e^(2α + β²) (e^(β²) - 1).

More Problems on the Normal Distribution

Example 12
Let X be normal with mean μ and s.d. σ. Determine c as a function of μ and σ such that P(X ≤ c) = 2 P(X ≥ c).

Solution
Since X is continuous, P(X ≥ c) = 1 - P(X ≤ c). So if p = P(X ≤ c), the condition P(X ≤ c) = 2 P(X ≥ c) gives p = 2(1 - p), i.e. 3p = 2, or p = 2/3. Now

   P(X ≤ c) = P((X - μ)/σ ≤ (c - μ)/σ) = F((c - μ)/σ) = 2/3 = 0.6667

implies (c - μ)/σ = 0.43 (approximately, from Table 3).

   ∴ c = μ + 0.43σ.
Example 13
Suppose X is normal with mean 0 and s.d. 5. Find P(1 < X² < 4).

Solution
   P(1 < X² < 4) = P(1 < |X| < 2) = P(1 < X < 2) + P(-2 < X < -1)
                 = 2 P(1/5 < Z < 2/5)   (by symmetry, since Z = X/5)
                 = 2 [F(2/5) - F(1/5)] = 2 (0.6554 - 0.5793)   (from Table 3)
                 = 2 × 0.0761 = 0.1522

Example 14
The annual rainfall in a certain locality is a rv X having a normal distribution with mean 29.5" and s.d. 2.5". How many inches of rain (annually) is exceeded about 5% of the time?

Solution
That is, we have to find a number C such that P(X > C) = 0.05, i.e.

   P((X - 29.5)/2.5 > (C - 29.5)/2.5) = 0.05.

Hence (C - 29.5)/2.5 = z_{0.05} = 1.645, so

   C = 29.5 + 1.645 × 2.5 = 33.6125.
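Examples 12 and 14 both invert the standard normal cdf at a given probability. A small illustrative sketch (Python, using bisection instead of Table 3; not part of the notes):

    from math import erf, sqrt

    def normal_cdf(z):
        return 0.5 * (1 + erf(z / sqrt(2)))

    def normal_quantile(p, lo=-10.0, hi=10.0):
        """Find z with F(z) = p by bisection."""
        for _ in range(80):
            mid = (lo + hi) / 2
            if normal_cdf(mid) < p:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    print(round(normal_quantile(2 / 3), 3))               # ~0.431  (Example 12: c = mu + 0.43*sigma)
    print(round(29.5 + 2.5 * normal_quantile(0.95), 3))   # ~33.612 (Example 14)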
Example 15
A rocket fuel is to contain a certain percent (say X) of a particular compound. The specification calls for X to lie between 30 and 35. The manufacturer will make a net profit on the fuel per gallon which is the following function of X:

   T(X) = $0.10 per gallon    if 30 < X < 35
        = $0.05 per gallon    if 25 < X ≤ 30 or 35 ≤ X < 40
        = -$0.10 per gallon   elsewhere.

If X has a normal distribution with mean 33 and s.d. 3, find the prob distribution of T and hence the expected profit per gallon.

Solution
T = 0.10 if 30 < X < 35.

   ∴ P(T = 0.10) = P(30 < X < 35) = P((30 - 33)/3 < (X - μ)/σ < (35 - 33)/3)
      = P(-1 < Z < 2/3) = F(2/3) - F(-1) = F(2/3) + F(1) - 1
      = 0.7486 + 0.8413 - 1 = 0.5899

   P(T = 0.05) = P(25 < X ≤ 30) + P(35 ≤ X < 40)
      = P((25 - 33)/3 < Z ≤ (30 - 33)/3) + P((35 - 33)/3 ≤ Z < (40 - 33)/3)
      = P(-8/3 < Z ≤ -1) + P(2/3 ≤ Z < 7/3)
      = [F(-1) - F(-8/3)] + [F(7/3) - F(2/3)]
      = [F(8/3) - F(1)] + [F(7/3) - F(2/3)]
      = 0.9961 - 0.8413 + 0.9901 - 0.7486 = 0.3963

   Hence P(T = -0.10) = 1 - 0.5899 - 0.3963 = 0.0138.
Hence the expected profit is

   E(T) = 0.10 × 0.5899 + 0.05 × 0.3963 + (-0.10) × 0.0138 = $0.077425 per gallon.

JOINT DISTRIBUTIONS - TWO AND HIGHER DIMENSIONAL RANDOM VARIABLES

Suppose X, Y are 2 discrete rvs and suppose X can take the values x_1, x_2, ... and Y can take the values y_1, y_2, .... We refer to the function

   f(x, y) = P(X = x, Y = y)

as the joint prob distribution of X and Y. The ordered pair (X, Y) is sometimes referred to as a two-dimensional discrete rv.

Example 16
Two cards are drawn at random from a pack of 52 cards. Let X be the number of aces drawn and Y be the number of queens drawn. Find the joint prob distribution of X and Y.

Solution
Clearly X can take any one of the three values 0, 1, 2 and Y any one of the three values 0, 1, 2. The joint prob distribution of X and Y is depicted in the following 3 x 3 table (here C(n, r) denotes the number of ways of choosing r objects out of n):

             x = 0                     x = 1                     x = 2
   y = 0     C(44,2)/C(52,2)           C(4,1)C(44,1)/C(52,2)     C(4,2)/C(52,2)
   y = 1     C(4,1)C(44,1)/C(52,2)     C(4,1)C(4,1)/C(52,2)      0
   y = 2     C(4,2)/C(52,2)            0                         0
Justification
P(X = 0, Y = 0) = P(no aces and no queens in the 2 cards) = C(44,2)/C(52,2).
P(X = 1, Y = 0) (the entry in the 2nd column and 1st row) = P(one ace and one other card which is neither an ace nor a queen) = C(4,1)C(44,1)/C(52,2), etc.

Can we write down the distribution of X? X can take any one of the 3 values 0, 1, 2. What is P(X = 0)? X = 0 means no ace is drawn, but we might draw 2 queens, or 1 queen and one non-queen, or 2 cards which are neither aces nor queens. Thus

   P(X = 0) = P(X = 0, Y = 0) + P(X = 0, Y = 1) + P(X = 0, Y = 2)
            = C(44,2)/C(52,2) + C(4,1)C(44,1)/C(52,2) + C(4,2)/C(52,2)
            = C(48,2)/C(52,2)   (verify!)
            = the sum of the probabilities in the 1st column.

Similarly P(X = 1) = P(X = 1, Y = 0) + P(X = 1, Y = 1) + P(X = 1, Y = 2)
   = the sum of the 3 probabilities in the 2nd column
   = C(4,1)C(44,1)/C(52,2) + C(4,1)C(4,1)/C(52,2) + 0 = C(4,1)C(48,1)/C(52,2)   (verify!)

   P(X = 2) = P(X = 2, Y = 0) + P(X = 2, Y = 1) + P(X = 2, Y = 2)
            = the sum of the 3 probabilities in the 3rd column
            = C(4,2)/C(52,2) + 0 + 0 = C(4,2)/C(52,2).

The distribution of X derived from the joint distribution of X and Y is referred to as the marginal distribution of X. Similarly, the marginal distribution of Y is given by the 3 row totals.

Example 17
The joint prob distribution of X and Y is given by

                    x = -1   x = 0   x = 1   Row total
   y = -1           1/8      1/8     1/8     3/8
   y = 0            1/8      0       1/8     2/8
   y = 1            1/8      1/8     1/8     3/8
   Column total     3/8      2/8     3/8

Write down the marginal distributions of X and Y. To get the marginal distribution of X, we find the column totals and write them in the (bottom) margin. Thus the (marginal) distribution of X is

   X       -1     0      1
   Prob    3/8    2/8    3/8
(Do you see why we call it the marginal distribution?)

Similarly, to get the marginal distribution of Y, we find the 3 row totals and write them in the (right) margin. Thus the marginal distribution of Y is

   Y      Prob
   -1     3/8
   0      2/8
   1      3/8

Notation: If f(x, y) = P(X = x, Y = y) is the joint prob distribution of the 2-dimensional discrete rv (X, Y), we denote by g(x) the marginal distribution of X and by h(y) the marginal distribution of Y. Thus

   g(x) = P(X = x) = Σ_{all y} P(X = x, Y = y) = Σ_{all y} f(x, y)

and

   h(y) = P(Y = y) = Σ_{all x} P(X = x, Y = y) = Σ_{all x} f(x, y).

Conditional Distribution
The conditional prob distribution of Y for a given X = x is defined as

   h(y | x) = P(Y = y | X = x)   (read: the prob of Y = y given X = x)
            = P(X = x, Y = y) / P(X = x) = f(x, y) / g(x),

where g(x) is the marginal distribution of X. Thus in Example 17 above,

   h(0 | 1) = P(Y = 0 | X = 1) = P(X = 1, Y = 0) / P(X = 1) = (1/8) / (3/8) = 1/3.

Similarly, the conditional prob distribution of X for a given Y = y is defined as
   g(x | y) = P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = f(x, y) / h(y),

where h(y) is the marginal distribution of Y. In the above example,

   g(0 | 0) = P(X = 0 | Y = 0) = P(X = 0, Y = 0) / P(Y = 0) = 0 / (2/8) = 0.

Independence
We say X, Y are independent if

   P(X = x, Y = y) = P(X = x) P(Y = y)   for all x, y.

Thus X, Y are independent if and only if f(x, y) = g(x) h(y) for all x and y, which is the same as saying g(x | y) = g(x) for all x and y, which is the same as saying h(y | x) = h(y) for all x, y. In the above example X, Y are not independent, as P(X = 0, Y = 0) ≠ P(X = 0) P(Y = 0).

Example 18
The joint prob distribution of X and Y is given by

             x = 2    x = 0    x = 1
   y = 2     0.1      0.2      0.1
   y = 0     0.05     0.1      0.15
   y = 1     0.1      0.1      0.1

(a) Find the marginal distribution of X.
Ans:
   X       2       0      1
   Prob    0.25    0.4    0.35
(b) Find the marginal distribution of Y.
Ans:
   Y      Prob
   2      0.4
   0      0.3
   1      0.3

(c) Find P(X + Y = 2).
Ans: X + Y = 2 if (X = 2, Y = 0) or (X = 1, Y = 1) or (X = 0, Y = 2). Thus P(X + Y = 2) = 0.05 + 0.1 + 0.2 = 0.35.

(d) Find P(X - Y = 0).
Ans: X - Y = 0 if (X = 2, Y = 2) or (X = 0, Y = 0) or (X = 1, Y = 1). ∴ P(X - Y = 0) = 0.1 + 0.1 + 0.1 = 0.3.

(e) Find P(X ≥ 0). Ans: 1.

(f) Find P(X - Y = 0 | X ≥ 0). Ans: 0.3 / 1 = 0.3.

(g) Find P(X - Y = 0 | X ≥ 1). Ans: 0.2 / 0.6 = 1/3.

(h) Are X, Y independent? Ans: No! P(X = 1, Y = 1) ≠ P(X = 1) P(Y = 1).

Two-Dimensional Continuous Random Variables
Let (X, Y) be a continuous 2-dimensional rv. This means (X, Y) can take all values in a certain region of the xy-plane. For example, suppose a dart is thrown at a circular board of radius 2. Then the position (X, Y) where the dart hits the board is a continuous two-dimensional rv, as it can take all values (x, y) such that x² + y² ≤ 4.

A function f(x, y) is said to be the joint prob density of (X, Y) if
(i) f(x, y) ≥ 0 for all x, y
(ii) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1
(iii) P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx.

Example 19(a)
Let the joint prob density of (X, Y) be

   f(x, y) = 1/4   for 0 ≤ x ≤ 2, 0 ≤ y ≤ 2
           = 0     elsewhere.

Find P(X + Y ≤ 1).

Ans: The region x + y ≤ 1 (within the square) is the shaded triangle in the figure. Hence

   P(X + Y ≤ 1) = ∫_0^1 ∫_0^{1-x} (1/4) dy dx = (1/4) ∫_0^1 (1 - x) dx = 1/8.

Example 19(b)
The joint prob density of (X, Y) is

   f(x, y) = (1/8)(6 - x - y)   for 0 < x < 2, 2 < y < 4
           = 0                  elsewhere.

Find P(X < 1, Y < 3).

Solution
   P(X < 1, Y < 3) = ∫_{x=0}^{1} ∫_{y=2}^{3} f(x, y) dy dx
   = (1/8) ∫_{x=0}^{1} ∫_{y=2}^{3} (6 - x - y) dy dx
   = (1/8) ∫_0^1 [(6 - x)y - y²/2]_{y=2}^{3} dx
   = (1/8) ∫_0^1 [(6 - x) - 5/2] dx = (1/8) ∫_0^1 (7/2 - x) dx
   = (1/8) (7/2 - 1/2) = 3/8.

Marginal and Conditional Densities
If f(x, y) is the joint prob density of the 2-dimensional continuous rv (X, Y), we define the marginal prob density of X as

   g(x) = ∫_{-∞}^{∞} f(x, y) dy,

that is, fix x and integrate f(x, y) w.r.t. y. Similarly, the marginal prob density of Y is

   h(y) = ∫_{-∞}^{∞} f(x, y) dx.

The conditional prob density of Y for a given x is

   h(y | x) = f(x, y) / g(x)   (defined only for those x for which g(x) ≠ 0),

and the conditional prob density of X for a given y is

   g(x | y) = f(x, y) / h(y)   (defined only for those y for which h(y) ≠ 0).
Independence
We say X, Y are independent if and only if f(x, y) = g(x) h(y), which is the same as saying g(x | y) = g(x), or h(y | x) = h(y).

Example 20
Consider the density of (X, Y) as given in Example 19(b). The marginal density of X is

   g(x) = ∫_{y=2}^{4} (1/8)(6 - x - y) dy = (1/8) [(6 - x)y - y²/2]_{y=2}^{4}
        = (1/8) [2(6 - x) - 6] = (1/8)(6 - 2x)   for 0 < x < 2,  and 0 elsewhere.

We verify that this is a valid density. First, g(x) = (1/8)(6 - 2x) ≥ 0 for 0 < x < 2. Secondly,

   ∫_0^2 g(x) dx = (1/8) ∫_0^2 (6 - 2x) dx = (1/8) [6x - x²]_0^2 = (1/8)(12 - 4) = 1.

The marginal density of Y is

   h(y) = ∫_{x=0}^{2} (1/8)(6 - x - y) dx = (1/8) [(6 - y)x - x²/2]_{x=0}^{2}
        = (1/8) [2(6 - y) - 2]
        = (1/8)(10 - 2y)   for 2 < y < 4,  and 0 elsewhere.

Again h(y) ≥ 0, and

   ∫_2^4 h(y) dy = (1/8) ∫_2^4 (10 - 2y) dy = (1/8) [10y - y²]_2^4 = (1/8)(24 - 16) = 1.

The conditional density of Y for X = 1 is

   h(y | 1) = f(1, y) / g(1) = [(1/8)(6 - 1 - y)] / [(1/8)(6 - 2)] = (5 - y)/4   for 2 < y < 4,

and 0 elsewhere. Again this is a valid density, as h(y | 1) ≥ 0 and

   ∫_2^4 h(y | 1) dy = (1/4) ∫_2^4 (5 - y) dy = (1/4) [5y - y²/2]_2^4 = (1/4)(12 - 8) = 1.

Next,

   P(X < 1 | Y < 3) = P(X < 1, Y < 3) / P(Y < 3).

The numerator is 3/8 (computed in Example 19(b)). The denominator is

   P(Y < 3) = ∫_2^3 h(y) dy = (1/8) ∫_2^3 (10 - 2y) dy = (1/8) [10y - y²]_2^3 = (1/8)(21 - 16) = 5/8.

Hence P(X < 1 | Y < 3) = (3/8) / (5/8) = 3/5.
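The conditional probability just computed can be verified by direct double integration (an illustrative Python check using a midpoint Riemann sum; not in the original notes):

    # joint density f(x, y) = (1/8)(6 - x - y) on 0 < x < 2, 2 < y < 4
    f = lambda x, y: (6 - x - y) / 8

    def riemann_2d(f, x0, x1, y0, y1, n=400):
        hx, hy = (x1 - x0) / n, (y1 - y0) / n
        return sum(f(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
                   for i in range(n) for j in range(n)) * hx * hy

    num = riemann_2d(f, 0, 1, 2, 3)   # P(X < 1, Y < 3) = 3/8
    den = riemann_2d(f, 0, 2, 2, 3)   # P(Y < 3)        = 5/8
    print(round(num, 4), round(den, 4), round(num / den, 4))   # 0.375, 0.625, 0.6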
The Cumulative Distribution Function
Let f(x, y) be the joint density of (X, Y). We define the cumulative distribution function as

   F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du.
Example 21 (See Exercise 5.77 on page 180)
The joint prob density of X and Y is given by

   f(x, y) = (6/5)(x + y²)   for 0 < x < 1, 0 < y < 1
           = 0               elsewhere.

Find the cumulative distribution function F(x, y).

Solution
Case (i) x < 0.
   F(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du = 0, as f(u, v) = 0 for any u < 0.

Case (ii) y < 0. Again F(x, y) = 0, whatever be x.

Case (iii) 0 < x < 1, 0 < y < 1.
   F(x, y) = ∫_{u=0}^{x} ∫_{v=0}^{y} (6/5)(u + v²) dv du   (as f(u, v) = 0 for u < 0 or v < 0)
           = (6/5) ∫_{u=0}^{x} (uy + y³/3) du
           = (6/5) (x²y/2 + xy³/3).
Case (iv) 0 < x < 1, y ≥ 1.
   F(x, y) = ∫_{u=0}^{x} ∫_{v=0}^{1} (6/5)(u + v²) dv du = (6/5) ∫_{u=0}^{x} (u + 1/3) du
           = (6/5) (x²/2 + x/3).

Case (v) x ≥ 1, 0 < y < 1. As in case (iii) we can show

   F(x, y) = (6/5) (y/2 + y³/3).

Case (vi) x ≥ 1, y ≥ 1.
   F(x, y) = ∫_{u=0}^{1} ∫_{v=0}^{1} (6/5)(u + v²) dv du = (6/5) (1/2 + 1/3) = 1.

(Did you anticipate this?)

Hence P(0.2 < X < 0.5, 0.4 < Y < 0.6)
   = F(0.5, 0.6) - F(0.2, 0.6) - F(0.5, 0.4) + F(0.2, 0.4)   (why?)
   = (6/5) [(0.5)²(0.6)/2 + (0.5)(0.6)³/3] - (6/5) [(0.2)²(0.6)/2 + (0.2)(0.6)³/3]
     - (6/5) [(0.5)²(0.4)/2 + (0.5)(0.4)³/3] + (6/5) [(0.2)²(0.4)/2 + (0.2)(0.4)³/3]
   = (6/5) [(0.5² - 0.2²)(0.6 - 0.4)/2 + (0.5 - 0.2)(0.6³ - 0.4³)/3]
   = (6/5) [0.021 + 0.0152] = (6/5)(0.0362) = 0.04344

Example 22
The joint density of X and Y is

   f(x, y) = (6/5)(x + y²)   for 0 < x < 1, 0 < y < 1
           = 0               elsewhere.

(a) Find the conditional prob density g(x | y).
(b) Find g(x | 1/2).
(c) Find the mean of the conditional density of X given that Y = 1/2.

Solution
   g(x | y) = f(x, y) / h(y), where h(y) is the marginal density of Y.
Thus

   h(y) = ∫_{x=0}^{1} f(x, y) dx = (6/5) ∫_0^1 (x + y²) dx = (6/5)(1/2 + y²),   0 < y < 1.

Hence

   g(x | y) = f(x, y) / h(y) = (6/5)(x + y²) / [(6/5)(1/2 + y²)] = (x + y²) / (1/2 + y²)   for 0 < x < 1,

and 0 elsewhere. Therefore

   g(x | 1/2) = (x + 1/4) / (1/2 + 1/4) = (4/3)(x + 1/4),   0 < x < 1.

Hence

   E(X | Y = 1/2) = ∫_0^1 x g(x | 1/2) dx = ∫_0^1 x (4/3)(x + 1/4) dx
                  = [4x³/9 + x²/6]_0^1 = 4/9 + 1/6 = 11/18.
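A numerical check of this conditional mean (an illustrative Python sketch, not from the notes):

    # conditional density g(x | y = 1/2) = (4/3)(x + 1/4) on 0 < x < 1
    g = lambda x: (4 / 3) * (x + 0.25)

    n = 100000
    xs = [(i + 0.5) / n for i in range(n)]     # midpoints of (0, 1)
    total = sum(g(x) for x in xs) / n          # should integrate to 1
    mean  = sum(x * g(x) for x in xs) / n      # should be 11/18 ~ 0.6111
    print(round(total, 4), round(mean, 4), 11 / 18)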
Example 23
(X, Y) has a joint density which is uniform on the rhombus with vertices (1, 0), (0, 1), (-1, 0), (0, -1), i.e. the region |x| + |y| ≤ 1. Find
(a) the marginal density of X,
(b) the marginal density of Y,
(c) the conditional density of Y given X = 1/2.

Solution
(X, Y) has uniform density on the rhombus means

   f(x, y) = 1 / (area of the rhombus) = 1/2   over the rhombus,

and 0 elsewhere.

(a) Marginal density of X

Case (i) 0 < x < 1:  g(x) = ∫_{y = -(1-x)}^{1-x} (1/2) dy = 1 - x.
Case (ii) -1 < x < 0:  g(x) = ∫_{y = -(1+x)}^{1+x} (1/2) dy = 1 + x.

Thus

   g(x) = 1 + x   for -1 < x < 0
        = 1 - x   for 0 < x < 1
        = 0       elsewhere.

(b) By symmetry, the marginal density of Y is
   h(y) = 1 + y   for -1 < y < 0
        = 1 - y   for 0 < y < 1
        = 0       elsewhere.

(c) For x = 1/2, y ranges from -1/2 to 1/2. Thus the conditional density of Y for X = 1/2 is

   h(y | 1/2) = f(1/2, y) / g(1/2) = (1/2) / (1/2) = 1   for -1/2 < y < 1/2,

and 0 elsewhere. Similarly, for x = 1/3, Y ranges from -2/3 to 2/3, and

   h(y | 1/3) = f(1/3, y) / g(1/3) = (1/2) / (2/3) = 3/4   for -2/3 < y < 2/3,

and 0 elsewhere.
PROPERTIES OF EXPECTATION

Let X be a rv and a, b constants. Then
(a) E(aX + b) = a E(X) + b
(b) Var(aX + b) = a² Var(X).

If X_1, X_2, ..., X_n are any n rvs,

   E(X_1 + X_2 + ... + X_n) = E(X_1) + E(X_2) + ... + E(X_n).

But if X_1, ..., X_n are n independent rvs, then

   Var(X_1 + X_2 + ... + X_n) = Var(X_1) + Var(X_2) + ... + Var(X_n).

In particular, if X, Y are independent,

   Var(X + Y) = Var(X - Y) = Var(X) + Var(Y).

Please note: whether we add X and Y or subtract Y from X, we always must add their variances.

If X, Y are two rvs, we define their covariance as

   COV(X, Y) = E[(X - μ_1)(Y - μ_2)],   where μ_1 = E(X), μ_2 = E(Y).

Theorem. If X, Y are independent, then E(XY) = E(X) E(Y) and COV(X, Y) = 0.
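For the small table of Example 18 these rules are easy to verify by brute force (an illustrative Python sketch, not part of the notes). The nonzero covariance there is consistent with X and Y not being independent, while a table built as g(x)h(y) gives covariance 0:

    def expectations(table):
        """table: dict {(x, y): prob}. Returns E(X), E(Y), E(XY), COV(X, Y)."""
        ex  = sum(x * p for (x, y), p in table.items())
        ey  = sum(y * p for (x, y), p in table.items())
        exy = sum(x * y * p for (x, y), p in table.items())
        return ex, ey, exy, exy - ex * ey

    # joint table of Example 18
    joint = {(2, 2): 0.1, (0, 2): 0.2, (1, 2): 0.1,
             (2, 0): 0.05, (0, 0): 0.1, (1, 0): 0.15,
             (2, 1): 0.1, (0, 1): 0.1, (1, 1): 0.1}
    print(expectations(joint))     # COV = -0.035, so X and Y are dependent

    # an independent table with the same marginals: f(x, y) = g(x) h(y)
    gx = {2: 0.25, 0: 0.4, 1: 0.35}
    hy = {2: 0.4, 0: 0.3, 1: 0.3}
    indep = {(x, y): gx[x] * hy[y] for x in gx for y in hy}
    print(expectations(indep))     # COV = 0.0 (up to rounding)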