Axioms
Axiom 1: 0 ≤ P[A]
It states that the probability (mass) is nonnegative.
Axiom 2: P[S] = 1
It states that there is a fixed total amount of probability (mass),
namely 1 unit.
Axiom 3: If A ∩ B = ∅, then P[A ∪ B] = P[A] + P[B]
It states that the total probability (mass) in two disjoint objects
is the sum of the individual probabilities (masses).
Axiom 4: If A_1, A_2, … is a sequence of pairwise disjoint events, then
P[⋃_{k=1}^∞ A_k] = ∑_{k=1}^∞ P[A_k]
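For a fair six-sided die, for example, Axiom 3 gives P[{1} ∪ {2}] = P[{1}] + P[{2}] = 1/6 + 1/6 = 1/3, since the two outcomes are disjoint.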
Corollaries
P[A^c] = 1 - P[A], since 1 = P[S] = P[A^c] + P[A]
P[A] ≤ 1;  P[∅] = 0
P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
If A ⊂ B, then P[A] ≤ P[B]
P[A] = P[A ∩ B] + P[A ∩ B^c]
Computing probabilities by counting methods
 Sampling with Replacement and with Ordering:
Number of distinct ordered k-tuples = n^k
 Sampling without Replacement and with Ordering:
Number of distinct ordered k-tuples = n(n - 1) ⋯ (n - k + 1)
Permutations of n Distinct Objects
Number of permutations of n objects = n(n - 1)(n - 2) ⋯ (2)(1) ≜ n! (We
refer to n! as n factorial.)
 Sampling without Replacement and without Ordering
Number of ways of choosing k objects out of n =
n(n - 1) ⋯ (n - k + 1) / k!
= n! / (k!(n - k)!) ≜ C(n, k), the binomial coefficient ("n choose k")
 Sampling with Replacement and without Ordering
Number of distinct selections = C(n - 1 + k, k) = (n - 1 + k)! / (k!(n - 1)!)
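A minimal sketch checking the five counting formulas with Python's math module; the values of n and k are illustrative, not from the sheet.

# Verify each counting formula by direct evaluation.
from math import comb, factorial, perm

n, k = 5, 3
print(n ** k)              # with replacement, with ordering: n^k = 125
print(perm(n, k))          # without replacement, with ordering: n!/(n-k)! = 60
print(factorial(n))        # permutations of n distinct objects: n! = 120
print(comb(n, k))          # without replacement, without ordering: C(n,k) = 10
print(comb(n - 1 + k, k))  # with replacement, without ordering: C(n-1+k,k) = 35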
Conditional probability
P[A|B] = P[A ∩ B] / P[B], defined for P[B] > 0
Addition Rule:
P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
If A & B are mutually exclusive, then P[A ∪ B] = P[A] + P[B]
Multiplication Rule:
P[A ∩ B] = P[A|B] P[B] = P[B|A] P[A]
If A & B are independent, then P[A ∩ B] = P[A] P[B]
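As a quick illustration, draw one card from a standard 52-card deck with A = "heart" and B = "face card": P[A ∪ B] = 13/52 + 12/52 - 3/52 = 22/52, and P[A ∩ B] = 3/52 = (13/52)(12/52) = P[A] P[B], so suit and rank are independent.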
Probability Laws
Total probability law:
P[A] = P[A ∩ B] + P[A ∩ B^c]  OR
P[A] = P[A|B] P[B] + P[A|B^c] P[B^c]  OR
P[A] = P[A|B_1] P[B_1] + ⋯ + P[A|B_n] P[B_n], where B_1, …, B_n partition S
Bayes’ Law
P[B|A] = P[A ∩ B] / P[A] = P[A|B] P[B] / P[A]
For a partition B_1, …, B_n of S:
P[B_j|A] = P[A ∩ B_j] / P[A] = P[A|B_j] P[B_j] / ∑_k P[A|B_k] P[B_k]
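A minimal sketch of the total probability law and Bayes' law in Python; the prior and the two conditional probabilities below are made-up illustrative numbers, not values from the sheet.

# B = condition of interest, A = observed evidence (illustrative numbers).
p_B = 0.01           # P[B]: prior
p_A_given_B = 0.95   # P[A|B]
p_A_given_Bc = 0.10  # P[A|B^c]

# Total probability law: P[A] = P[A|B]P[B] + P[A|B^c]P[B^c]
p_A = p_A_given_B * p_B + p_A_given_Bc * (1 - p_B)

# Bayes' law: P[B|A] = P[A|B]P[B] / P[A]
p_B_given_A = p_A_given_B * p_B / p_A
print(p_A, p_B_given_A)  # ≈ 0.1085 and ≈ 0.0876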
Probability definitions:
A & B are mutually exclusive if P[A ∩ B] = 0
A & B are independent if P[A|B] = P[A] & P[B|A] = P[B] &
P[A ∩ B] = P[A] P[B]
Discrete Probability Distributions
Expected value or mean of a discrete random variable X:
m_X = E[X] = ∑_k x_k p_X(x_k) = ∑_k x_k P[X = x_k]
Variance of the random variable X:
VAR[X] = E[(X - m_X)^2] = ∑_k (x_k - m_X)^2 p_X(x_k)
= E[X^2] - m_X^2
Standard deviation of the random variable X:
STD[X] = VAR[X]^(1/2)
Bernoulli (equals one if the event A occurs, and zero otherwise):
pmf: p_0 = 1 - p, p_1 = p;  values of x: 0, 1;  E[X] = p;  VAR[X] = p(1 - p)
Binomial (number of successes in n fixed trials):
pmf: p_k = C(n, k) p^k (1 - p)^(n-k);  values of x: 0, 1, …, n;  E[X] = np;  VAR[X] = np(1 - p)
Geometric (number of trials up through the 1st success):
pmf: p_k = p(1 - p)^(k-1);  values of x: 1, 2, …;  E[X] = 1/p;  VAR[X] = (1 - p)/p^2
(counting failures before the 1st success instead: p_k = p(1 - p)^k, k = 0, 1, …, with E[X] = (1 - p)/p)
Uniform (the L outcomes are equally likely):
pmf: p_k = 1/L;  values of x: 1, 2, …, L;  E[X] = (L + 1)/2;  VAR[X] = (L^2 - 1)/12
Poisson (number of events that occur in a fixed time period):
pmf: p_k = α^k e^(-α) / k!;  values of x: 0, 1, 2, …;  E[X] = α;  VAR[X] = α
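A small sketch that checks the binomial row of this table by direct summation over its pmf; n and p are illustrative values.

# E[X] and VAR[X] by brute-force summation over the binomial pmf.
from math import comb

n, p = 10, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())             # E[X] = ∑ k p_X(k)
var = sum((k - mean)**2 * pk for k, pk in pmf.items())  # VAR[X] = ∑ (k-m)^2 p_X(k)
print(mean, var)  # ≈ 3.0 and 2.1, matching np and np(1-p)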
Cumulative distribution function: F_X(x) = P[X ≤ x], for -∞ < x < +∞
Properties
1. 0 ≤ F_X(x) ≤ 1.
2. lim_{x→∞} F_X(x) = 1.
3. lim_{x→-∞} F_X(x) = 0.
4. F_X(x) is a non-decreasing function of x, that is, if
a < b, then F_X(a) ≤ F_X(b).
5. F_X(x) is continuous from the right, that is, for h > 0,
F_X(b) = lim_{h→0} F_X(b + h) ≜ F_X(b^+).
6. P[a < X ≤ b] = F_X(b) - F_X(a)
7. P[X = b] = F_X(b) - F_X(b^-)
8. P[X > x] = 1 - F_X(x)
CDF for continuous/discrete random variable:
F_X(x) = ∫_{-∞}^{x} f_X(t) dt (continuous);  F_X(x) = ∑_k p_X(x_k) u(x - x_k) (discrete, with u the unit step function)
Probability density function of X:
f_X(x) = dF_X(x)/dx
For a discrete random variable: f_X(x) = ∑_k p_X(x_k) δ(x - x_k)
Properties:
f_X(x) ≥ 0;  P[a ≤ X ≤ b] = ∫_a^b f_X(x) dx;
F_X(x) = ∫_{-∞}^{x} f_X(t) dt;  1 = ∫_{-∞}^{+∞} f_X(x) dx
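A small numeric check of these properties, using an exponential density f_X(x) = 2e^(-2x) on x ≥ 0 and a crude Riemann sum; the rate and interval are illustrative.

# Riemann-sum check that the pdf integrates to 1 and gives interval probabilities.
from math import exp

lam, dx = 2.0, 1e-4
xs = [i * dx for i in range(int(10.0 / dx))]       # truncate the range at x = 10
total = sum(lam * exp(-lam * x) * dx for x in xs)  # ≈ ∫ f_X(x) dx = 1
prob = sum(lam * exp(-lam * x) * dx for x in xs if 0.5 <= x <= 1.0)
print(total)  # ≈ 1.0
print(prob)   # ≈ P[0.5 ≤ X ≤ 1] = e^{-1} - e^{-2} ≈ 0.2325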
Expected value or mean of a random variable X:
E[X] = ∫_{-∞}^{+∞} x f_X(x) dx
Variance of the random variable X:
VAR[X] = E[(X - E[X])^2] = E[X^2] - E[X]^2
The nth moment of the random variable X:
E[X^n] = ∫_{-∞}^{+∞} x^n f_X(x) dx
Properties of Expected value and Variance:
E[c] = c;  E[X + c] = E[X] + c;  E[cX] = c E[X];
VAR[c] = 0;  VAR[X + c] = VAR[X];  VAR[cX] = c^2 VAR[X];
E[aX + b] = a E[X] + b
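For instance, if E[X] = 2 and VAR[X] = 4, then E[3X + 5] = 3(2) + 5 = 11 and VAR[3X + 5] = 3^2 (4) = 36; the additive constant shifts the mean but not the spread.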
Uniform (outcomes with equal density over [a, b]):
f_X(x) = 1/(b - a) for a ≤ x ≤ b, 0 otherwise
F_X(x) = 0 for x < a;  (x - a)/(b - a) for a ≤ x ≤ b;  1 for x > b
E[X] = (a + b)/2;  VAR[X] = (b - a)^2/12
Exponential (time until an event):
f_X(x) = λ e^(-λx) for x ≥ 0, 0 otherwise
F_X(x) = 1 - e^(-λx) for x ≥ 0;  0 for x < 0
E[X] = 1/λ;  VAR[X] = 1/λ^2
Gaussian (normal), arising when the number of summed components becomes large:
f_X(x) = (1/(√(2π) σ)) e^(-(x - m)^2/(2σ^2)),  -∞ < x < +∞
E[X] = m;  VAR[X] = σ^2
Where the standard normal CDF is
Φ(x) = (1/√(2π)) ∫_{-∞}^{x} e^(-t^2/2) dt;  Q(x) ≜ 1 - Φ(x) & Φ(-x) = 1 - Φ(x);
P[a < X ≤ b] = Φ((b - m)/σ) - Φ((a - m)/σ)
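A hedged sketch of Gaussian interval probabilities via the identity Φ(x) = (1 + erf(x/√2))/2; m, σ, a, and b are illustrative values.

# P[a < X ≤ b] for a Gaussian using the standard normal CDF.
from math import erf, sqrt

def phi(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))  # standard normal CDF Φ(x)

m, sigma = 5.0, 2.0
a, b = 3.0, 7.0
# P[a < X ≤ b] = Φ((b - m)/σ) - Φ((a - m)/σ)
print(phi((b - m) / sigma) - phi((a - m) / sigma))  # ≈ 0.6827 (±1σ around the mean)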
Bounds for probability
 Markov inequality states that, for a nonnegative random variable X,
P[X ≥ a] ≤ E[X]/a
 Chebyshev inequality states that
P[|X - m| ≥ a] ≤ σ^2/a^2, where σ^2 is the variance of X and m = E[X]
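A quick numeric comparison of both bounds against exact tail probabilities for an exponential(λ = 1) variable, where E[X] = 1 and VAR[X] = 1; the threshold a is illustrative.

# Markov and Chebyshev bounds versus exact exponential(1) probabilities.
from math import exp

a = 3.0
exact_tail = exp(-a)       # P[X ≥ a] = e^{-a} for exponential(1)
markov = 1.0 / a           # Markov bound: E[X]/a
print(exact_tail, markov)  # ≈ 0.0498 ≤ 0.3333: the bound holds but is loose

# Chebyshev: P[|X - 1| ≥ a] ≤ VAR[X]/a^2 = 1/a^2
chebyshev = 1.0 / a**2
exact_dev = exp(-(1.0 + a))   # P[|X-1| ≥ 3] = P[X ≥ 4] = e^{-4}, since X ≥ 0
print(exact_dev, chebyshev)   # ≈ 0.0183 ≤ 0.1111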
Pairs of Random Variables
Joint probability mass function:
p_{X,Y}(x_j, y_k) = P[{X = x_j} ∩ {Y = y_k}] ≜ P[X = x_j, Y = y_k];  ∑_j ∑_k p_{X,Y}(x_j, y_k) = 1
Marginal probability mass functions:
p_X(x_j) = P[X = x_j] = P[{X = x_j} ∩ {Y = anything}]
= ∑_k p_{X,Y}(x_j, y_k), and similarly for p_Y(y_k).
Joint CDF of X and Y: F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]
Properties:
 ( ) ( )
 ( ) ( ) ( )  ( ) ( ) ( ) ( )
 , - ( ) ( ) ( ) ( )
Joint PDF of two continuous Random Variables:
f_{X,Y}(x, y) = ∂^2 F_{X,Y}(x, y) / ∂x ∂y
Marginal pdf: f_X(x) = ∫_{-∞}^{+∞} f_{X,Y}(x, y) dy and f_Y(y) = ∫_{-∞}^{+∞} f_{X,Y}(x, y) dx
Example: Let f_{X,Y}(x, y) = 1 for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 elsewhere. Its joint CDF:
1. If x < 0 or y < 0, then the CDF F_{X,Y}(x, y) = 0.
2. If (x, y) is inside the unit square,
F_{X,Y}(x, y) = ∫_0^x ∫_0^y 1 dy' dx' = xy
3. Similarly, if 0 ≤ y ≤ 1 and x > 1, F_{X,Y}(x, y) = y.
4. If 0 ≤ x ≤ 1 and y > 1,
F_{X,Y}(x, y) = ∫_0^x ∫_0^1 1 dy' dx' = x
5. Finally, if x > 1 and y > 1,
F_{X,Y}(x, y) = ∫_0^1 ∫_0^1 1 dy' dx' = 1
Independence of two variables:
If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmfs:
p_{X,Y}(x_j, y_k) = p_X(x_j) p_Y(y_k). Likewise F_{X,Y}(x, y) = F_X(x) F_Y(y), and for jointly continuous X and Y, f_{X,Y}(x, y) = f_X(x) f_Y(y).
Expected Value of a Function of Two Random Variables:
E[g(X, Y)] =
∫∫ g(x, y) f_{X,Y}(x, y) dx dy  (X, Y jointly continuous)
∑_i ∑_n g(x_i, y_n) p_{X,Y}(x_i, y_n)  (X, Y discrete)
Joint moment of X and Y:
E[X^j Y^k] =
∫∫ x^j y^k f_{X,Y}(x, y) dx dy  (X, Y jointly continuous)
∑_i ∑_n x_i^j y_n^k p_{X,Y}(x_i, y_n)  (X, Y discrete)
If E[XY]=0, then we say that X and Y are orthogonal.
Covariance of X and Y: COV(X,Y)=E[XY]-E[X]E[Y]
Pairs of independent random variables have covariance zero.
Correlation coefficient of X and Y:
ρ_{X,Y} = COV(X, Y) / (σ_X σ_Y), where σ_X = √VAR(X) and σ_Y = √VAR(Y)
X and Y are said to be uncorrelated if ρ_{X,Y} = 0 (equivalently, COV(X, Y) = 0).
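A minimal sketch computing COV(X, Y) and ρ from a small joint pmf stored as a dict {(x, y): p}; the pmf values are illustrative, not from the sheet.

# Covariance and correlation coefficient by summation over a joint pmf.
from math import sqrt

pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex  = sum(x * p for (x, y), p in pmf.items())      # E[X]
ey  = sum(y * p for (x, y), p in pmf.items())      # E[Y]
exy = sum(x * y * p for (x, y), p in pmf.items())  # E[XY]
vx  = sum((x - ex)**2 * p for (x, y), p in pmf.items())  # VAR(X)
vy  = sum((y - ey)**2 * p for (x, y), p in pmf.items())  # VAR(Y)

cov = exy - ex * ey                 # COV(X,Y) = E[XY] - E[X]E[Y]
rho = cov / (sqrt(vx) * sqrt(vy))   # correlation coefficient ρ
print(cov, rho)                     # 0.15 and 0.6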
Conditional Probability:
p_Y(y|x) = P[Y = y | X = x] = p_{X,Y}(x, y) / p_X(x)  (discrete)
f_Y(y|x) = f_{X,Y}(x, y) / f_X(x)  (continuous)
so that p_{X,Y}(x, y) = p_Y(y|x) p_X(x) and f_{X,Y}(x, y) = f_Y(y|x) f_X(x)
Conditional Expectation:
E[Y|x] = ∫_{-∞}^{+∞} y f_Y(y|x) dy  (continuous)
E[Y|x] = ∑_y y p_Y(y|x)  (discrete)
Central Limit Theorem
Let S_n be the sum of n iid random variables with finite mean E[X] = μ and finite variance σ^2, and let Z_n be the zero-
mean, unit-variance random variable defined by
Z_n = (S_n - nμ) / (σ√n)
Then lim_{n→∞} P[Z_n ≤ z] = Φ(z), that is, the distribution of Z_n approaches that of a standard normal random variable.
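A hedged simulation sketch of the CLT: standardize the sum of n iid uniform(0, 1) draws (μ = 1/2, σ^2 = 1/12) and compare a tail probability with Φ(1) ≈ 0.8413; n and the trial count are illustrative.

# Monte Carlo check that Z_n = (S_n - nμ)/(σ√n) is close to standard normal.
import random
from math import sqrt

n, trials = 30, 20000
mu, sigma = 0.5, sqrt(1.0 / 12.0)

count = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))  # S_n
    z = (s - n * mu) / (sigma * sqrt(n))        # Z_n
    if z <= 1.0:
        count += 1

print(count / trials)  # ≈ Φ(1) ≈ 0.8413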
For more visit tricntip.blogspot.com