Moment Generating Functions
EDWIN OKOAMPA BOADU
DEFINITION 2.3.3. Let X be a random variable with cdf F_X. The moment generating function (mgf) of X (or F_X), denoted M_X(t), is

M_X(t) = E[e^{tX}],

provided that the expectation exists for t in some neighborhood of 0. That is, there is an h > 0 such that, for all t in -h < t < h, E[e^{tX}] exists.
If the expectation does not exist in a neighborhood of 0, we say that the moment generating function does not exist. More explicitly, the moment generating function can be defined as:

M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx   if X is continuous,

M_X(t) = \sum_x e^{tx} P(X = x)   if X is discrete.
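For a concrete illustration, a minimal SymPy sketch, using an Exponential(\lambda) variable as a hypothetical example (not from the slides), computes M_X(t) = E[e^{tX}] directly from the definition; the integral converges only for t < \lambda, so the expectation exists in a neighborhood of 0.

    import sympy as sp

    t, x = sp.symbols('t x', real=True)
    lam = sp.symbols('lambda', positive=True)

    # Exponential(lambda) density on [0, oo), chosen purely for illustration
    pdf = lam * sp.exp(-lam * x)

    # M_X(t) = E[e^{tX}]; conds='none' suppresses the convergence condition t < lambda
    M = sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo), conds='none')
    print(sp.simplify(M))  # lambda/(lambda - t), finite for all t < lambda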
Theorem 2.3.2: If X has mgf M_X(t), then

E[X^n] = M_X^{(n)}(0),

where we define

M_X^{(n)}(0) = \frac{d^n}{dt^n} M_X(t) \Big|_{t=0}.

That is, the n-th moment equals the n-th derivative of M_X(t) evaluated at t = 0.
First note that e^{tx} can be approximated around zero using a Taylor series expansion:

M_X(t) = E[e^{tX}] = E\left[1 + tX + \frac{(tX)^2}{2!} + \frac{(tX)^3}{3!} + \cdots\right]
       = 1 + t E[X] + \frac{t^2}{2!} E[X^2] + \frac{t^3}{3!} E[X^3] + \cdots

Note for any moment n: differentiating the series n times eliminates every term below t^n and leaves E[X^n] plus terms that still carry positive powers of t. Thus, as t \to 0,

\frac{d^n}{dt^n} M_X(t) \to E[X^n].
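As a quick check of Theorem 2.3.2, the following SymPy sketch differentiates a known mgf at t = 0 (the Exponential(1) mgf M(t) = 1/(1 - t) is assumed here as a convenient test case, for which E[X^n] = n!):

    import sympy as sp

    t = sp.symbols('t')
    M = 1 / (1 - t)  # mgf of an Exponential(1) variable, valid for t < 1

    # the n-th derivative at t = 0 recovers the n-th moment, E[X^n] = n!
    for n in range(1, 5):
        print(n, sp.diff(M, t, n).subs(t, 0))  # 1, 2, 6, 24

    # equivalently, the Taylor coefficients of M are E[X^n]/n!
    print(sp.series(M, t, 0, 5))  # 1 + t + t**2 + t**3 + t**4 + O(t**5)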
Leibnitz's Rule: If f(x, \theta), a(\theta), and b(\theta) are differentiable with respect to \theta, then

\frac{d}{d\theta} \int_{a(\theta)}^{b(\theta)} f(x, \theta) \, dx = f(b(\theta), \theta) \frac{d}{d\theta} b(\theta) - f(a(\theta), \theta) \frac{d}{d\theta} a(\theta) + \int_{a(\theta)}^{b(\theta)} \frac{\partial}{\partial \theta} f(x, \theta) \, dx.
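The rule can be verified symbolically on a concrete case; in the sketch below, f(x, \theta) = e^{\theta x} with limits a(\theta) = \theta and b(\theta) = \theta^2 are arbitrary hypothetical choices.

    import sympy as sp

    x, th = sp.symbols('x theta', positive=True)
    f = sp.exp(th * x)  # f(x, theta), differentiable in theta
    a, b = th, th**2    # differentiable limits a(theta), b(theta)

    lhs = sp.diff(sp.integrate(f, (x, a, b)), th)
    rhs = (f.subs(x, b) * sp.diff(b, th)
           - f.subs(x, a) * sp.diff(a, th)
           + sp.integrate(sp.diff(f, th), (x, a, b)))
    print(sp.simplify(lhs - rhs))  # 0: both sides of Leibnitz's rule agree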
Casella and Berger's proof: Assuming that we can differentiate under the integral sign using Leibnitz's rule, we have

\frac{d}{dt} M_X(t) = \frac{d}{dt} \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx = \int_{-\infty}^{\infty} \frac{\partial}{\partial t} e^{tx} f_X(x) \, dx = \int_{-\infty}^{\infty} x e^{tx} f_X(x) \, dx.

Letting t \to 0, this integral simply becomes

\int_{-\infty}^{\infty} x f_X(x) \, dx = E[X].

This proof can be extended to any moment of the distribution function.
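The interchange of differentiation and integration can also be checked for a concrete density; the sketch below (again assuming an Exponential(1) density as the example) confirms that dM/dt equals the integral of the differentiated integrand, and that it returns E[X] at t = 0.

    import sympy as sp

    t, x = sp.symbols('t x', real=True)
    f = sp.exp(-x)  # Exponential(1) density on [0, oo)

    M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')          # 1/(1 - t)
    inner = sp.integrate(x * sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')  # differentiated inside

    print(sp.simplify(sp.diff(M, t) - inner))  # 0: the interchange is valid here
    print(inner.subs(t, 0))                    # 1 = E[X] for Exponential(1)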
Moment Generating Functions for Specific Distributions

Application to the Uniform Distribution:

M_X(t) = \int_a^b \frac{e^{tx}}{b - a} \, dx = \frac{e^{tb} - e^{ta}}{t(b - a)}.

Following the expansion developed earlier, we have:

M_X(t) = \frac{1}{b - a} \sum_{k=1}^{\infty} \frac{t^{k-1}(b^k - a^k)}{k!} = 1 + \frac{a + b}{2} t + \frac{a^2 + ab + b^2}{6} t^2 + \cdots

Letting b = 1 and a = 0, the last expression becomes:

M_X(t) = 1 + \frac{t}{2} + \frac{t^2}{6} + \frac{t^3}{24} + \cdots

The first three moments of the uniform distribution are then:

E[X] = M_X'(0) = \frac{1}{2}, \quad E[X^2] = M_X''(0) = \frac{1}{3}, \quad E[X^3] = M_X'''(0) = \frac{1}{4}.
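These moments can be reproduced mechanically from the closed form; in the sketch below, limits are taken rather than substituting t = 0 because the singularity of (e^t - 1)/t at zero is removable.

    import sympy as sp

    t = sp.symbols('t')
    M = (sp.exp(t) - 1) / t  # mgf of Uniform(0, 1), from the integral above

    for n in (1, 2, 3):
        print(n, sp.limit(sp.diff(M, t, n), t, 0))  # 1/2, 1/3, 1/4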
 
Application to the Univariate Normal Distribution

M_X(t) = \int_{-\infty}^{\infty} e^{tx} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right) dx.

Focusing on the term in the exponent, we have

tx - \frac{(x - \mu)^2}{2\sigma^2} = -\frac{x^2 - 2\mu x - 2\sigma^2 t x + \mu^2}{2\sigma^2}.

The next step is to complete the square in the numerator:

x^2 - 2(\mu + \sigma^2 t) x + \mu^2 = \left(x - (\mu + \sigma^2 t)\right)^2 - 2\mu\sigma^2 t - \sigma^4 t^2.

The complete expression then becomes:

tx - \frac{(x - \mu)^2}{2\sigma^2} = -\frac{\left(x - (\mu + \sigma^2 t)\right)^2}{2\sigma^2} + \mu t + \frac{\sigma^2 t^2}{2}.

The moment generating function then becomes:

M_X(t) = e^{\mu t + \sigma^2 t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{\left(x - (\mu + \sigma^2 t)\right)^2}{2\sigma^2}\right) dx = e^{\mu t + \sigma^2 t^2/2},

since the remaining integrand is the density of a normal with mean \mu + \sigma^2 t and variance \sigma^2, which integrates to 1.
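The closed form can be cross-checked by evaluating the defining integral directly; a minimal SymPy sketch (assuming SymPy resolves the Gaussian integral, which it does when \sigma is declared positive):

    import sympy as sp

    t, x, mu = sp.symbols('t x mu', real=True)
    sigma = sp.symbols('sigma', positive=True)

    pdf = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sp.sqrt(2 * sp.pi) * sigma)
    M = sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo), conds='none')
    print(sp.simplify(M))  # exp(mu*t + sigma**2*t**2/2)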
Taking the first derivative with respect to t, we get:

M_X'(t) = (\mu + \sigma^2 t)\, e^{\mu t + \sigma^2 t^2/2}.

Letting t \to 0, this becomes:

M_X'(0) = \mu = E[X].

The second derivative of the moment generating function with respect to t yields:

M_X''(t) = \sigma^2 e^{\mu t + \sigma^2 t^2/2} + (\mu + \sigma^2 t)^2 e^{\mu t + \sigma^2 t^2/2}.

Again, letting t \to 0 yields

M_X''(0) = \sigma^2 + \mu^2 = E[X^2].
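Both results follow mechanically from the closed form, as the short sketch below confirms:

    import sympy as sp

    t, mu = sp.symbols('t mu', real=True)
    sigma = sp.symbols('sigma', positive=True)

    M = sp.exp(mu * t + sigma**2 * t**2 / 2)  # normal mgf derived above

    print(sp.diff(M, t).subs(t, 0))                # mu = E[X]
    print(sp.expand(sp.diff(M, t, 2).subs(t, 0)))  # mu**2 + sigma**2 = E[X^2]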
Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Consider their sum Z = X + Y and its moment generating function:

M_Z(t) = E[e^{tZ}] = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}] = E[e^{tX}]\, E[e^{tY}] = M_X(t)\, M_Y(t),

where the expectation factors because X and Y are independent. We conclude that the moment generating function of the sum of two independent random variables is the product of the moment generating functions of each variable.
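As an application, the sketch below multiplies the mgfs of two independent normals (the parameter names mu1, sigma1, mu2, sigma2 are illustrative) and recovers the mgf of a normal with mean \mu_1 + \mu_2 and variance \sigma_1^2 + \sigma_2^2, showing that the sum of independent normals is again normal.

    import sympy as sp

    t = sp.symbols('t')
    m1, m2 = sp.symbols('mu1 mu2', real=True)
    s1, s2 = sp.symbols('sigma1 sigma2', positive=True)

    MX = sp.exp(m1 * t + s1**2 * t**2 / 2)  # X ~ N(mu1, sigma1^2)
    MY = sp.exp(m2 * t + s2**2 * t**2 / 2)  # Y ~ N(mu2, sigma2^2)

    # the product is the mgf of N(mu1 + mu2, sigma1^2 + sigma2^2)
    print(sp.powsimp(MX * MY))  # exp(mu1*t + mu2*t + sigma1**2*t**2/2 + sigma2**2*t**2/2)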
Skipping ahead slightly, the multivariate normal density function can be written as:

f(x) = (2\pi)^{-k/2} |\Sigma|^{-1/2} \exp\left(-\frac{1}{2}(x - \mu)' \Sigma^{-1} (x - \mu)\right),

where \Sigma is the variance matrix and \mu is a vector of means.
In order to derive the moment generating function, we now need a vector t. The moment generating function can then be defined as:

M_X(t) = E\left[e^{t'X}\right] = \exp\left(t'\mu + \frac{1}{2} t' \Sigma t\right).

Normal variables are independent if the variance matrix is diagonal. Note that if the variance matrix is diagonal, the moment generating function for the normal can be written as:

M_X(t) = \exp\left(\sum_{i=1}^{k}\left(t_i \mu_i + \frac{1}{2}\sigma_i^2 t_i^2\right)\right) = \prod_{i=1}^{k} \exp\left(t_i \mu_i + \frac{1}{2}\sigma_i^2 t_i^2\right),

the product of k univariate normal moment generating functions.
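The factorization is easy to confirm in the bivariate case; the sketch below builds t'\mu + t'\Sigma t / 2 with a diagonal \Sigma (symbol names are illustrative) and checks that the joint mgf equals the product of the two univariate mgfs.

    import sympy as sp

    t1, t2, m1, m2 = sp.symbols('t1 t2 mu1 mu2', real=True)
    s1, s2 = sp.symbols('sigma1 sigma2', positive=True)

    t = sp.Matrix([t1, t2])
    mu = sp.Matrix([m1, m2])
    Sigma = sp.diag(s1**2, s2**2)  # diagonal variance matrix

    # joint mgf: exp(t'mu + t'Sigma t / 2)
    M = sp.exp((t.T * mu + t.T * Sigma * t / 2)[0, 0])

    # product of the univariate normal mgfs
    M1 = sp.exp(m1 * t1 + s1**2 * t1**2 / 2)
    M2 = sp.exp(m2 * t2 + s2**2 * t2**2 / 2)

    print(sp.simplify(M - M1 * M2))  # 0: the joint mgf factors across components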
 
