
7.3 Moments and Moment-Generating Functions

  In this section, we concentrate on applications of moment-generating functions. The obvious purpose of the moment-generating function is to determine the moments of a random variable. Its most important contribution, however, is to establish the distributions of functions of random variables.

  If g(X) = X^r for r = 0, 1, 2, 3, . . . , Definition 7.1 yields an expected value called the rth moment about the origin of the random variable X, which we denote by μ'_r.

  Definition 7.1: The rth moment about the origin of the random variable X is given by

\[
\mu_r' = E(X^r) =
\begin{cases}
\sum\limits_{x} x^r f(x), & \text{if $X$ is discrete},\\[4pt]
\displaystyle\int_{-\infty}^{\infty} x^r f(x)\,dx, & \text{if $X$ is continuous}.
\end{cases}
\]

  Since the first and second moments about the origin are given by μ'_1 = E(X) and μ'_2 = E(X²), we can write the mean and variance of a random variable as

\[
\mu = \mu_1' \quad \text{and} \quad \sigma^2 = \mu_2' - \mu^2.
\]

  Although the moments of a random variable can be determined directly from Definition 7.1, an alternative procedure exists. This procedure requires us to utilize a moment-generating function.

  Definition 7.2: The moment-generating function of the random variable X is given by E(e^{tX}) and is denoted by M_X(t). Hence,

\[
M_X(t) = E(e^{tX}) =
\begin{cases}
\sum\limits_{x} e^{tx} f(x), & \text{if $X$ is discrete},\\[4pt]
\displaystyle\int_{-\infty}^{\infty} e^{tx} f(x)\,dx, & \text{if $X$ is continuous}.
\end{cases}
\]

  Moment-generating functions will exist only if the sum or integral of Definition 7.2 converges. If a moment-generating function of a random variable X does exist, it can be used to generate all the moments of that variable. The method is described in Theorem 7.6 without proof.

  Theorem 7.6: Let X be a random variable with moment-generating function M_X(t). Then

\[
\left.\frac{d^r M_X(t)}{dt^r}\right|_{t=0} = \mu_r'.
\]
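  To make Theorem 7.6 concrete, here is a small Python sketch (assuming the sympy library is available; the fair-die pmf is only an illustrative choice, not an example from the text). It builds M_X(t) directly from Definition 7.2 for a discrete random variable and recovers the first two moments by differentiating at t = 0.

    # Illustrative sketch: Definition 7.2 and Theorem 7.6 for a fair six-sided die.
    import sympy as sp

    t = sp.symbols('t')
    f = sp.Rational(1, 6)                       # f(x) = 1/6 for x = 1, ..., 6

    # Definition 7.2 (discrete case): M_X(t) = sum over x of e^{tx} f(x)
    M = sum(sp.exp(t * x) * f for x in range(1, 7))

    # Theorem 7.6: the rth derivative of M_X(t) at t = 0 gives mu'_r
    mu1 = sp.diff(M, t, 1).subs(t, 0)           # mu'_1 = E(X)   = 7/2
    mu2 = sp.diff(M, t, 2).subs(t, 0)           # mu'_2 = E(X^2) = 91/6

    print(mu1, mu2, sp.simplify(mu2 - mu1**2))  # mean, second moment, variance = 35/12

  The same two lines of differentiation apply to any moment-generating function, which is exactly how the examples below proceed.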

  Example 7.6: Find the moment-generating function of the binomial random variable X and then use it to verify that μ = np and σ² = npq.

  Solution: From Definition 7.2 we have

\[
M_X(t) = \sum_{x=0}^{n} e^{tx}\binom{n}{x} p^x q^{n-x} = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x q^{n-x}.
\]


  Recognizing this last sum as the binomial expansion of (pe^t + q)^n, we obtain

\[
M_X(t) = (pe^t + q)^n.
\]

  Now

\[
\frac{dM_X(t)}{dt} = n(pe^t + q)^{n-1}pe^t
\]

  and

\[
\frac{d^2 M_X(t)}{dt^2} = np\left[e^t(n-1)(pe^t + q)^{n-2}pe^t + (pe^t + q)^{n-1}e^t\right].
\]

  Setting t = 0, we get

\[
\mu_1' = np \quad \text{and} \quad \mu_2' = np[(n-1)p + 1].
\]

  Therefore,

\[
\mu = \mu_1' = np \quad \text{and} \quad \sigma^2 = \mu_2' - \mu^2 = np(1 - p) = npq,
\]

  which agrees with the results obtained in Chapter 5.
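  As a quick check of Example 7.6, the following Python sketch (sympy assumed) differentiates (pe^t + q)^n symbolically with q = 1 − p and confirms the mean and variance:

    # Sketch: symbolic verification of the binomial mean and variance from its MGF.
    import sympy as sp

    t, n, p = sp.symbols('t n p', positive=True)
    M = (p * sp.exp(t) + (1 - p)) ** n               # M_X(t) = (p e^t + q)^n with q = 1 - p

    mu1 = sp.simplify(sp.diff(M, t, 1).subs(t, 0))   # equals n*p
    mu2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # equals n*p*((n - 1)*p + 1)
    var = sp.simplify(mu2 - mu1 ** 2)                # equals n*p*(1 - p), i.e., npq

    print(mu1, mu2, var)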

  Example 7.7: Show that the moment-generating function of the random variable X having a normal probability distribution with mean μ and variance σ² is given by

\[
M_X(t) = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right).
\]

  Solution: From Definition 7.2 the moment-generating function of the normal random variable X is

\[
M_X(t) = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left[-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2\right]dx
        = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left[-\frac{x^2 - 2(\mu + t\sigma^2)x + \mu^2}{2\sigma^2}\right]dx.
\]

  Completing the square in the exponent, we can write

\[
x^2 - 2(\mu + t\sigma^2)x + \mu^2 = [x - (\mu + t\sigma^2)]^2 - 2\mu t\sigma^2 - t^2\sigma^4
\]

  and then

\[
M_X(t) = \exp\left(\frac{2\mu t\sigma^2 + t^2\sigma^4}{2\sigma^2}\right)
         \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left\{-\frac{[x - (\mu + t\sigma^2)]^2}{2\sigma^2}\right\}dx.
\]

  Let w = [x − (μ + tσ²)]/σ; then dx = σ dw and

\[
M_X(t) = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right)\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-w^2/2}\,dw
       = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right),
\]


  since the last integral represents the area under a standard normal density curve and hence equals 1.
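  The closed form in Example 7.7 is easy to verify numerically. The sketch below (Python with numpy and scipy assumed; the values of μ, σ, and t are arbitrary test choices) integrates e^{tx} against the normal density and compares the result with exp(μt + σ²t²/2).

    # Sketch: numerical check of the normal moment-generating function.
    import numpy as np
    from scipy.integrate import quad

    mu, sigma, t = 1.5, 2.0, 0.3        # arbitrary parameters for the check

    def integrand(x):
        # e^{tx} times the N(mu, sigma^2) density
        return np.exp(t * x) * np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

    numeric, _ = quad(integrand, -np.inf, np.inf)
    closed_form = np.exp(mu * t + 0.5 * sigma ** 2 * t ** 2)

    print(numeric, closed_form)         # the two values agree to integration tolerance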

  Although the method of transforming variables provides an effective way of finding the distribution of a function of several variables, there is an alternative and often preferred procedure when the function in question is a linear combination of independent random variables. This procedure utilizes the properties of moment-generating functions discussed in the following four theorems. In keeping with the mathematical scope of this book, we state Theorem 7.7 without proof.

  Theorem 7.7: (Uniqueness Theorem) Let X and Y be two random variables with moment-generating functions M_X(t) and M_Y(t), respectively. If M_X(t) = M_Y(t) for all values of t, then X and Y have the same probability distribution.

  Theorem 7.8: M_{X+a}(t) = e^{at} M_X(t).

  Proof: M_{X+a}(t) = E[e^{t(X+a)}] = e^{at} E(e^{tX}) = e^{at} M_X(t).

  Theorem 7.9: M_{aX}(t) = M_X(at).

  Proof: M_{aX}(t) = E[e^{t(aX)}] = E[e^{(at)X}] = M_X(at).
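  Theorems 7.8 and 7.9 can be illustrated with the normal moment-generating function of Example 7.7: adding a constant a shifts the mean by a, while scaling by a multiplies the mean by a and the variance by a². The Python sketch below (sympy assumed) checks both identities symbolically.

    # Sketch: Theorems 7.8 and 7.9 applied to the normal MGF exp(mu*t + sigma^2*t^2/2).
    import sympy as sp

    t, a, mu, sigma = sp.symbols('t a mu sigma', real=True)
    M = sp.exp(mu * t + sp.Rational(1, 2) * sigma**2 * t**2)

    # Theorem 7.8: e^{a t} M_X(t) is the MGF of a normal with mean mu + a, variance sigma^2
    ratio_7_8 = sp.simplify(sp.exp(a * t) * M / M.subs(mu, mu + a))

    # Theorem 7.9: M_X(a t) is the MGF of a normal with mean a*mu and variance a^2*sigma^2
    ratio_7_9 = sp.simplify(M.subs(t, a * t) / M.subs([(mu, a * mu), (sigma, a * sigma)]))

    print(ratio_7_8, ratio_7_9)         # both ratios simplify to 1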

  Theorem 7.10: If X_1, X_2, . . . , X_n are independent random variables with moment-generating functions M_{X_1}(t), M_{X_2}(t), . . . , M_{X_n}(t), respectively, and Y = X_1 + X_2 + · · · + X_n, then

\[
M_Y(t) = M_{X_1}(t)\,M_{X_2}(t)\cdots M_{X_n}(t).
\]

  The proof of Theorem 7.10 is left for the reader.

  Theorems 7.7 through 7.10 are vital for understanding moment-generating functions. An example follows to illustrate. There are many situations in which we need to know the distribution of the sum of random variables. We may use Theorems 7.7 and 7.10 and the result of Exercise 7.19 on page 224 to find the distribution of a sum of two independent Poisson random variables with moment-generating functions given by

\[
M_{X_1}(t) = e^{\mu_1(e^t - 1)} \quad \text{and} \quad M_{X_2}(t) = e^{\mu_2(e^t - 1)},
\]

  respectively. According to Theorem 7.10, the moment-generating function of the random variable Y_1 = X_1 + X_2 is

\[
M_{Y_1}(t) = M_{X_1}(t)\,M_{X_2}(t) = e^{\mu_1(e^t - 1)}\,e^{\mu_2(e^t - 1)} = e^{(\mu_1 + \mu_2)(e^t - 1)},
\]

  which we immediately identify as the moment-generating function of a random variable having a Poisson distribution with the parameter μ_1 + μ_2. Hence, according to Theorem 7.7, we again conclude that the sum of two independent random variables having Poisson distributions, with parameters μ_1 and μ_2, has a Poisson distribution with parameter μ_1 + μ_2.
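  The same conclusion can be seen empirically. The Python sketch below (numpy and scipy assumed; the parameter values and sample size are arbitrary illustrative choices) simulates two independent Poisson variables and compares the relative frequencies of their sum with the Poisson(μ_1 + μ_2) probability mass function.

    # Sketch: Monte Carlo check that the sum of independent Poissons is Poisson.
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(0)
    mu1, mu2, size = 2.0, 3.5, 200_000                    # arbitrary illustrative values

    y = rng.poisson(mu1, size) + rng.poisson(mu2, size)   # samples of Y1 = X1 + X2

    for k in range(10):
        empirical = np.mean(y == k)                       # relative frequency of Y1 = k
        theoretical = poisson.pmf(k, mu1 + mu2)           # Poisson(mu1 + mu2) mass at k
        print(k, round(empirical, 4), round(theoretical, 4))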


  Linear Combinations of Random Variables

  In applied statistics one frequently needs to know the probability distribution of a linear combination of independent normal random variables. Let us obtain the distribution of the random variable Y = a_1X_1 + a_2X_2 when X_1 is a normal variable with mean μ_1 and variance σ_1² and X_2 is also a normal variable, independent of X_1, with mean μ_2 and variance σ_2². First, by Theorem 7.10, we find

\[
M_Y(t) = M_{a_1 X_1}(t)\,M_{a_2 X_2}(t),
\]

  and then, using Theorem 7.9, we find

\[
M_Y(t) = M_{X_1}(a_1 t)\,M_{X_2}(a_2 t).
\]

  Substituting a_1t for t and then a_2t for t in the moment-generating function of the normal distribution derived in Example 7.7, we have

\[
M_Y(t) = \exp\left[\left(a_1\mu_1 t + \tfrac{1}{2}a_1^2\sigma_1^2 t^2\right) + \left(a_2\mu_2 t + \tfrac{1}{2}a_2^2\sigma_2^2 t^2\right)\right]
       = \exp\left[(a_1\mu_1 + a_2\mu_2)t + \tfrac{1}{2}\left(a_1^2\sigma_1^2 + a_2^2\sigma_2^2\right)t^2\right],
\]

  which we recognize as the moment-generating function of a distribution that is normal with mean a_1μ_1 + a_2μ_2 and variance a_1²σ_1² + a_2²σ_2².

  Generalizing to the case of n independent normal variables, we state the following result.

  Theorem 7.11: If X_1, X_2, . . . , X_n are independent random variables having normal distributions with means μ_1, μ_2, . . . , μ_n and variances σ_1², σ_2², . . . , σ_n², respectively, then the random variable

\[
Y = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n
\]

  has a normal distribution with mean

\[
\mu_Y = a_1\mu_1 + a_2\mu_2 + \cdots + a_n\mu_n
\]

  and variance

\[
\sigma_Y^2 = a_1^2\sigma_1^2 + a_2^2\sigma_2^2 + \cdots + a_n^2\sigma_n^2.
\]
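  A quick simulation (Python with numpy assumed; the coefficients, means, and standard deviations below are arbitrary illustrative values) shows the sample mean and variance of such a linear combination matching the formulas of Theorem 7.11.

    # Sketch: Monte Carlo check of Theorem 7.11 for Y = a1*X1 + a2*X2 + a3*X3.
    import numpy as np

    rng = np.random.default_rng(1)
    a = np.array([2.0, -1.0, 0.5])                 # coefficients a_i
    mu = np.array([1.0, 4.0, -2.0])                # means mu_i
    sigma = np.array([1.5, 0.5, 2.0])              # standard deviations sigma_i

    x = rng.normal(mu, sigma, size=(500_000, 3))   # independent normal samples
    y = x @ a

    print(y.mean(), a @ mu)                        # sample mean vs. a1*mu1 + a2*mu2 + a3*mu3
    print(y.var(), (a**2) @ (sigma**2))            # sample variance vs. sum of a_i^2 * sigma_i^2

  The simulation checks the mean and variance; the normality of Y itself can be examined with a histogram or a normal probability plot.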

  It is now evident that the Poisson distribution and the normal distribution possess a reproductive property in that the sum of independent random variables having either of these distributions is a random variable that also has the same type of distribution. The chi-squared distribution also has this reproductive property.

  Theorem 7.12: If X_1, X_2, . . . , X_n are mutually independent random variables that have, respectively, chi-squared distributions with v_1, v_2, . . . , v_n degrees of freedom, then the random variable

\[
Y = X_1 + X_2 + \cdots + X_n
\]

  has a chi-squared distribution with v = v_1 + v_2 + · · · + v_n degrees of freedom.

  Proof: By Theorem 7.10 and Exercise 7.21,

\[
M_Y(t) = M_{X_1}(t)\,M_{X_2}(t)\cdots M_{X_n}(t) \quad \text{and} \quad M_{X_i}(t) = (1 - 2t)^{-v_i/2}, \quad i = 1, 2, \ldots, n.
\]


  Therefore,

\[
M_Y(t) = (1 - 2t)^{-v_1/2}(1 - 2t)^{-v_2/2}\cdots(1 - 2t)^{-v_n/2} = (1 - 2t)^{-(v_1 + v_2 + \cdots + v_n)/2},
\]

  which we recognize as the moment-generating function of a chi-squared distribution with v = v_1 + v_2 + · · · + v_n degrees of freedom.

  Corollary 7.1: If X_1, X_2, . . . , X_n are independent random variables having identical normal distributions with mean μ and variance σ², then the random variable

\[
Y = \sum_{i=1}^{n} \frac{(X_i - \mu)^2}{\sigma^2}
\]

  has a chi-squared distribution with v = n degrees of freedom.

  This corollary is an immediate consequence of Example 7.5. It establishes a relationship between the very important chi-squared distribution and the normal distribution. It also should provide the reader with a clear idea of what we mean by the parameter that we call degrees of freedom. In future chapters, the notion of degrees of freedom will play an increasingly important role.

  Corollary 7.2: If X_1, X_2, . . . , X_n are independent random variables and X_i follows a normal distribution with mean μ_i and variance σ_i² for i = 1, 2, . . . , n, then the random variable

\[
Y = \sum_{i=1}^{n} \frac{(X_i - \mu_i)^2}{\sigma_i^2}
\]

  has a chi-squared distribution with v = n degrees of freedom.
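  Corollary 7.1 can likewise be checked by simulation. The Python sketch below (numpy and scipy assumed; μ, σ, n, and the sample size are arbitrary illustrative values) forms the sum of squared standardized normal variables and compares a few of its empirical quantiles with those of the chi-squared distribution with n degrees of freedom.

    # Sketch: Monte Carlo check of Corollary 7.1.
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(2)
    mu, sigma, n = 5.0, 2.0, 4                  # arbitrary illustrative values

    x = rng.normal(mu, sigma, size=(300_000, n))
    y = (((x - mu) / sigma) ** 2).sum(axis=1)   # Y = sum of ((X_i - mu)/sigma)^2

    # Compare empirical quantiles of Y with chi-squared(n) quantiles
    for q in (0.25, 0.5, 0.75, 0.95):
        print(q, round(np.quantile(y, q), 3), round(chi2.ppf(q, df=n), 3))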

  Exercises

  7.1 Let X be a random variable with probability distribution

  Find the probability distribution of the random variable Y = 2X − 1.

  7.2 Let X be a binomial random variable with probability distribution

  Find the probability distribution of the random variable Y = X².

  7.3 Let X_1 and X_2 be discrete random variables with the joint multinomial distribution

  for x_1 = 0, 1, 2; x_2 = 0, 1, 2; x_1 + x_2 ≤ 2; and zero elsewhere. Find the joint probability distribution of Y_1 = X_1 + X_2 and Y_2 = X_1 − X_2.

  7.4 Let X_1 and X_2 be discrete random variables with joint probability distribution

  Find the probability distribution of the random variable Y = X_1 X_2.


  7.5 Let X have the probability distribution f(x) = 1 for 0 < x < 1, and f(x) = 0 elsewhere. Show that the random variable Y = −2 ln X has a chi-squared distribution with 2 degrees of freedom.

  7.6 Given the random variable X with probability distribution f(x) = 2x for 0 < x < 1, and f(x) = 0 elsewhere, find the probability distribution of Y = 8X³.

  7.7 The speed of a molecule in a uniform gas at equilibrium is a random variable V whose probability distribution is given by f(v) = kv²e^{−bv²} for v > 0, where k is an appropriate constant and b depends on the absolute temperature and mass of the molecule. Find the probability distribution of the kinetic energy of the molecule W, where W = mV²/2.

  7.8 A dealer's profit, in units of $5000, on a new automobile is given by Y = X², where X is a random variable having the density function f(x) = 2(1 − x) for 0 < x < 1, and f(x) = 0 elsewhere.
  (a) Find the probability density function of the random variable Y.
  (b) Using the density function of Y, find the probability that the profit on the next new automobile sold by this dealership will be less than $500.

  7.9 The hospital period, in days, for patients following treatment for a certain type of kidney disorder is a random variable Y = X + 4, where X has the density function f(x) = 32/(x + 4)³ for x > 0, and f(x) = 0 elsewhere.
  (a) Find the probability density function of the random variable Y.
  (b) Using the density function of Y, find the probability that the hospital period for a patient following this treatment will exceed 8 days.

  7.10 The random variables X and Y, representing the weights of creams and toffees, respectively, in 1-kilogram boxes of chocolates containing a mixture of creams, toffees, and cordials, have the joint density function f(x, y) = 24xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, x + y ≤ 1, and f(x, y) = 0 elsewhere.
  (a) Find the probability density function of the random variable Z = X + Y.
  (b) Using the density function of Z, find the probability that, in a given box, the sum of the weights of creams and toffees accounts for at least 1/2 but less than 3/4 of the total weight.

  7.11 The amount of kerosene, in thousands of liters, in a tank at the beginning of any day is a random amount Y from which a random amount X is sold during that day. Assume that the joint density function of these variables is given by

  Find the probability density function for the amount of kerosene left in the tank at the end of the day.

  7.12 Let X_1 and X_2 be independent random variables each having the probability distribution f(x) = e^{−x} for x > 0, and f(x) = 0 elsewhere. Show that the random variables Y_1 and Y_2 are independent when Y_1 = X_1 + X_2 and Y_2 = X_1/(X_1 + X_2).

  7.13 A current of I amperes flowing through a resistance of R ohms varies according to the probability distribution f(i) = 6i(1 − i) for 0 < i < 1, and f(i) = 0 elsewhere. If the resistance varies independently of the current according to the probability distribution g(r) = 2r for 0 < r < 1, and g(r) = 0 elsewhere, find the probability distribution for the power W = I²R watts.

  7.14 Let X be a random variable with probability distribution

  Find the probability distribution of the random variable Y = X².


  7.15 Let X have the probability distribution f(x) = 2(x + 1)/9 for −1 < x < 2, and f(x) = 0 elsewhere. Find the probability distribution of the random variable Y = X².

  7.16 Show that the rth moment about the origin of the gamma distribution is

  μ'_r = β^r Γ(α + r) / Γ(α).

  [Hint: Substitute y = x/β in the integral defining μ'_r and then use the gamma function to evaluate the integral.]

  7.17 A random variable X has the discrete uniform distribution f(x; k) = 1/k for x = 1, 2, . . . , k, and f(x; k) = 0 elsewhere. Show that the moment-generating function of X is

  M_X(t) = e^t(1 − e^{kt}) / [k(1 − e^t)].

  7.18 A random variable X has the geometric distribution g(x; p) = pq^{x−1} for x = 1, 2, 3, . . . . Show that the moment-generating function of X is

  M_X(t) = pe^t / (1 − qe^t),

  and then use M_X(t) to find the mean and variance of the geometric distribution.

  7.19 A random variable X has the Poisson distribution p(x; μ) = e^{−μ}μ^x/x! for x = 0, 1, 2, . . . . Show that the moment-generating function of X is

  M_X(t) = e^{μ(e^t − 1)}.

  Using M_X(t), find the mean and variance of the Poisson distribution.

  7.20 The moment-generating function of a certain Poisson random variable X is given by

  M_X(t) = e^{4(e^t − 1)}.

  Find P(μ − 2σ < X < μ + 2σ).

  7.21 Show that the moment-generating function of the random variable X having a chi-squared distribution with v degrees of freedom is

  M_X(t) = (1 − 2t)^{−v/2}.

  7.22 Using the moment-generating function of Exercise 7.21, show that the mean and variance of the chi-squared distribution with v degrees of freedom are, respectively, v and 2v.

  7.23 If both X and Y, distributed independently, follow exponential distributions with mean parameter 1, find the distributions of
  (a) U = X + Y;
  (b) V = X/(X + Y).

  7.24 By expanding e^{tx} in a Maclaurin series and integrating term by term, show that

  M_X(t) = ∫_{−∞}^{∞} e^{tx} f(x) dx = 1 + μt + μ'_2 t²/2! + · · · + μ'_r t^r/r! + · · · .
