2.8 SUMMARY AND DISCUSSION
Random variables provide the natural tools for dealing with probabilistic models in which the outcome determines certain numerical values of interest. In this
chapter, we focused on discrete random variables, and developed a conceptual framework and some relevant tools.
In particular, we introduced concepts such as the PMF, the mean, and the variance, which describe in various degrees of detail the probabilistic character of a discrete random variable. We showed how to use the PMF of a random variable X to calculate the mean and the variance of a related random variable Y = g(X), without calculating the PMF of Y. In the special case where g is a linear function, Y = aX + b, the means and the variances of X and Y are related by
E[Y] = aE[X] + b,
var(Y) = a^2 var(X).
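These two identities can be checked numerically. The sketch below uses a small, made-up PMF for X and made-up constants a and b; the helper names `mean` and `variance` are ours, not from the text.

```python
# Numerical check of E[aX + b] = aE[X] + b and var(aX + b) = a^2 var(X),
# using a small hypothetical PMF (values and probabilities are made up).

def mean(pmf):
    """Expected value of a discrete random variable given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Variance computed as E[(X - E[X])^2]."""
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

px = {1: 0.2, 2: 0.5, 3: 0.3}                # PMF of X
a, b = 4, -1
py = {a * x + b: p for x, p in px.items()}   # PMF of Y = aX + b

assert abs(mean(py) - (a * mean(px) + b)) < 1e-12
assert abs(variance(py) - a ** 2 * variance(px)) < 1e-12
```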
We also discussed several special random variables, and derived their PMF, mean, and variance, as summarized in the table that follows.
Summary of Results for Special Random Variables

Discrete Uniform over [a, b]:

p_X(k) = 1/(b - a + 1),  if k = a, a + 1, ..., b,
         0,              otherwise,

E[X] = (a + b)/2,
var(X) = (b - a)(b - a + 2)/12.
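A brute-force check of the uniform formulas, computing the mean and variance directly from the PMF over an illustrative interval (the endpoints 3 and 9 are arbitrary); exact rational arithmetic avoids rounding issues.

```python
# Brute-force check of the discrete uniform mean and variance formulas.
from fractions import Fraction

a, b = 3, 9  # illustrative endpoints
n = b - a + 1
pmf = {k: Fraction(1, n) for k in range(a, b + 1)}

mean = sum(k * p for k, p in pmf.items())
var = sum((k - mean) ** 2 * p for k, p in pmf.items())

assert mean == Fraction(a + b, 2)
assert var == Fraction((b - a) * (b - a + 2), 12)
```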
Bernoulli with Parameter p: (Describes the success or failure in a single trial.)

p_X(k) = p,      if k = 1,
         1 - p,  if k = 0,

E[X] = p,
var(X) = p(1 - p).
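A quick check of the Bernoulli mean and variance for an arbitrary, made-up value of p:

```python
# The Bernoulli mean and variance follow directly from the two-point PMF.
p = 0.3  # illustrative parameter
pmf = {0: 1 - p, 1: p}

mean = sum(k * q for k, q in pmf.items())               # equals p
var = sum((k - mean) ** 2 * q for k, q in pmf.items())  # equals p(1 - p)

assert abs(mean - p) < 1e-12
assert abs(var - p * (1 - p)) < 1e-12
```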
Binomial with Parameters p and n: (Describes the number of successes in n independent Bernoulli trials.)

p_X(k) = (n choose k) p^k (1 - p)^(n-k),  k = 0, 1, ..., n,

E[X] = np,
var(X) = np(1 - p).
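The binomial formulas can be verified by summing directly over the PMF; the parameters n = 10 and p = 0.25 below are made up for illustration.

```python
# Check that the binomial PMF sums to 1 and has mean np and variance np(1-p).
from math import comb

n, p = 10, 0.25  # illustrative parameters
pmf = {k: comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

assert abs(sum(pmf.values()) - 1) < 1e-12

mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean) ** 2 * q for k, q in pmf.items())

assert abs(mean - n * p) < 1e-12
assert abs(var - n * p * (1 - p)) < 1e-9
```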
Geometric with Parameter p: (Describes the number of trials until the
first success, in a sequence of independent Bernoulli trials. )
p_X(k) = (1 - p)^(k-1) p,  k = 1, 2, ...,

E[X] = 1/p,
var(X) = (1 - p)/p^2.
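The geometric PMF has infinite support, so a numerical check must truncate the sums at a point where the tail is negligible; p = 0.4 and the truncation point are made-up choices.

```python
# Truncated-sum check of the geometric mean and variance formulas.
p = 0.4   # illustrative parameter
K = 200   # truncation point; (1 - p)^K is negligible here

pmf = {k: (1 - p) ** (k - 1) * p for k in range(1, K + 1)}

mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean) ** 2 * q for k, q in pmf.items())

assert abs(mean - 1 / p) < 1e-9
assert abs(var - (1 - p) / p ** 2) < 1e-9
```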
Poisson with Parameter λ: (Approximates the binomial PMF when n is large, p is small, and λ = np.)

p_X(k) = e^(-λ) λ^k / k!,  k = 0, 1, ...,

E[X] = λ,
var(X) = λ.
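A numerical illustration of the approximation: for large n and small p, with λ = np held fixed, the binomial PMF is pointwise close to the Poisson PMF. The parameters n = 1000 and p = 0.005 below are made up for demonstration.

```python
# Pointwise comparison of the binomial PMF with its Poisson approximation.
from math import comb, exp, factorial

n, p = 1000, 0.005  # illustrative parameters
lam = n * p         # lam = 5

def binomial_pmf(k):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k):
    return exp(-lam) * lam ** k / factorial(k)

# Largest pointwise gap over the values where the PMFs are non-negligible.
max_gap = max(abs(binomial_pmf(k) - poisson_pmf(k)) for k in range(21))
assert max_gap < 2e-3
```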
We also considered multiple random variables, and introduced joint PMFs, conditional PMFs, and associated expected values. Conditional PMFs are often the starting point in probabilistic models and can be used to calculate other quantities of interest, such as marginal or joint PMFs and expectations, through a sequential or a divide-and-conquer approach. In particular, given the conditional PMF p_{X|Y}(x | y):
(a) The joint PMF can be calculated by
p_{X,Y}(x, y) = p_Y(y) p_{X|Y}(x | y).
This can be extended to the case of three or more random variables, as in

p_{X,Y,Z}(x, y, z) = p_Z(z) p_{Y|Z}(y | z) p_{X|Y,Z}(x | y, z),

and is analogous to the sequential tree-based calculation method using the multiplication rule, discussed in Chapter 1.
(b) The marginal PMF can be calculated by
p_X(x) = Σ_y p_Y(y) p_{X|Y}(x | y),
which generalizes the divide-and-conquer calculation method we discussed in Chapter 1.
(c) The divide-and-conquer calculation method in (b) above can be extended
to compute expected values using the total expectation theorem:
E[X] = Σ_y p_Y(y) E[X | Y = y].
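The three calculations (a)-(c) can be sketched together for a small illustrative model; the numbers in `p_y` and `p_x_given_y` below are made up.

```python
# (a) joint PMF, (b) marginal PMF, and (c) total expectation,
# all computed from a made-up p_Y and conditional PMF p_{X|Y}.

p_y = {0: 0.4, 1: 0.6}
p_x_given_y = {
    0: {1: 0.5, 2: 0.5},         # p_{X|Y}(x | 0)
    1: {1: 0.2, 2: 0.3, 3: 0.5}  # p_{X|Y}(x | 1)
}

# (a) p_{X,Y}(x, y) = p_Y(y) p_{X|Y}(x | y)
p_xy = {(x, y): p_y[y] * px[x]
        for y, px in p_x_given_y.items() for x in px}

# (b) p_X(x) = sum_y p_Y(y) p_{X|Y}(x | y)
p_x = {}
for (x, y), q in p_xy.items():
    p_x[x] = p_x.get(x, 0) + q

# (c) E[X] = sum_y p_Y(y) E[X | Y = y]
e_x = sum(p_y[y] * sum(x * q for x, q in px.items())
          for y, px in p_x_given_y.items())

assert abs(sum(p_x.values()) - 1) < 1e-12
# Total expectation agrees with the expectation under the marginal PMF.
assert abs(e_x - sum(x * q for x, q in p_x.items())) < 1e-12
```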
We introduced the notion of independence of random variables, in analogy with the notion of independence of events. Among other topics, we focused on random variables X obtained by adding several independent random variables X1, ..., Xn:

X = X1 + ... + Xn.
We argued that the mean and the variance of the sum are equal to the sum of the means and the sum of the variances, respectively:

E[X] = E[X1] + ... + E[Xn],
var(X) = var(X1) + ... + var(Xn).

The formula for the mean does not require independence of the Xi, but the formula for the variance does. The concepts and methods of this chapter extend appropriately to general random variables (see the next chapter), and are fundamental for our subject.
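As a final illustration, the additivity of the mean and the variance for a sum of independent random variables can be verified by convolving PMFs; the convolution encodes independence, and the individual PMFs below are made up.

```python
# Check that for independent X1, ..., Xn, the mean and variance of the sum
# equal the sums of the individual means and variances.

def convolve(pmf1, pmf2):
    """PMF of the sum of two independent discrete random variables."""
    out = {}
    for x1, p1 in pmf1.items():
        for x2, p2 in pmf2.items():
            out[x1 + x2] = out.get(x1 + x2, 0) + p1 * p2
    return out

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

# Three made-up PMFs for X1, X2, X3.
pmfs = [{0: 0.5, 1: 0.5}, {1: 0.3, 2: 0.7}, {0: 0.1, 5: 0.9}]

pmf_sum = pmfs[0]
for q in pmfs[1:]:
    pmf_sum = convolve(pmf_sum, q)

assert abs(mean(pmf_sum) - sum(mean(q) for q in pmfs)) < 1e-12
assert abs(variance(pmf_sum) - sum(variance(q) for q in pmfs)) < 1e-12
```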