
Problem 15. * Suppose that a sequence Y_1, Y_2, ... of random variables converges to a real number c, with probability 1. Show that the sequence also converges to c in probability.

Solution. Let C be the event that the sequence of values of the random variables Y_n converges to c. By assumption, we have P(C) = 1. Fix some ε > 0, and let A_k be the event that |Y_n − c| < ε for every n ≥ k. If the sequence of values of the random variables Y_n converges to c, then there must exist some k such that for every n ≥ k, this sequence of values is within less than ε from c. Therefore, every element of C belongs to A_k for some k, or

C ⊂ ⋃_{k=1}^∞ A_k.

Note also that the sequence of events A_k is monotonically increasing, in the sense that A_k ⊂ A_{k+1} for all k. Finally, note that the event A_k is a subset of the event {|Y_k − c| < ε}. Therefore,

lim_{k→∞} P(|Y_k − c| < ε) ≥ lim_{k→∞} P(A_k) = P(⋃_{k=1}^∞ A_k) ≥ P(C) = 1,

where the first equality uses the continuity property of probabilities (Problem 13 in Chapter 1). It follows that

lim_{k→∞} P(|Y_k − c| ≥ ε) = 0,

which establishes convergence in probability.

Problem 16. * Consider a sequence Y_n of nonnegative random variables and suppose that

E[∑_{n=1}^∞ Y_n] < ∞.

Show that Y_n converges to 0, with probability 1.

Note: This result provides a commonly used method for establishing convergence with probability 1. To evaluate the expectation of ∑_{n=1}^∞ Y_n, one typically uses the formula

E[∑_{n=1}^∞ Y_n] = ∑_{n=1}^∞ E[Y_n].

The fact that the expectation and the infinite summation can be interchanged, for the case of nonnegative random variables, is known as the monotone convergence theorem, a fundamental result of probability theory, whose proof lies beyond the scope of this text.
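The interchange can be checked numerically on a concrete example. The following sketch (an illustration, not from the text) takes Y_n = U/2^n with U uniform on [0, 1], truncated at a large N; here ∑_n E[Y_n] = ∑_n 2^{-(n+1)} ≈ 1/2, and a Monte Carlo estimate of E[∑_n Y_n] agrees. The choice of distribution and all variable names are illustrative assumptions.

```python
import random

# Illustration of E[sum_n Y_n] = sum_n E[Y_n] for nonnegative Y_n.
# Take Y_n = U / 2^n with U ~ Uniform(0, 1), n = 1, ..., N (truncated tail).
random.seed(0)
N = 50            # truncation level; the tail beyond N is negligible (~2^-50)
trials = 20_000

# Right-hand side: sum of expectations, E[Y_n] = E[U] / 2^n = 1 / 2^(n+1).
sum_of_expectations = sum(0.5 / 2**n for n in range(1, N + 1))

# Left-hand side: Monte Carlo estimate of E[sum_n Y_n].
total = 0.0
for _ in range(trials):
    u = random.random()
    total += sum(u / 2**n for n in range(1, N + 1))
expectation_of_sum = total / trials

print(sum_of_expectations)   # ~0.5 exactly (up to the truncated tail)
print(expectation_of_sum)    # ~0.5 up to Monte Carlo error
```

Both quantities come out near 1/2, as the monotone convergence theorem predicts.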

Solution. We note that the infinite sum ∑_{n=1}^∞ Y_n must be finite, with probability 1. Indeed, if it had a positive probability of being infinite, then its expectation would also be infinite. But if the sum of the values of the random variables Y_n is finite, the sequence of these values must converge to zero. Since the probability of this event is equal to 1, it follows that the sequence Y_n converges to zero, with probability 1.
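A small simulation (an illustrative sketch, not part of the text) makes this concrete: take Y_n = |Z_n|/n² with Z_n independent standard normals, so that ∑_n E[Y_n] = E|Z| ∑_n 1/n² < ∞. Along every simulated path the late terms of the sequence are tiny, consistent with convergence to 0 with probability 1. The specific distribution is an assumption made for the demonstration.

```python
import random

# Sketch: Y_n = |Z_n| / n^2 with Z_n i.i.d. standard normal, so that
# E[sum_n Y_n] = E|Z| * sum_n 1/n^2 < infinity.  By the result above,
# Y_n -> 0 with probability 1; each simulated path should die out.
random.seed(0)
N = 1000
paths = 100

late_values = []
for _ in range(paths):
    y = [abs(random.gauss(0.0, 1.0)) / n**2 for n in range(1, N + 1)]
    late_values.append(max(y[900:]))   # largest Y_n over n = 901, ..., 1000

worst_late = max(late_values)
print(worst_late)   # tiny: |Z| rarely exceeds ~5, and 5/901^2 < 1e-5
```

Every path's tail is already far below any fixed ε, in line with almost-sure convergence.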

Problem 17. * Consider a sequence of Bernoulli random variables X_n, and let p_n = P(X_n = 1) be the probability of success in the nth trial. Assuming that ∑_{n=1}^∞ p_n < ∞, show that the number of successes is finite, with probability 1. [Compare with Problem 48(b) in Chapter 1.]

Solution. Using the monotone convergence theorem (see above note), we have

E[∑_{n=1}^∞ X_n] = ∑_{n=1}^∞ E[X_n] = ∑_{n=1}^∞ p_n < ∞.

This implies that

∑_{n=1}^∞ X_n < ∞,

with probability 1. We then note that the event {∑_{n=1}^∞ X_n < ∞} is the same as the event that there is a finite number of successes.
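This can be illustrated by simulation (a sketch under assumed parameters, not from the text): with p_n = 1/n² we have ∑_n p_n = π²/6 < ∞, and the expected total number of successes equals this sum by the interchange above. Averaging the success counts over many simulated sequences recovers that value.

```python
import random

# Sketch: Bernoulli trials with p_n = 1/n^2, so sum_n p_n = pi^2/6 < infinity.
# The total number of successes is then finite with probability 1, and its
# expectation equals sum_n p_n by the monotone convergence theorem.
random.seed(0)
N = 1000          # truncation of the infinite sequence (tail mass ~1/N)
trials = 5_000
probs = [1.0 / n**2 for n in range(1, N + 1)]

counts = []
for _ in range(trials):
    successes = sum(1 for p in probs if random.random() < p)
    counts.append(successes)

mean_successes = sum(counts) / trials
print(mean_successes)   # close to pi^2/6 ~ 1.645
```

Note that p_1 = 1 here, so every simulated sequence has at least one success, yet the totals stay small, reflecting that the number of successes is finite with probability 1.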

Problem 18. * The strong law of large numbers. Let X_1, X_2, ... be a sequence of independent identically distributed random variables and assume that E[X_1^4] < ∞. Prove the strong law of large numbers.

Solution. We note that the assumption E[X_1^4] < ∞ implies that the expected value of the X_i is finite. Indeed, using the inequality |x| ≤ 1 + x^4, we have

E[|X_i|] ≤ 1 + E[X_i^4] < ∞.
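The elementary bound |x| ≤ 1 + x⁴ (if |x| ≤ 1 then |x| ≤ 1; if |x| > 1 then |x| < x⁴) can be spot-checked numerically; the grid below is purely illustrative.

```python
# Quick numerical check of the elementary bound |x| <= 1 + x^4.
# For |x| <= 1 we have |x| <= 1, and for |x| > 1 we have |x| < x^4;
# the grid below spot-checks the bound over a range of values.
xs = [i / 100.0 for i in range(-500, 501)]   # x from -5.00 to 5.00
violations = [x for x in xs if abs(x) > 1 + x**4]
print(len(violations))   # 0: the bound holds at every grid point
```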

Let us assume first that E[X_i] = 0. We will show that

E[∑_{n=1}^∞ (X_1 + ⋯ + X_n)^4 / n^4] < ∞.

We have

E[(X_1 + ⋯ + X_n)^4] / n^4 = (1/n^4) ∑_{i_1=1}^n ∑_{i_2=1}^n ∑_{i_3=1}^n ∑_{i_4=1}^n E[X_{i_1} X_{i_2} X_{i_3} X_{i_4}].

Let us consider the various terms in this sum. If one of the indices is different from all of the other indices, the corresponding term is equal to zero. For example, if i_1 is different from i_2, i_3, and i_4, the assumption E[X_i] = 0 yields