
Chapter 7 TWO RANDOM VARIABLES

There are many random experiments that involve more than one random variable. For example, an educator may study the joint behavior of grades and time devoted to study; a physician may study the joint behavior of blood pressure and weight. Similarly, an economist may study the joint behavior of business volume and profit. In fact, most real problems we come across will have more than one underlying random variable of interest.

7.1. Bivariate Discrete Random Variables

In this section, we develop all the necessary terminology for studying bivariate discrete random variables.

Definition 7.1. A discrete bivariate random variable (X, Y) is an ordered pair of discrete random variables.

Definition 7.2. Let (X, Y) be a bivariate random variable and let R_X and R_Y be the range spaces of X and Y, respectively. A real-valued function f : R_X × R_Y → IR is called a joint probability density function for X and Y if and only if

f(x, y) = P(X = x, Y = y) for all (x, y) ∈ R_X × R_Y.

Here, the event (X = x, Y = y) means the intersection of the events (X = x) and (Y = y), that is, (X = x) ∩ (Y = y).

  Example 7.1. Roll a pair of unbiased dice. If X denotes the smaller and Y denotes the larger outcome on the dice, then what is the joint probability density function of X and Y ?


Answer: The sample space S of rolling two dice consists of the 36 equally likely ordered pairs (a, b), where a and b each range over {1, 2, 3, 4, 5, 6}.

The probability density function f(x, y) can be computed for X = 2 and Y = 3 as follows. There are two outcomes, namely (2, 3) and (3, 2), in the sample space S of 36 outcomes which contribute to the joint event (X = 2, Y = 3). Hence

f(2, 3) = P(X = 2, Y = 3) = 2/36.

Similarly, we can compute the rest of the probabilities. Tabulated over x, y = 1, 2, ..., 6, these values can be written as

f(x, y) = 1/36 if 1 ≤ x = y ≤ 6, 2/36 if 1 ≤ x < y ≤ 6, and 0 otherwise.
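As a quick computational check (a minimal sketch using only the Python standard library, not part of the original solution), the following code enumerates the 36 equally likely rolls and tabulates the joint probability density function of X = min and Y = max.

```python
# Enumerate the 36 equally likely outcomes of two dice and tabulate
# f(x, y) = P(X = x, Y = y) for X = smaller face, Y = larger face.
from fractions import Fraction
from itertools import product

f = {}
for a, b in product(range(1, 7), repeat=2):
    x, y = min(a, b), max(a, b)
    f[(x, y)] = f.get((x, y), Fraction(0)) + Fraction(1, 36)

print(f[(2, 3)])          # 1/18, i.e. 2/36, matching the computation above
print(sum(f.values()))    # 1, so f is a genuine joint probability density function
```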

  Example 7.2. A group of 9 executives of a certain firm include 4 who are married, 3 who never married, and 2 who are divorced. Three of the


  executives are to be selected for promotion. Let X denote the number of married executives and Y the number of never married executives among the 3 selected for promotion. Assuming that the three are randomly selected from the nine available, what is the joint probability density function of the random variables X and Y ?

Answer: The number of ways we can choose 3 executives out of 9 is C(9, 3) = 84, and all selections are equally likely. The event (X = x, Y = y) occurs when we pick x of the 4 married, y of the 3 never married, and 3 − x − y of the 2 divorced executives, which can be done in C(4, x) C(3, y) C(2, 3 − x − y) ways. Hence the joint probability density function of X and Y is

f(x, y) = C(4, x) C(3, y) C(2, 3 − x − y) / 84

for nonnegative integers x and y with 1 ≤ x + y ≤ 3, and 0 otherwise.
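These probabilities can also be reproduced by brute-force enumeration; the sketch below (standard library only, with the executives labelled by marital status) is one way to do it and is not part of the original solution.

```python
# Enumerate all 84 equally likely selections of 3 executives out of 9 and
# tabulate f(x, y), where x counts the married and y the never-married ones.
from fractions import Fraction
from itertools import combinations

group = ['M'] * 4 + ['N'] * 3 + ['D'] * 2      # 4 married, 3 never married, 2 divorced

f = {}
for chosen in combinations(range(9), 3):
    x = sum(group[i] == 'M' for i in chosen)
    y = sum(group[i] == 'N' for i in chosen)
    f[(x, y)] = f.get((x, y), Fraction(0)) + Fraction(1, 84)

for (x, y), p in sorted(f.items()):
    print(x, y, p)          # e.g. f(1, 1) = 24/84 = 2/7
```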

Definition 7.3. Let (X, Y) be a discrete bivariate random variable. Let R_X and R_Y be the range spaces of X and Y, respectively. Let f(x, y) be the joint probability density function of X and Y. The function

f_1(x) = Σ_{y ∈ R_Y} f(x, y)


is called the marginal probability density function of X. Similarly, the function

f_2(y) = Σ_{x ∈ R_X} f(x, y)

is called the marginal probability density function of Y.

The following diagram illustrates the concept of a marginal graphically.

[Figure: a joint density together with the marginal density obtained from it.]

Example 7.3. If the joint probability density function of the discrete random variables X and Y is given by

f(x, y) = 1/36 if 1 ≤ x = y ≤ 6, 2/36 if 1 ≤ x < y ≤ 6, and 0 otherwise,

then what are the marginals of X and Y?

Answer: The marginal of X can be obtained by summing the joint probability density function f(x, y) over all y values in the range space R_Y of the random variable Y. That is,

f_1(x) = Σ_{y ∈ R_Y} f(x, y)
       = f(x, x) + Σ_{y > x} f(x, y) + Σ_{y < x} f(x, y)
       = 1/36 + (6 − x) · 2/36 + 0
       = (1/36) [13 − 2x],    x = 1, 2, ..., 6.


Similarly, one can obtain the marginal probability density of Y by summing f(x, y) over all x values in the range space R_X of the random variable X. Hence

f_2(y) = Σ_{x ∈ R_X} f(x, y)
       = f(y, y) + Σ_{x < y} f(x, y)
       = 1/36 + (y − 1) · 2/36
       = (1/36) [2y − 1],    y = 1, 2, ..., 6.
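Both marginal formulas can be verified by direct summation; the short sketch below (not part of the original text) reuses the joint density of Example 7.1.

```python
# Verify the marginals (13 - 2x)/36 and (2y - 1)/36 by summing the joint pmf.
from fractions import Fraction

def f(x, y):                                   # joint density of (min, max) of two dice
    if 1 <= x == y <= 6:
        return Fraction(1, 36)
    if 1 <= x < y <= 6:
        return Fraction(2, 36)
    return Fraction(0)

for x in range(1, 7):
    assert sum(f(x, y) for y in range(1, 7)) == Fraction(13 - 2 * x, 36)
for y in range(1, 7):
    assert sum(f(x, y) for x in range(1, 7)) == Fraction(2 * y - 1, 36)
print("marginals agree with (13 - 2x)/36 and (2y - 1)/36")
```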

Example 7.4. Let X and Y be discrete random variables with joint probability density function

  What are the marginal probability density functions of X and Y ? Answer: The marginal of X is given by

  Similarly, the marginal of Y is given by

From the above examples, note that the marginal f_1(x) is obtained by summing across the columns. Similarly, the marginal f_2(y) is obtained by summing across the rows.


  The following theorem follows from the definition of the joint probability density function.

Theorem 7.1. A real-valued function f of two variables is a joint probability density function of a pair of discrete random variables X and Y (with range spaces R_X and R_Y, respectively) if and only if

(a) f(x, y) ≥ 0 for all (x, y) ∈ R_X × R_Y, and

(b) Σ_{x ∈ R_X} Σ_{y ∈ R_Y} f(x, y) = 1.
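Conditions (a) and (b) are easy to check mechanically. The helper below is a small sketch (not from the text) for a joint density given as a table mapping (x, y) to its probability.

```python
# Check Theorem 7.1 for a tabulated joint density: nonnegativity and total mass 1.
from fractions import Fraction

def is_joint_pdf(table):
    nonnegative = all(p >= 0 for p in table.values())   # condition (a)
    sums_to_one = sum(table.values()) == 1               # condition (b)
    return nonnegative and sums_to_one

# The dice density of Example 7.1 passes; a mis-specified table does not.
dice = {(x, y): (Fraction(1, 36) if x == y else Fraction(2, 36))
        for x in range(1, 7) for y in range(x, 7)}
print(is_joint_pdf(dice))                        # True
print(is_joint_pdf({(0, 0): Fraction(1, 2)}))    # False: total mass is only 1/2
```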

Example 7.5. For what value of the constant k is the function given by

a joint probability density function of some random variables X and Y?

Answer: Since

  and the corresponding density function is given by

  As in the case of one random variable, there are many situations where one wants to know the probability that the values of two random variables are less than or equal to some real numbers x and y.


Definition 7.4. Let X and Y be any two discrete random variables. The real-valued function F : IR² → IR is called the joint cumulative probability distribution function of X and Y if and only if

F(x, y) = P(X ≤ x, Y ≤ y)

for all (x, y) ∈ IR². Here, the event (X ≤ x, Y ≤ y) means (X ≤ x) ∩ (Y ≤ y).

  From this definition it can be shown that for any real numbers a and b

  Further, one can also show that

  where (s, t) is any pair of nonnegative numbers.
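For a discrete pair, the joint cumulative distribution function of Definition 7.4 can be computed directly by summing the joint density over all points at or below (x, y); the sketch below (not from the text) does this for the dice density of Example 7.1.

```python
# F(x, y) = P(X <= x, Y <= y) obtained by summing the joint pmf.
from fractions import Fraction

def f(u, v):                                   # joint density of (min, max) of two dice
    if 1 <= u == v <= 6:
        return Fraction(1, 36)
    if 1 <= u < v <= 6:
        return Fraction(2, 36)
    return Fraction(0)

def F(x, y):
    return sum(f(u, v)
               for u in range(1, 7) for v in range(1, 7)
               if u <= x and v <= y)

print(F(2, 3))     # 8/36 = 2/9
print(F(6, 6))     # 1
```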

7.2. Bivariate Continuous Random Variables

In this section, we shall extend the idea of probability density functions of one random variable to that of two random variables.

Definition 7.5. The joint probability density function of the random variables X and Y is an integrable function f(x, y) such that

(a) f(x, y) ≥ 0 for all (x, y) ∈ IR², and

(b) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.

  Example 7.6. Let the joint density function of X and Y be given by

  What is the value of the constant k ?


  Answer: Since f is a joint probability density function, we have

  Hence k = 10.

If we know the joint probability density function f of the random variables X and Y, then we can compute the probability of an event A from

P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy.

  Example 7.7. Let the joint density of the continuous random variables X and Y be

What is the probability of the event (X ≤ Y)?


Answer: Let A = (X ≤ Y). We want to find

Definition 7.6. Let (X, Y) be a continuous bivariate random variable. Let f(x, y) be the joint probability density function of X and Y. The function

f_1(x) = ∫_{−∞}^{∞} f(x, y) dy

is called the marginal probability density function of X. Similarly, the function

f_2(y) = ∫_{−∞}^{∞} f(x, y) dx

is called the marginal probability density function of Y.

Example 7.8. If the joint density function for X and Y is given by

  < 4 for 0 < y 2

  f (x, y) =

  : 0 otherwise,

then what is the marginal density function of X for 0 < x < 1?

Answer: The domain of f consists of the region bounded by the curve x = y² and the vertical line x = 1.


  Example 7.9. Let X and Y have joint density function

  What is the marginal density of X where nonzero?


  Answer: The marginal density of X is given by

Example 7.10. Let (X, Y) be distributed uniformly on the circular disk centered at (0, 0) with radius 2/√π. What is the marginal density function of X where it is nonzero?

Answer: The equation of a circle with radius 2/√π and center at the origin is

x² + y² = 4/π.

Hence, solving this equation for y, we get

y = ±√(4/π − x²).

The area of the disk is π (2/√π)² = 4, so the joint density equals 1/4 on the disk. Thus, the marginal density of X is given by

f_1(x) = ∫_{−√(4/π − x²)}^{√(4/π − x²)} (1/4) dy
       = (1/2) √(4/π − x²),    −2/√π < x < 2/√π.
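The same marginal can be obtained symbolically; the sketch below uses sympy and assumes the radius 2/√π read off above.

```python
# Marginal of X for the uniform density 1/4 on the disk x**2 + y**2 <= 4/pi.
from sympy import symbols, integrate, sqrt, pi, Rational

x, y = symbols('x y', real=True)
half_chord = sqrt(4 / pi - x**2)               # |y| runs up to sqrt(4/pi - x**2)

f1 = integrate(Rational(1, 4), (y, -half_chord, half_chord))
print(f1)                                       # equals sqrt(4/pi - x**2)/2

total = integrate(f1, (x, -2 / sqrt(pi), 2 / sqrt(pi)))
print(total)                                    # 1, confirming f1 integrates to one
```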

Definition 7.7. Let X and Y be continuous random variables with joint probability density function f(x, y). The joint cumulative distribution function F(x, y) of X and Y is defined as

F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(u, v) du dv

for all (x, y) ∈ IR².

From the fundamental theorem of calculus, we again obtain

f(x, y) = ∂²F(x, y) / ∂x ∂y.

Example 7.11. If the joint cumulative distribution function of X and Y is given by

F(x, y) = (1/5) (2x³y + 3x²y²) for 0 ≤ x, y ≤ 1,

then what is the joint density of X and Y?

Answer: Differentiating the joint cumulative distribution function, for 0 < x, y < 1 we get

∂²F(x, y) / ∂x ∂y = ∂²/∂x ∂y [(1/5) (2x³y + 3x²y²)]
                  = ∂/∂x [(1/5) (2x³ + 6x²y)]
                  = (1/5) (6x² + 12xy).

Hence, the joint density of X and Y is given by

f(x, y) = (6/5) (x² + 2xy) for 0 < x, y < 1, and 0 otherwise.
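The differentiation can be checked symbolically; the short sketch below uses sympy with the cumulative distribution function as reconstructed above.

```python
# Joint density as the mixed partial derivative of the joint cdf.
from sympy import symbols, diff, factor

x, y = symbols('x y')
F = (2 * x**3 * y + 3 * x**2 * y**2) / 5
f = diff(F, x, y)              # differentiate once in x and once in y
print(factor(f))               # a form equal to (6/5)*(x**2 + 2*x*y)
```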

  Example 7.12. Let X and Y have the joint density function

  What is P X + Y  1 X  1

  Answer: (See the diagram below.)


  1X 2

Example 7.13. Let X and Y have the joint density function

f(x, y) = x + y for 0 < x < 1, 0 < y < 1, and 0 otherwise.

What is P(2X ≤ 1 | X + Y ≤ 1)?

Answer: We know that

P(2X ≤ 1 | X + Y ≤ 1) = P((X ≤ 1/2) ∩ (X + Y ≤ 1)) / P(X + Y ≤ 1).

First,

P(X + Y ≤ 1) = ∫_0^1 ∫_0^{1−x} (x + y) dy dx
             = ∫_0^1 [x (1 − x) + (1 − x)²/2] dx
             = [x²/2 − x³/3 − (1 − x)³/6]_0^1
             = 1/3.

Similarly,

P((X ≤ 1/2) ∩ (X + Y ≤ 1)) = ∫_0^{1/2} ∫_0^{1−x} (x + y) dy dx
                            = [x²/2 − x³/3 − (1 − x)³/6]_0^{1/2}
                            = 11/48.

Hence

P(2X ≤ 1 | X + Y ≤ 1) = (11/48) / (1/3) = 11/16.

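Both integrals in Example 7.13 can be evaluated symbolically; the sketch below uses sympy and assumes, from the visible integrand, the joint density f(x, y) = x + y on the unit square.

```python
# P(X + Y <= 1) and P(2X <= 1, X + Y <= 1) for the density x + y on (0,1) x (0,1).
from sympy import symbols, integrate, Rational

x, y = symbols('x y', positive=True)
f = x + y

total = integrate(f, (y, 0, 1), (x, 0, 1))                      # 1: f is a density
p_sum = integrate(f, (y, 0, 1 - x), (x, 0, 1))                  # P(X + Y <= 1) = 1/3
p_both = integrate(f, (y, 0, 1 - x), (x, 0, Rational(1, 2)))    # = 11/48

print(total, p_sum, p_both / p_sum)     # 1  1/3  11/16
```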

7.3. Conditional Distributions

First, we motivate the definition of the conditional distribution using discrete random variables and then, based on this motivation, we give a general definition of the conditional distribution. Let X and Y be two discrete random variables with joint probability density f(x, y). Then by definition of the joint probability density, we have

f(x, y) = P(X = x, Y = y).

If A = {X = x}, B = {Y = y} and f_2(y) = P(Y = y), then from the above equation we have

P(A | B) = P(A ∩ B) / P(B) = f(x, y) / f_2(y).

If we write P({X = x} | {Y = y}) as g(x | y), then we have

g(x | y) = f(x, y) / f_2(y).


For discrete bivariate random variables, we can write the conditional probability of the event {X = x} given the event {Y = y} as the ratio of the probability of the event {X = x} ∩ {Y = y} to the probability of the event {Y = y}, which is

P(X = x | Y = y) = P({X = x} ∩ {Y = y}) / P(Y = y) = f(x, y) / f_2(y).

We use this fact to define the conditional probability density function for two random variables X and Y.

Definition 7.8. Let X and Y be any two random variables with joint density f(x, y) and marginals f_1(x) and f_2(y). The conditional probability density function g of X, given (the event) Y = y, is defined as

g(x | y) = f(x, y) / f_2(y),    provided f_2(y) > 0.

Similarly, the conditional probability density function h of Y, given (the event) X = x, is defined as

h(y | x) = f(x, y) / f_1(x),    provided f_1(x) > 0.

Example 7.14. Let X and Y be discrete random variables with joint probability density function

f(x, y) = (1/21) (x + y) for x = 1, 2, 3 and y = 1, 2, and 0 otherwise.

What is the conditional probability density function of X, given Y = 2?

Answer: We want to find g(x | 2). Since

g(x | 2) = f(x, 2) / f_2(2),

we should first compute the marginal of Y, that is f_2(2). The marginal of Y is given by

f_2(y) = Σ_{x=1}^{3} (1/21) (x + y)
       = (1/21) (6 + 3y).


Hence f_2(2) = 12/21. Thus, the conditional probability density function of X, given Y = 2, is

g(x | 2) = f(x, 2) / f_2(2)
         = [(x + 2)/21] / [12/21]
         = (x + 2)/12,    x = 1, 2, 3.
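Numerically, the conditional density g(x | 2) is just the y = 2 column of the joint density rescaled by f_2(2); a short sketch (standard library only, using the density stated above):

```python
# g(x | 2) = f(x, 2) / f2(2) for the joint density (x + y)/21, x = 1, 2, 3, y = 1, 2.
from fractions import Fraction

def f(x, y):
    return Fraction(x + y, 21)

f2_at_2 = sum(f(x, 2) for x in (1, 2, 3))       # 12/21
for x in (1, 2, 3):
    print(x, f(x, 2) / f2_at_2)                 # (x + 2)/12: 1/4, 1/3, 5/12
```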

Example 7.15. Let X and Y be discrete random variables with joint probability density function

f(x, y) = (1/32) (x + y) for x = 1, 2 and y = 1, 2, 3, 4, and 0 otherwise.

What is the conditional probability density function of Y given X = x?

Answer: Since

h(y | x) = f(x, y) / f_1(x)    and    f_1(x) = Σ_{y=1}^{4} (1/32) (x + y) = (4x + 10)/32,

the conditional probability density function of Y given X = x is

h(y | x) = (x + y)/(4x + 10) for x = 1, 2 and y = 1, 2, 3, 4, and 0 otherwise.

Example 7.16. Let X and Y be continuous random variables with joint pdf

f(x, y) = 12x for 0 < y < 2x < 1, and 0 otherwise.


What is the conditional density function of Y given X = x?

Answer: First, we have to find the marginal of X:

f_1(x) = ∫_0^{2x} 12x dy = 24x²,    0 < x < 1/2.

Thus, the conditional density of Y given X = x is

h(y | x) = f(x, y) / f_1(x) = 12x / (24x²) = 1/(2x),    0 < y < 2x,

and zero elsewhere.
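The same computation can be done symbolically; a sketch with sympy, using the joint density 12x stated in the example:

```python
# Marginal of X and conditional density of Y given X = x for f(x, y) = 12x, 0 < y < 2x < 1.
from sympy import symbols, integrate, simplify

x, y = symbols('x y', positive=True)
f = 12 * x

f1 = integrate(f, (y, 0, 2 * x))       # marginal of X: 24*x**2 on 0 < x < 1/2
h = simplify(f / f1)                   # conditional density of Y given X = x: 1/(2*x)
print(f1, h)
```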

Example 7.17. Let X and Y be random variables such that X has density function

f_1(x) = 24x² for 0 < x < 1/2, and 0 elsewhere,


and the conditional density of Y given X = x is

h(y | x) = y/(2x²) for 0 < y < 2x, and 0 elsewhere.

What is the conditional density of X given Y = y over the appropriate domain?

Answer: The joint density f(x, y) of X and Y is given by

f(x, y) = h(y | x) f_1(x) = [y/(2x²)] · 24x² = 12y,    0 < y < 2x < 1.

The marginal density of Y is given by

f_2(y) = ∫_{y/2}^{1/2} 12y dx = 12y (1/2 − y/2) = 6y (1 − y),    0 < y < 1.

Hence, the conditional density of X given Y = y is

g(x | y) = f(x, y) / f_2(y) = 12y / [6y (1 − y)] = 2/(1 − y).

Thus, the conditional density of X given Y = y is given by

g(x | y) = 2/(1 − y) for 0 < y < 2x < 1, and 0 otherwise.
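A symbolic check of the steps above (a sketch, using the reconstruction that the joint density is 12y on 0 < y < 2x < 1):

```python
# From f1(x) = 24*x**2 and h(y|x) = y/(2*x**2), recover f, f2 and g(x|y).
from sympy import symbols, integrate, simplify, Rational

x, y = symbols('x y', positive=True)
f1 = 24 * x**2                        # given marginal of X on 0 < x < 1/2
h = y / (2 * x**2)                    # given conditional density of Y given X = x
f = simplify(h * f1)                  # joint density

f2 = integrate(f, (x, y / 2, Rational(1, 2)))    # marginal of Y
g = simplify(f / f2)                             # conditional density of X given Y = y

assert f.equals(12 * y)
assert f2.equals(6 * y * (1 - y))     # valid for 0 < y < 1
assert g.equals(2 / (1 - y))          # valid for y/2 < x < 1/2
print("g(x | y) = 2/(1 - y) on 0 < y < 2x < 1")
```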

Note that for a specific x, the function f(x, y) is the intersection (profile) of the surface z = f(x, y) with the plane x = constant. The conditional density h(y | x) is this profile of f(x, y) normalized by the factor 1/f_1(x).


7.4. Independence of Random Variables

In this section, we define the concept of stochastic independence of two random variables X and Y. The conditional probability density function g of X given Y = y usually depends on y. If g is independent of y, then the random variables X and Y are said to be independent. This motivates the following definition.

Definition 7.8. Let X and Y be any two random variables with joint density f(x, y) and marginals f_1(x) and f_2(y). The random variables X and Y are (stochastically) independent if and only if

f(x, y) = f_1(x) f_2(y)

for all (x, y) ∈ R_X × R_Y.

Example 7.18. Let X and Y be discrete random variables with joint density

f(x, y) = 1/36 if 1 ≤ x = y ≤ 6, 2/36 if 1 ≤ x < y ≤ 6, and 0 otherwise.

Are X and Y stochastically independent?

Answer: As in Example 7.3, the marginals of X and Y are given by

f_1(x) = f(x, x) + Σ_{y > x} f(x, y) = (13 − 2x)/36    for x = 1, 2, ..., 6

and

f_2(y) = f(y, y) + Σ_{x < y} f(x, y) = (2y − 1)/36    for y = 1, 2, ..., 6.


Since

f(1, 1) = 1/36    while    f_1(1) f_2(1) = (11/36) (1/36) = 11/1296,

we conclude that f(x, y) ≠ f_1(x) f_2(y), and X and Y are not independent.

  This example also illustrates that the marginals of X and Y can be determined if one knows the joint density f(x, y). However, if one knows the marginals of X and Y , then it is not possible to find the joint density of X and Y unless the random variables are independent.
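The failure of the factorization can also be seen numerically; a short sketch (standard library only, using the dice density above):

```python
# Compare f(1, 1) with f1(1) * f2(1) for the joint density of (min, max) of two dice.
from fractions import Fraction

def f(x, y):
    if 1 <= x == y <= 6:
        return Fraction(1, 36)
    if 1 <= x < y <= 6:
        return Fraction(2, 36)
    return Fraction(0)

def f1(x):
    return sum(f(x, y) for y in range(1, 7))

def f2(y):
    return sum(f(x, y) for x in range(1, 7))

print(f(1, 1), f1(1) * f2(1))     # 1/36 versus 11/1296, so X and Y are not independent
```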

Example 7.19. Let X and Y have the joint density

f(x, y) = e^{−(x+y)} for 0 < x, y < ∞, and 0 otherwise.

Are X and Y stochastically independent?

Answer: The marginals of X and Y are given by

f_1(x) = ∫_0^∞ e^{−(x+y)} dy = e^{−x}    and    f_2(y) = ∫_0^∞ e^{−(x+y)} dx = e^{−y}.

Hence

f(x, y) = e^{−(x+y)} = e^{−x} e^{−y} = f_1(x) f_2(y).

Thus, X and Y are stochastically independent.

  Notice that if the joint density f(x, y) of X and Y can be factored into two nonnegative functions, one solely depending on x and the other solely depending on y, then X and Y are independent. We can use this factorization approach to predict when X and Y are not independent.

Example 7.20. Let X and Y have the joint density

f(x, y) = x + y for 0 < x < 1, 0 < y < 1, and 0 otherwise.

Are X and Y stochastically independent?

Answer: Notice that

f(x, y) = x + y = x (1 + y/x).


Thus, the joint density cannot be factored into the product of a nonnegative function of x alone and a nonnegative function of y alone, and therefore X and Y are not independent.

If X and Y are independent, then the random variables U = φ(X) and V = ψ(Y) are also independent, where φ, ψ : IR → IR are any real-valued functions. From this comment, one can conclude that if X and Y are independent, then the random variables e^X and Y³ + Y² + 1 are also independent.

  Definition 7.9. The random variables X and Y are said to be independent and identically distributed (IID) if and only if they are independent and have the same distribution.

Example 7.21. Let X and Y be two independent random variables with identical probability density function given by

f(x) = e^{−x} for 0 < x < ∞, and 0 elsewhere.

What is the probability density function of W = min{X, Y}?

Answer: Let G(w) be the cumulative distribution function of W. Then, for w > 0,

G(w) = P(W ≤ w)
     = 1 − P(W > w)
     = 1 − P(min{X, Y} > w)
     = 1 − P(X > w and Y > w)
     = 1 − P(X > w) P(Y > w)    (since X and Y are independent)
     = 1 − e^{−w} e^{−w}
     = 1 − e^{−2w}.

Thus, the probability density function of W is

g(w) = d/dw G(w) = 2 e^{−2w} for 0 < w < ∞, and 0 elsewhere.
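A quick Monte Carlo sanity check (a sketch, not part of the original example) confirms that the minimum of two independent exponential(1) random variables behaves like an exponential with rate 2.

```python
# Compare the empirical cdf of W = min(X, Y) with G(w) = 1 - exp(-2w).
import math
import random

random.seed(0)
n = 200_000
w = [min(random.expovariate(1.0), random.expovariate(1.0)) for _ in range(n)]

for t in (0.25, 0.5, 1.0):
    empirical = sum(v <= t for v in w) / n
    print(t, round(empirical, 3), round(1 - math.exp(-2 * t), 3))
```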


  7.5. Review Exercises

  1. Let X and Y be discrete random variables with joint probability density function

  What are the marginals of X and Y ?

  2. Roll a pair of unbiased dice. Let X be the maximum of the two faces and Y be the sum of the two faces. What is the joint density of X and Y ?

  3. For what value of c is the real valued function

  a joint density for some random variables X and Y ?

4. Let X and Y have the joint density

f(x, y) = e^{−(x+y)} for 0 < x, y < ∞, and 0 otherwise.

  What is P (X Y

5. If the random variable X is uniform on the interval from −1 to 1, and the random variable Y is uniform on the interval from 0 to 1, what is the probability that the quadratic equation t² + 2Xt + Y = 0 has real solutions? Assume X and Y are independent.

6. Let Y have a uniform distribution on the interval (0, 1), and let the conditional density of X given Y = y be uniform on the interval from 0 to √y. What is the marginal density of X for 0 < x < 1?


  7. If the joint cumulative distribution of the random variables X and Y is

what is the joint probability density function of the random variables X and Y, and what is P(1 < X < 3, 1 < Y < 2)?

  8. If the random variables X and Y have the joint density

  what is the probability P (Y

  X 2 )?

  9. If the random variables X and Y have the joint density

  what is the probability P [max(X, Y ) > 1] ?

  10. Let X and Y have the joint probability density function

  What is the marginal density function of X where it is nonzero?

  11. Let X and Y have the joint probability density function

  What is the marginal density function of Y , where nonzero?

  12. A point (X, Y ) is chosen at random from a uniform distribution on the circular disk of radius centered at the point (1, 1). For a given value of X = x between 0 and 2 and for y in the appropriate domain, what is the conditional density function for Y ?


  13. Let X and Y be continuous random variables with joint density function

  What is the conditional probability P (X < 1 | Y < 1) ?

  14. Let X and Y be continuous random variables with joint density function

  What is the conditional density function of Y given X = x ?

  15. Let X and Y be continuous random variables with joint density function

What is the conditional probability P(X < 1/2 | Y = 1/4)?

16. Let X and Y be two independent random variables with identical probability density function given by

  What is the probability density function of W = max{X, Y } ?

17. Let X and Y be two independent random variables with identical probability density function given by

f(x) = 3x²/θ³ for 0 ≤ x ≤ θ, and 0 elsewhere,

for some θ > 0. What is the probability density function of W = min{X, Y}?

18. Ron and Glenna agree to meet between 5 P.M. and 6 P.M. Suppose that each of them arrives at a time distributed uniformly at random in this interval, independently of the other. Each will wait for the other at most 10 minutes (and if the other does not show up, they will leave). What is the probability that they actually go out?


  19. Let X and Y be two independent random variables distributed uniformly on the interval [0, 1]. What is the probability of the event Y

  2 given that

  Y

  1 2X?

  20. Let X and Y have the joint density

  What is P (X + Y > 1) ?

  21. Let X and Y be continuous random variables with joint density function

  Are X and Y stochastically independent?

  22. Let X and Y be continuous random variables with joint density function

  Are X and Y stochastically independent?

  23. A bus and a passenger arrive at a bus stop at a uniformly distributed time over the interval 0 to 1 hour. Assume the arrival times of the bus and passenger are independent of one another and that the passenger will wait up to 15 minutes for the bus to arrive. What is the probability that the passenger will catch the bus?

  24. Let X and Y be continuous random variables with joint density function

  What is the probability of the event X  3

  1 given that Y

  25. Let X and Y be continuous random variables with joint density function

What is the probability of the event X ≤ 1/2 given that Y = 1?


  26. If the joint density of the random variables X and Y is

  what is the probability of the event X  3

  27. If the joint density of the random variables X and Y is

  8 ⇥

  ⇤

  < e min {x,y}

  1 e (x+y) if 0 < x, y < 1

  f (x, y) =

  : 0 otherwise,

  then what is the marginal density function of X, where nonzero?

