Since X ∼ ∧(0, 4), the random variable Y = ln(X) ∼ N(0, 4).

  Hence

  P(1 ≤ X ≤ 12.1825) = P(ln(1) ≤ ln(X) ≤ ln(12.1825)) = P(0 ≤ Y ≤ 2.50)

  = P(0 ≤ Z ≤ 1.25) = P(Z ≤ 1.25) − P(Z ≤ 0) = 0.8944 − 0.5000 = 0.3944.
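The arithmetic above can be checked numerically; a minimal Python sketch using only the standard library, where the error function gives the standard normal distribution function:

```python
from math import erf, log, sqrt

def std_normal_cdf(z):
    # Phi(z), the standard normal distribution function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# X ~ lognormal(0, 4), so Y = ln(X) ~ N(0, 4) and Z = Y/2 ~ N(0, 1)
p = std_normal_cdf(log(12.1825) / 2.0) - std_normal_cdf(log(1.0) / 2.0)
print(p)  # approximately 0.3944
```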

  Probability and Mathematical Statistics

  Example 6.29. If the amount of time needed to solve a problem by a group of students follows the lognormal distribution with parameters µ and σ², then what is the value of µ so that the probability of solving a problem in 10 minutes or less by any randomly picked student is 0.95 when σ² = 4?

  Answer: Let the random variable X denote the amount of time needed to solve a problem. Then X ∼ ∧(µ, 4). We want to find µ so that P(X ≤ 10) = 0.95. Hence

  0.95 = P(X ≤ 10) = P(ln(X) ≤ ln(10)) = P(ln(X) − µ ≤ ln(10) − µ)

  = P( (ln(X) − µ)/2 ≤ (ln(10) − µ)/2 ) = P( Z ≤ (ln(10) − µ)/2 ),

  where Z ∼ N(0, 1). Using the table for the standard normal distribution, we get

  (ln(10) − µ)/2 = 1.65.

  Hence

  µ = ln(10) − 2(1.65) = 2.3025 − 3.300 = −0.9975.
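The value of µ can likewise be verified with a small Python check; the table value 1.65 is carried over from the example:

```python
from math import erf, log, sqrt

def std_normal_cdf(z):
    # Phi(z), the standard normal distribution function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu = log(10) - 2 * 1.65          # solves (ln(10) - mu)/2 = 1.65
check = std_normal_cdf((log(10) - mu) / 2.0)
print(mu, check)  # mu is approximately -0.9974; check is approximately 0.95
```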

  6.6. Inverse Gaussian Distribution

  If a sufficiently small macroscopic particle is suspended in a fluid that is

  in thermal equilibrium, the particle will move about erratically in response to natural collisional bombardments by the individual molecules of the fluid. This erratic motion is called "Brownian motion" after the botanist Robert Brown (1773-1858), who first observed this erratic motion in 1828. Independently, Einstein (1905) and Smoluchowski (1906) gave the mathematical description of Brownian motion. The distribution of the first passage time in Brownian motion is the inverse Gaussian distribution. This distribution was systematically studied by Tweedie in 1945. The interpurchase times of toothpaste of a family, the duration of labor strikes in a geographical region, word frequency in a language, conversion time for convertible bonds, length of employee service, and crop field size follow the inverse Gaussian distribution. The inverse Gaussian distribution is very useful for the analysis of certain skewed data.

  Some Special Continuous Distributions

  Definition 6.10. A random variable X is said to have an inverse Gaussian distribution if its probability density function is given by

  f(x) = √( λ / (2π x³) ) e^{ −λ(x−µ)² / (2µ²x) }   if 0 < x < ∞, and f(x) = 0 otherwise,

  where 0 < µ < ∞ and 0 < λ < ∞ are arbitrary parameters.

  If X has an inverse Gaussian distribution with parameters µ and λ, then we write X ∼ IG(µ, λ).

  The characteristic function φ(t) of X ∼ IG(µ, λ) is

  φ(t) = e^{ (λ/µ) ( 1 − √(1 − 2iµ²t/λ) ) }.


  Using this, we have the following theorem.

  Theorem 6.10. If X ∼ IG(µ, λ), then

  E(X) = µ

  Var(X) = µ³/λ.

  Proof: Since φ(t) = E(e^{itX}), the derivative φ′(t) = i E(X e^{itX}). Therefore φ′(0) = i E(X). We know the characteristic function φ(t) of X ∼ IG(µ, λ) is

  φ(t) = e^{ (λ/µ) ( 1 − √(1 − 2iµ²t/λ) ) }.

  Differentiating φ(t) with respect to t, we have

  φ′(t) = iµ (1 − 2iµ²t/λ)^{−1/2} e^{ (λ/µ) ( 1 − √(1 − 2iµ²t/λ) ) } = iµ (1 − 2iµ²t/λ)^{−1/2} φ(t).

  Hence φ′(0) = iµ. Therefore, E(X) = µ. Similarly, one can show that

  Var(X) = µ³/λ.

  This completes the proof of the theorem.
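Theorem 6.10 can be checked numerically by integrating the density of Definition 6.10 directly; a rough midpoint-rule sketch in Python, where the parameter values µ = 2, λ = 3 and the integration grid are arbitrary choices:

```python
from math import exp, pi, sqrt

def ig_pdf(x, mu, lam):
    # inverse Gaussian density of Definition 6.10, x > 0
    return sqrt(lam / (2 * pi * x ** 3)) * exp(-lam * (x - mu) ** 2 / (2 * mu ** 2 * x))

def ig_moment(k, mu, lam, n=100000, upper=60.0):
    # crude midpoint rule on (0, upper]; adequate for these parameter values
    h = upper / n
    return sum(((i + 0.5) * h) ** k * ig_pdf((i + 0.5) * h, mu, lam) * h for i in range(n))

mu, lam = 2.0, 3.0
mean = ig_moment(1, mu, lam)
var = ig_moment(2, mu, lam) - mean ** 2
print(mean, var)  # close to mu = 2 and mu**3/lam = 8/3
```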

  The distribution function F(x) of the inverse Gaussian random variable X with parameters µ and λ was computed by Shuster (1968) as

  F(x) = Φ( √(λ/x) (x/µ − 1) ) + e^{2λ/µ} Φ( −√(λ/x) (x/µ + 1) ),

  where Φ(x) is the distribution function of the standard normal distribution.
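Shuster's closed form can be compared against a direct numerical integration of the density; a Python sketch, where µ = 2, λ = 3 and x = 2.5 are arbitrary test values:

```python
from math import erf, exp, pi, sqrt

def std_normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def ig_pdf(x, mu, lam):
    return sqrt(lam / (2 * pi * x ** 3)) * exp(-lam * (x - mu) ** 2 / (2 * mu ** 2 * x))

def ig_cdf(x, mu, lam):
    # Shuster's closed form for the IG(mu, lam) distribution function
    a = sqrt(lam / x)
    return std_normal_cdf(a * (x / mu - 1)) + exp(2 * lam / mu) * std_normal_cdf(-a * (x / mu + 1))

mu, lam, x0 = 2.0, 3.0, 2.5
n = 100000
h = x0 / n
numeric = sum(ig_pdf((i + 0.5) * h, mu, lam) * h for i in range(n))
print(ig_cdf(x0, mu, lam), numeric)  # the two values agree closely
```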

  6.7. Logistic Distribution

  The logistic distribution is often considered as an alternative to the univariate normal distribution. The logistic distribution has a shape very close


  to that of a normal distribution but has heavier tails than the normal. The logistic distribution is used in modeling demographic data. It is also used as an alternative to the Weibull distribution in life-testing.

  Definition 6.11. A random variable X is said to have a logistic distribution if its probability density function is given by

  f(x) = (π/(σ√3)) e^{−π(x−µ)/(σ√3)} / [ 1 + e^{−π(x−µ)/(σ√3)} ]²,   −∞ < x < ∞,

  where −∞ < µ < ∞ and σ > 0 are parameters.

  If X has a logistic distribution with parameters µ and σ, then we write

  X ∼ LOG(µ, σ).

  Theorem 6.11. If X ∼ LOG(µ, σ), then

  E(X) = µ

  Var(X) = σ².

  Proof: First, we derive the moment generating function of X and then we compute the mean and variance from it. The moment generating function is

  M(t) = E(e^{tX}) = e^{µt} Γ(1 − √3σt/π) Γ(1 + √3σt/π) = e^{µt} √3σt cosec(√3σt),

  which follows by substituting w = π(x − µ)/(σ√3) and then z = (1 + e^{−w})^{−1}, so that the integral reduces to a beta integral. We leave the rest of the proof to the reader.
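The claims of Theorem 6.11 can also be verified by numerical integration against the density of Definition 6.11; a Python sketch assuming the variance-σ² parametrization stated there, with µ = 1 and σ = 2 as arbitrary test values:

```python
from math import exp, pi, sqrt

def logistic_pdf(x, mu, sigma):
    # density of Definition 6.11, parametrized so that Var(X) = sigma**2
    u = exp(-pi * (x - mu) / (sigma * sqrt(3.0)))
    return (pi / (sigma * sqrt(3.0))) * u / (1.0 + u) ** 2

mu, sigma = 1.0, 2.0
n, lo, hi = 200000, -60.0, 60.0   # midpoint rule; tails beyond +/-60 are negligible
h = (hi - lo) / n
m1 = m2 = 0.0
for i in range(n):
    x = lo + (i + 0.5) * h
    w = logistic_pdf(x, mu, sigma) * h
    m1 += x * w
    m2 += x * x * w
print(m1, m2 - m1 * m1)  # close to mu = 1 and sigma**2 = 4
```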

  6.8. Review Exercises

  1. If Y ∼ UNIF(0, 1), then what is the probability density function of X = −ln Y?

  2. Let the probability density function of X be

  Let Y = 1 − e^{−X}. Find the distribution of Y.


  3. After a certain time the weight W of crystals formed is given approximately by W = e^X, where X ∼ N(µ, σ²). What is the probability density function of W for 0 < w < ∞?

  4. What is the probability that a normal random variable with mean 6 and standard deviation 3 will fall between 5.7 and 7.5?

  5. Let X have a distribution with the 75th percentile equal to 1/3 and probability density function equal to

  What is the value of the parameter?

  6. If a normal distribution with mean µ and variance σ² > 0 has 46th percentile equal to 20, then what is µ in terms of the standard deviation?

  7. Let X be a random variable with cumulative distribution function

  What is P(0 ≤ e^X ≤ 4)?

  8. Let X have the density function

  where α > 0 and β > 0. If β = 6 and α = 5, what is the mean of the random variable (1 − X)^{−1}?

  9. R. A. Fisher proved that when n ≥ 30 and Y has a chi-square distribution with n degrees of freedom, then √(2Y) − √(2n − 1) has an approximate standard normal distribution. Under this approximation, what is the 90th percentile of Y when n = 41?

  10. Let Y have a chi-square distribution with 32 degrees of freedom so that its variance is 64. If P (Y > c) = 0.0668, then what is the approximate value of the constant c?

  11. If in a certain normal distribution of X the probability is 0.5 that X is less than 500 and 0.0227 that X is greater than 650, what is the standard deviation of X?


  12. If X ∼ N(5, 4), then what is the probability that 8 < Y < 13, where Y = 2X + 1?

  13. Given the probability density function of a random variable X as

  f(x) = θ e^{−θx} if x > 0, and f(x) = 0 otherwise,

  what is the nth moment of X about the origin?

  14. If the random variable X is normal with mean 1 and standard deviation 2, then what is P(X² − 2X ≤ 8)?

  15. Suppose X has a standard normal distribution and Y = e^X. What is the kth moment of Y?

  16. If the random variable X has a uniform distribution on the interval [0, a], what is the probability that the random variable is greater than its square, that is, P(X > X²)?

  17. If the random variable Y has a chi-square distribution with 54 degrees of freedom, then what is the approximate 84th percentile of Y?

  18. Let X be a continuous random variable with density function

  If Y is a function of X, what is the density function for Y where it is nonzero?

  19. If X is normal with mean 0 and variance 4, then what is the probability of the event X − 4/X ≥ 0, that is, P(X − 4/X ≥ 0)?

  20. If the waiting time at Rally's drive-in window is normally distributed with mean 13 minutes and standard deviation 2 minutes, then what percentage of customers wait longer than 10 minutes but less than 15 minutes?

  21. If X is uniform on the interval from −5 to 5, what is the probability that the quadratic equation 100t² + 20tX + 2X + 3 = 0 has complex solutions?

  22. If the random variable X ∼ EXP(θ), then what is the probability density function of the random variable Y = √X?

  23. If the random variable X ∼ N(0, 1), then what is the probability density function of the random variable Y = √|X|?


  24. If the random variable X ∼ ∧(µ, σ²), then what is the probability density function of the random variable ln(X)?

  25. If the random variable X ∼ ∧(µ, σ²), then what is the mode of X?

  26. If the random variable X ∼ ∧(µ, σ²), then what is the median of X?

  27. If the random variable X ∼ ∧(µ, σ²), then what is the probability that the quadratic equation 4t² + 4tX + X + 2 = 0 has real solutions?

  28. Consider Karl Pearson's differential equation p(x) dy/dx + q(x) y = 0, where p(x) = a + bx + cx² and q(x) = x − d. Show that if a = c = 0, b > 0, d > −b, then y(x) is gamma; and if a = 0, b = −c, (d − 1)/b < 1, d/b > −1, then y(x) is beta.

  29. Let a, b, α, β be any four real numbers with a < b and α, β positive. If X ∼ BETA(α, β), then what is the probability density function of the random variable Y = (b − a)X + a?

  30. A nonnegative continuous random variable X is said to be memoryless if P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0. Show that the exponential random variable is memoryless.

  31. Show that every nonnegative continuous memoryless random variable is an exponential random variable.

  32. Using the gamma function evaluate the following integrals:

  (i) ∫₀^∞ e^{−x²} dx; (ii) ∫₀^∞ x e^{−x} dx; (iii) ∫₀^∞ x² e^{−x} dx; (iv) ∫₀^∞ x³ e^{−x} dx.

  33. Using the beta function evaluate the following integrals:

  (i) ∫₀¹ x²(1 − x)² dx; (ii) ∫₀^100 x⁵(100 − x)⁷ dx; (iii) ∫₀¹ x¹¹(1 − x³)⁷ dx.

  34. If Γ(z) denotes the gamma function, then prove that

  Γ(1 + t) Γ(1 − t) = πt cosec(πt).
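Before attempting the proof, the identity can be sanity-checked numerically for a particular value of t; a short Python check using the standard library's gamma function, with t = 0.3 as an arbitrary choice:

```python
from math import gamma, pi, sin

t = 0.3
lhs = gamma(1 + t) * gamma(1 - t)
rhs = pi * t / sin(pi * t)   # pi*t*cosec(pi*t)
print(lhs, rhs)  # the two sides agree
```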

  35. Let α and β be given positive real numbers, with α < β. If two points are selected at random from a straight line segment of length β, what is the probability that the distance between them is at least α?

  36. If the random variable X ∼ GAM(θ, α), then what is the nth moment of X about the origin?


  Two Random Variables