Suppose that X has the probability density function $f_X(x) = e^{-x}$, $x \ge 0$. Determine the probability distribution for the following:
(a) $Y = X^2$   (b) $Y = X^{1/2}$   (c) $Y = \ln X$

5-84. The velocity of a particle in a gas is a random variable V with probability distribution
$$f_V(v) = a v^2 e^{-bv}, \quad v > 0$$
where b is a constant that depends on the temperature of the gas and the mass of the particle.
(a) Determine the value of the constant a.
(b) The kinetic energy of the particle is $W = mV^2/2$. Determine the probability distribution of W.

The time to fit a model to a data set can sometimes increase with the square of N, the number of rows of data. Suppose that for a particular algorithm the computation time is approximately $T = 0.004N^2$ seconds. Although the number of rows is a discrete measurement, assume that the distribution of N over a number of data sets can be approximated with an exponential distribution with a mean of 10,000 rows. Determine the probability density function and the mean of T.

5-90. Power meters enable cyclists to obtain power measurements nearly continuously. The meters also calculate the average power generated over a time interval. Professional riders can generate 6.6 watts per kilogram of body weight for extended periods of time. Some meters calculate a normalized power measurement to adjust for the physiological effort required when the power output changes frequently. Let the random variable X denote the power output at a measurement time and assume that X has a lognormal distribution with parameters $\theta = 5.2933$ and $\omega^2 = 0.00995$. The normalized power is computed as the fourth root of the mean of $Y = X^4$. Determine the following:
(a) Mean and standard deviation of X
(b) $f_Y(y)$
(c) Mean and variance of Y
(d) Fourth root of the mean of Y
(e) Compare $[E(X^4)]^{1/4}$ to $E(X)$ and comment.
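The power-meter exercise can be cross-checked numerically because the raw moments of a lognormal random variable have the closed form $E(X^r) = \exp(r\theta + r^2\omega^2/2)$. A minimal Python sketch (the helper name `lognormal_raw_moment` is ours, not from the text):

```python
import math

# Lognormal parameters from the power-meter exercise
theta, omega_sq = 5.2933, 0.00995

def lognormal_raw_moment(r, theta, omega_sq):
    # E(X^r) = exp(r*theta + r^2 * omega^2 / 2) for a lognormal X
    return math.exp(r * theta + r ** 2 * omega_sq / 2.0)

mean_x = lognormal_raw_moment(1, theta, omega_sq)              # E(X), about 200 W
sd_x = math.sqrt(lognormal_raw_moment(2, theta, omega_sq) - mean_x ** 2)
normalized = lognormal_raw_moment(4, theta, omega_sq) ** 0.25  # [E(X^4)]^(1/4)
```

Because $x \mapsto x^4$ is convex, Jensen's inequality guarantees $[E(X^4)]^{1/4} \ge E(X)$, so the normalized power slightly exceeds the mean power.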
5-6 Moment-Generating Functions
Suppose that X is a random variable with mean μ. Throughout this book we have used the idea of the expected value of the random variable X, and in fact $E(X) = \mu$. Now suppose that we are interested in the expected value of a function of X, $g(X) = X^r$. The expected value of this function, $E[g(X)] = E(X^r)$, is called the rth moment about the origin of the random variable X, which we will denote by $\mu'_r$.
Definition of Moments about the Origin
The rth moment about the origin of the random variable X is
$$\mu'_r = E(X^r) = \begin{cases} \sum_x x^r f(x), & X \text{ discrete} \\ \int_{-\infty}^{\infty} x^r f(x)\, dx, & X \text{ continuous} \end{cases} \qquad (5\text{-}32)$$
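For a concrete discrete case, Equation 5-32 reduces to a weighted sum over the range of X. A short Python sketch (the fair six-sided die is our illustrative choice, not from the text):

```python
# Raw moments of a fair six-sided die via Eq. 5-32 (discrete case)
f = {x: 1 / 6 for x in range(1, 7)}   # pmf: P(X = x) = 1/6

def raw_moment(r, pmf):
    # mu'_r = sum over x of x^r * f(x)
    return sum(x ** r * px for x, px in pmf.items())

mu1 = raw_moment(1, f)   # first moment (mean): 3.5
mu2 = raw_moment(2, f)   # second moment: 91/6
```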
Notice that the first moment about the origin is just the mean, that is, $E(X) = \mu'_1$. Furthermore, since the second moment about the origin is $E(X^2) = \mu'_2$, we can write the variance of a random variable in terms of origin moments as follows:
$$\sigma^2 = E(X^2) - [E(X)]^2 = \mu'_2 - \mu^2$$
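The identity $\sigma^2 = \mu'_2 - \mu^2$ can be checked numerically for a continuous case, say $f(x) = e^{-x}$, $x \ge 0$, whose mean and variance are both 1. A sketch using a midpoint-rule integral (the step size and upper cutoff are our choices):

```python
import math

def raw_moment(r, dx=1e-3, upper=40.0):
    # Midpoint-rule approximation of the integral of x^r * e^(-x) over [0, upper]
    n = int(upper / dx)
    return sum(((i + 0.5) * dx) ** r * math.exp(-(i + 0.5) * dx) * dx
               for i in range(n))

mu1 = raw_moment(1)        # E(X)   -> about 1
mu2 = raw_moment(2)        # E(X^2) -> about 2
variance = mu2 - mu1 ** 2  # mu'_2 - mu^2 -> about 1
```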
The moments of a random variable can often be determined directly from the definition in Equation 5-32, but an alternative procedure that makes use of a special function is frequently useful.
Definition of a Moment-Generating Function
The moment-generating function of the random variable X is the expected value of $e^{tX}$ and is denoted by $M_X(t)$. That is,
$$M_X(t) = E(e^{tX}) = \begin{cases} \sum_x e^{tx} f(x), & X \text{ discrete} \\ \int_{-\infty}^{\infty} e^{tx} f(x)\, dx, & X \text{ continuous} \end{cases}$$
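The definition can be illustrated by Monte Carlo: for $X \sim \text{Exponential}(\lambda)$, the MGF is known to be $\lambda/(\lambda - t)$ for $t < \lambda$. A sketch (the sample size, seed, and value of t are our choices):

```python
import math
import random

random.seed(1)
lam, t = 1.0, 0.3

# Monte Carlo estimate of M_X(t) = E(e^{tX}) for X ~ Exponential(lam)
samples = [random.expovariate(lam) for _ in range(200_000)]
mgf_estimate = sum(math.exp(t * x) for x in samples) / len(samples)

mgf_exact = lam / (lam - t)   # closed form, valid only for t < lam
```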
The moment-generating function $M_X(t)$ will exist only if the sum or integral in the above definition converges. If the moment-generating function of a random variable does exist, it can be used to obtain all the origin moments of the random variable.
Let X be a random variable with moment-generating function $M_X(t)$. Then
$$\mu'_r = \left. \frac{d^r M_X(t)}{dt^r} \right|_{t=0} \qquad (5\text{-}34)$$
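Equation 5-34 can be checked with finite differences: for $X \sim \text{Exponential}(1)$, $M_X(t) = 1/(1-t)$, so $M'_X(0) = 1 = E(X)$ and $M''_X(0) = 2 = E(X^2)$. A sketch (the step size h is our choice):

```python
def M(t):
    # MGF of Exponential(1): 1/(1 - t), valid for t < 1
    return 1.0 / (1.0 - t)

h = 1e-5
mu1 = (M(h) - M(-h)) / (2 * h)             # central difference for M'(0)  = E(X)
mu2 = (M(h) - 2 * M(0) + M(-h)) / (h * h)  # central difference for M''(0) = E(X^2)
```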
Assuming that we can differentiate inside the summation and integral signs,
$$\frac{d^r M_X(t)}{dt^r} = \begin{cases} \sum_x x^r e^{tx} f(x), & X \text{ discrete} \\ \int_{-\infty}^{\infty} x^r e^{tx} f(x)\, dx, & X \text{ continuous} \end{cases}$$
Now if we set t = 0 in this expression, we find that