How to compute the values of a function of a random variable? Linearity property of the expectation function

Then the total dollar amount you win is also a random variable Z, which may be expressed as a function g of the two random variables X and Y: Z = g(X, Y) = 10X + Y.

2. How to compute the values of a function of a random variable?

Given a discrete random variable X and a function g(X) of X, how do we compute E[g(X)]?
(1) First way: by the definition of expectation.
(2) Second way: by a proposition derived below.

Example 4.9 (computing the expectation of a function of a random variable by definition)
Let the random variable X have pmf P{X = -1} = 0.2, P{X = 0} = 0.5, P{X = 1} = 0.3, and let g(X) = X^2. Compute the expectation E[g(X)].

Solution:
• Let Y = g(X) = X^2.
• The pmf p(y) of Y is:
  P{Y = 1} = p(1) = P{X = -1 or X = +1}   (since y = x^2 = 1 ⟺ x = ±1)
            = P{X = -1} + P{X = +1}   (by mutual exclusiveness)
            = 0.2 + 0.3 = 0.5;
  P{Y = 0} = p(0) = P{X = 0} = 0.5   (since y = x^2 = 0 ⟺ x = 0).
• Therefore, E[X^2] = E[Y] = \sum_{y} y\,p(y) = 1(0.5) + 0(0.5) = 0.5.

Proposition 4.1
If the random variable X takes the values x_i, i ≥ 1, with respective probabilities p(x_i), then for any real-valued function g,
  E[g(X)] = \sum_i g(x_i)\,p(x_i).

Proof:
• First, divide all the values g(x_i) into groups, each group sharing an identical value of g(x_i), denoted y_j.
• Therefore,
  \sum_i g(x_i)\,p(x_i) = \sum_{i:\,g(x_i)=y_1} g(x_i)\,p(x_i) + \sum_{i:\,g(x_i)=y_2} g(x_i)\,p(x_i) + …
  = y_1 \sum_{i:\,g(x_i)=y_1} p(x_i) + y_2 \sum_{i:\,g(x_i)=y_2} p(x_i) + …
  = y_1 P{g(X) = y_1} + y_2 P{g(X) = y_2} + …   (\sum_{i:\,g(x_i)=y_j} p(x_i) is the sum of probabilities for the event g(X) = y_j to occur)
  = \sum_j y_j P{g(X) = y_j}
  = E[g(X)].   (by the definition of E[g(X)])

Example 4.10
Let the random variable X have pmf P{X = -1} = 0.2, P{X = 0} = 0.5, P{X = 1} = 0.3, and let g(X) = X^2. Compute E[g(X)].

Solution:
• By Proposition 4.1, we have
  E[X^2] = (-1)^2 p(-1) + 0^2 p(0) + 1^2 p(1) = (1)(0.2) + (0)(0.5) + (1)(0.3) = 0.5,
  which is the same as the value computed in Example 4.9.
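The two computations above can be sketched in code. The following Python snippet (not part of the original notes) uses the pmf from Examples 4.9 and 4.10 and computes E[X^2] both by first building the pmf of Y = g(X) (the definition) and by summing g(x)p(x) directly (Proposition 4.1):

```python
from collections import defaultdict

# pmf of X from Example 4.9: values -1, 0, 1
pmf_x = {-1: 0.2, 0: 0.5, 1: 0.3}

def g(x):
    return x ** 2

# First way: build the pmf of Y = g(X), then apply the definition of expectation.
pmf_y = defaultdict(float)
for x, p in pmf_x.items():
    pmf_y[g(x)] += p          # x-values mapping to the same y are merged (mutual exclusiveness)
e_by_definition = sum(y * p for y, p in pmf_y.items())

# Second way: Proposition 4.1 -- sum g(x) p(x) over the support, no pmf of Y needed.
e_by_proposition = sum(g(x) * p for x, p in pmf_x.items())

print(e_by_definition, e_by_proposition)  # both equal 0.5
```

As in the examples, the two approaches agree; the second avoids constructing the distribution of Y explicitly.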

3. Linearity property of the expectation function

Corollary 4.1
If a and b are two constants, then E[aX + b] = aE[X] + b.

Proof:
  E[aX + b] = \sum_{x:\,p(x)>0} (ax + b)\,p(x)   (by Proposition 4.1)
  = a \sum_{x:\,p(x)>0} x\,p(x) + b \sum_{x:\,p(x)>0} p(x)
  = aE[X] + b.   (by the definition of expectation and Axiom 2: \sum_{x:\,p(x)>0} p(x) = 1)

• Note: the notation "x: p(x) > 0" means that only the discrete values x with non-zero p(x) are dealt with.
• Comments:
  - The expectation function E[·] may be regarded as a linear operator, according to the above corollary.
  - E[X] is also called the first moment of X.

4. Definition of the moment function