Chapter 18 Bayesian Statistics
18.1 For p = 0.1, b(2; 2, 0.1) = (0.1)² = 0.01, and for p = 0.2, b(2; 2, 0.2) = (0.2)² = 0.04. Denote by
A : the number of defectives in our sample is 2;
B_1 : the proportion of defectives is p = 0.1;
B_2 : the proportion of defectives is p = 0.2.
Then P(B_1|A) = (0.6)(0.01)/[(0.6)(0.01) + (0.4)(0.04)] = 0.006/0.022 = 0.27, and by subtraction P(B_2|A) = 1 − 0.27 = 0.73. Therefore, the posterior distribution of p after observing A is

  p            0.1    0.2
  π(p|x = 2)   0.27   0.73

for which we get p* = (0.1)(0.27) + (0.2)(0.73) = 0.173.
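As a quick numerical check (not part of the printed solution), the discrete Bayes update above can be reproduced in a few lines of Python; the prior weights 0.6 and 0.4 are the ones used in the calculation of P(B_1|A).

```python
# Sketch: discrete posterior for Exercise 18.1 (binomial likelihood, two-point prior).
from math import comb

ps = [0.1, 0.2]          # candidate values of the proportion defective
prior = [0.6, 0.4]       # prior weights used above
n, x = 2, 2              # sample size and observed number of defectives

like = [comb(n, x) * p**x * (1 - p)**(n - x) for p in ps]   # b(2; 2, p)
joint = [pr * l for pr, l in zip(prior, like)]
posterior = [j / sum(joint) for j in joint]                 # ≈ [0.27, 0.73]
p_star = sum(p * w for p, w in zip(ps, posterior))          # ≈ 0.173
print(posterior, p_star)
```

The same few lines, with the three-point prior and b(2; 9, p) likelihoods of Exercise 18.2 substituted, reproduce that posterior table as well.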
18.2 (a) For p = 0.05, b(2; 9, 0.05) = C(9, 2)(0.05)²(0.95)⁷ = 0.0629.
For p = 0.10, b(2; 9, 0.10) = C(9, 2)(0.10)²(0.90)⁷ = 0.1722.
For p = 0.15, b(2; 9, 0.15) = C(9, 2)(0.15)²(0.85)⁷ = 0.2597.
Denote the following events:
A : 2 drinks overflow;
B_1 : the proportion of drinks overflowing is p = 0.05;
B_2 : the proportion of drinks overflowing is p = 0.10;
B_3 : the proportion of drinks overflowing is p = 0.15. Then
P(B_1|A) = (0.3)(0.0629)/[(0.3)(0.0629) + (0.5)(0.1722) + (0.2)(0.2597)] = 0.0189/0.1569 = 0.12,
P(B_2|A) = (0.5)(0.1722)/0.1569 = 0.55,
and P(B_3|A) = 1 − 0.12 − 0.55 = 0.33. Hence the posterior distribution is

  p            0.05   0.10   0.15
  π(p|x = 2)   0.12   0.55   0.33

(b) p* = (0.05)(0.12) + (0.10)(0.55) + (0.15)(0.33) = 0.111.
18.3 (a) Let X = the number of drinks that overflow. Then

f(x|p) = b(x; 4, p) = C(4, x) p^x (1 − p)^{4−x}, for x = 0, 1, 2, 3, 4.

Since

f(1, p) = f(1|p)π(p) = (10)(4)p(1 − p)³ = 40p(1 − p)³, for 0.05 < p < 0.15,

then

g(1) = 40 ∫_{0.05}^{0.15} p(1 − p)³ dp = −2(1 − p)⁴(4p + 1) |_{0.05}^{0.15} = 0.2844,

so that

π(p|x = 1) = 40p(1 − p)³/0.2844, for 0.05 < p < 0.15.
(b) The Bayes estimator is p* = ∫_{0.05}^{0.15} p π(p|x = 1) dp = (40/0.2844) ∫_{0.05}^{0.15} p²(1 − p)³ dp = 0.106.
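A short numerical check of parts (a) and (b) (a sketch, not part of the printed solution): integrate the unnormalized posterior 40p(1 − p)³ over (0.05, 0.15) to recover g(1) and the posterior mean.

```python
# Sketch: normalizing constant and Bayes estimate for Exercise 18.3.
from scipy.integrate import quad

joint = lambda p: 40 * p * (1 - p) ** 3          # f(1, p) = f(1|p) * pi(p) on 0.05 < p < 0.15

g1, _ = quad(joint, 0.05, 0.15)                  # marginal g(1), ≈ 0.2844
post_mean, _ = quad(lambda p: p * joint(p) / g1, 0.05, 0.15)
print(g1, post_mean)                             # ≈ 0.2844 and ≈ 0.106
```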
18.4 (a) Denote by
A : 12 of the next 15 condominiums sold are two-bedroom units;
B_1 : the proportion of two-bedroom condominiums sold is 0.60;
B_2 : the proportion of two-bedroom condominiums sold is 0.70.
For p = 0.6, b(12; 15, 0.6) = 0.0634 and for p = 0.7, b(12; 15, 0.7) = 0.1701. The prior distribution is given by

  p       0.6    0.7
  π(p)    1/3    2/3

So, P(B_1|A) = (1/3)(0.0634)/[(1/3)(0.0634) + (2/3)(0.1701)] = 0.157 and P(B_2|A) = 1 − 0.157 = 0.843. Therefore, the posterior distribution is
  p             0.6     0.7
  π(p|x = 12)   0.157   0.843
(b) The Bayes estimator is p* = (0.6)(0.157) + (0.7)(0.843) = 0.684.
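Because the last arithmetic step is easy to slip on, here is a quick check of Exercise 18.4 (a sketch using the same prior and binomial likelihoods as above):

```python
# Sketch: posterior probabilities and Bayes estimate for Exercise 18.4.
from math import comb

like = {p: comb(15, 12) * p**12 * (1 - p) ** 3 for p in (0.6, 0.7)}  # b(12; 15, p)
prior = {0.6: 1 / 3, 0.7: 2 / 3}

total = sum(prior[p] * like[p] for p in like)
posterior = {p: prior[p] * like[p] / total for p in like}            # ≈ {0.6: 0.157, 0.7: 0.843}
p_star = sum(p * w for p, w in posterior.items())                    # ≈ 0.684
print(posterior, p_star)
```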
18.5 n = 10, x̄ = 9, σ = 0.8, μ_0 = 8, σ_0 = 0.2, and z_{0.025} = 1.96. So,

μ* = (n x̄ σ_0² + μ_0 σ²)/(n σ_0² + σ²) = [(10)(9)(0.04) + (8)(0.64)]/[(10)(0.04) + 0.64] = 8.3846,

and

σ* = √[σ_0² σ²/(n σ_0² + σ²)] = √[(0.04)(0.64)/((10)(0.04) + 0.64)] = 0.1569.

To calculate the Bayes interval, we use 8.3846 ± (1.96)(0.1569) = 8.3846 ± 0.3075, which yields (8.0771, 8.6921). Hence, the probability that the population mean is between 8.0771 and 8.6921 is 95%.
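The normal-likelihood/normal-prior update used in Exercises 18.5 through 18.7 can be checked with a small helper like the following (a sketch; the two formulas are exactly the μ* and σ* expressions quoted above).

```python
# Sketch: posterior mean and standard deviation for a normal mean with known sigma and a normal prior.
from math import sqrt

def normal_posterior(n, xbar, sigma, mu0, sigma0):
    """Return (mu_star, sigma_star) for the normal-normal update."""
    mu_star = (n * xbar * sigma0**2 + mu0 * sigma**2) / (n * sigma0**2 + sigma**2)
    sigma_star = sqrt(sigma0**2 * sigma**2 / (n * sigma0**2 + sigma**2))
    return mu_star, sigma_star

mu_star, sigma_star = normal_posterior(10, 9, 0.8, 8, 0.2)        # Exercise 18.5
lo, hi = mu_star - 1.96 * sigma_star, mu_star + 1.96 * sigma_star
print(mu_star, sigma_star, (lo, hi))                              # ≈ 8.3846, 0.1569, (8.08, 8.69)
```

The same helper with (n, x̄, s, μ_0, σ_0) = (30, 24.90, 2.10, 30, 1.75) or (100, 70, 8, 72, 2.4) reproduces the numbers in Exercises 18.6 and 18.7(b).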
18.6 n = 30, x̄ = 24.90, s = 2.10, μ_0 = 30, and σ_0 = 1.75. So,

μ* = (n x̄ σ_0² + μ_0 s²)/(n σ_0² + s²) = [(30)(24.90)(1.75²) + (30)(2.10²)]/[(30)(1.75²) + 2.10²] = 2419.99/96.285 = 25.13,

σ* = √[σ_0² s²/(n σ_0² + s²)] = √[(1.75²)(2.10²)/96.285] = 0.3745,

and z_{0.025} = 1.96. Hence, the 95% Bayes interval is calculated by 25.13 ± (1.96)(0.3745), which yields $24.40 < μ < $25.86.
(c) P(24 < μ < 26) = P((24 − 25.13)/0.3745 < Z < (26 − 25.13)/0.3745) = P(−3.02 < Z < 2.32) = 0.9898 − 0.0013 = 0.9885.
18.7 (a) P(71.8 < μ < 73.4) = P((71.8 − 72)/√5.76 < Z < (73.4 − 72)/√5.76) = P(−0.08 < Z < 0.58) = 0.2509.
(b) n = 100, x̄ = 70, s² = 64, μ_0 = 72, and σ_0² = 5.76. Hence,

μ* = (n x̄ σ_0² + μ_0 s²)/(n σ_0² + s²) = [(100)(70)(5.76) + (72)(64)]/[(100)(5.76) + 64] = 70.2,

and σ* = √[σ_0² s²/(n σ_0² + s²)] = √[(5.76)(64)/640] = 0.759. Hence, the 95% Bayes interval can be calculated as 70.2 ± (1.96)(0.759), which yields 68.71 < μ < 71.69.
(c) P(71.8 < μ < 73.4) = P((71.8 − 70.2)/0.759 < Z < (73.4 − 70.2)/0.759) = P(2.11 < Z < 4.22) = 0.0174.
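For parts (a) and (c) of Exercise 18.7, the probabilities are just differences of the standard normal CDF; a minimal check (using Φ built from erf) is sketched below.

```python
# Sketch: prior vs. posterior probability that 71.8 < mu < 73.4 (Exercise 18.7).
from math import erf, sqrt

Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))        # standard normal CDF

def prob_between(lo, hi, mean, sd):
    return Phi((hi - mean) / sd) - Phi((lo - mean) / sd)

print(prob_between(71.8, 73.4, 72.0, sqrt(5.76)))   # prior:     ≈ 0.25
print(prob_between(71.8, 73.4, 70.2, 0.759))        # posterior: ≈ 0.017
```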
18.8 Multiplying the likelihood function

f(x_1, x_2, ..., x_n | μ) = [1/((2π)^{n/2} (100)^n)] exp[ −(1/2) Σ_{i=1}^{n} ((x_i − μ)/100)² ],   with n = 25,

by the prior π(μ) = 1/60 for 770 < μ < 830, we obtain

f(x_1, x_2, ..., x_n, μ) = (1/60)(2π)^{−25/2}(100)^{−25} exp[ −(1/2) Σ_{i=1}^{25} ((x_i − μ)/100)² ] = K exp[ −(1/2)((μ − x̄)/20)² ],

where K is a function of the sample values and 20 = 100/√25. Since the marginal distribution is

g(x_1, x_2, ..., x_n) = √(2π)(20)K ∫_{770}^{830} [1/(√(2π)(20))] exp[ −(1/2)((μ − x̄)/20)² ] dμ = √(2π)(20)K [Φ(2.5) − Φ(−0.5)] = √(2π)(13.706)K,

using the observed sample mean x̄ = 780, the posterior distribution is

π(μ|x_1, x_2, ..., x_n) = f(x_1, x_2, ..., x_n, μ)/g(x_1, x_2, ..., x_n) = exp[ −(1/2)((μ − 780)/20)² ] / [√(2π)(13.706)],

for 770 < μ < 830.
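A quick numeric check of the normalizing constant (a sketch, assuming the sample values n = 25, x̄ = 780, σ = 100 used above):

```python
# Sketch: normalizing constant of the truncated-normal posterior in Exercise 18.8.
from math import sqrt, pi, exp
from scipy.integrate import quad

xbar, se = 780.0, 100.0 / sqrt(25)               # posterior center and scale (se = 20)
kernel = lambda mu: exp(-0.5 * ((mu - xbar) / se) ** 2)

area, _ = quad(kernel, 770, 830)                 # integral of the unnormalized posterior
print(area, sqrt(2 * pi) * 13.706)               # both ≈ 34.36
```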
18.9 Multiplying the likelihood function and the prior distribution together, we get the joint density function of θ as

f(t_1, t_2, ..., t_n, θ) = 2θ^n exp[ −θ( Σ_{i=1}^{n} t_i + 2 ) ].

Then the marginal distribution of (T_1, T_2, ..., T_n) is

g(t_1, t_2, ..., t_n) = 2 ∫_{0}^{∞} θ^n exp[ −θ( Σ_{i=1}^{n} t_i + 2 ) ] dθ = 2Γ(n + 1) ( Σ_{i=1}^{n} t_i + 2 )^{−(n+1)},

since the integrand in the last term constitutes a gamma density function with parameters α = n + 1 and β = 1/( Σ_{i=1}^{n} t_i + 2 ). Hence, the posterior distribution of θ is

π(θ|t_1, ..., t_n) = f(t_1, ..., t_n, θ)/g(t_1, ..., t_n) = ( Σ_{i=1}^{n} t_i + 2 )^{n+1} θ^n exp[ −θ( Σ_{i=1}^{n} t_i + 2 ) ] / Γ(n + 1),

for θ > 0, which is a gamma distribution with parameters α = n + 1 and β = 1/( Σ_{i=1}^{n} t_i + 2 ).
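As a sketch (with made-up failure times, purely to illustrate the result), the derived gamma posterior can be compared against a brute-force normalization of likelihood times prior:

```python
# Sketch: check that likelihood x prior normalizes to the Gamma(n + 1, 1/(sum t_i + 2)) posterior of Exercise 18.9.
from math import exp
from scipy.integrate import quad
from scipy.stats import gamma

t = [0.8, 1.3, 2.1, 0.5]                               # hypothetical failure times (illustration only)
n, s = len(t), sum(t)

joint = lambda th: 2 * th**n * exp(-th * (s + 2))      # f(t_1, ..., t_n, theta)
marg, _ = quad(joint, 0, 50)                           # g(t_1, ..., t_n) by numerical integration

theta0 = 1.0                                           # compare the two densities at an arbitrary point
print(joint(theta0) / marg)                            # brute-force posterior density
print(gamma.pdf(theta0, a=n + 1, scale=1.0 / (s + 2))) # Gamma(alpha = n + 1, beta = 1/(s + 2))
```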
18.10 Assume that p(x_i|λ) = e^{−λ} λ^{x_i}/x_i!, x_i = 0, 1, 2, ..., for i = 1, 2, ..., n, and π(λ) = (1/2⁴) λ² e^{−λ/2}, for λ > 0. The posterior distribution of λ is calculated as

π(λ|x_1, ..., x_n) ∝ [ Π_{i=1}^{n} e^{−λ} λ^{x_i} ] λ² e^{−λ/2} ∝ λ^{Σ x_i + 2} e^{−(n + 1/2)λ},

so that

π(λ|x_1, ..., x_n) = (n + 1/2)^{n x̄ + 3} λ^{Σ x_i + 2} e^{−(n + 1/2)λ} / Γ(n x̄ + 3),

which is a gamma distribution with parameters α = n x̄ + 3 and β = (n + 1/2)^{−1}, and hence with mean (n x̄ + 3)/(n + 1/2). Plugging in the data (n = 10 and Σ x_i = 57), we obtain the Bayes estimator of λ, under squared-error loss,

λ* = (57 + 3)/(10 + 1/2) = 5.7143.
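A minimal check of the Poisson-gamma update (a sketch; n = 10 and Σ x_i = 57 are the values plugged in above):

```python
# Sketch: Poisson likelihood with the gamma prior above gives a Gamma(n*xbar + 3, 1/(n + 1/2)) posterior (Exercise 18.10).
n, sum_x = 10, 57

alpha_post = sum_x + 3                 # posterior shape
beta_post = 1.0 / (n + 0.5)            # posterior scale
lambda_star = alpha_post * beta_post   # posterior mean = Bayes estimate under squared-error loss
print(lambda_star)                     # 5.7142857...
```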
18.11 The likelihood function of p is the negative binomial probability

b*(x; 5, p) = C(x − 1, 4) p⁵ (1 − p)^{x−5},

and the prior distribution is π(p) = 1, for 0 < p < 1. Hence the posterior distribution of p is

π(p|x) = p⁵(1 − p)^{x−5} / ∫_{0}^{1} p⁵(1 − p)^{x−5} dp = [Γ(x + 2)/(Γ(6)Γ(x − 4))] p⁵ (1 − p)^{x−5},   0 < p < 1,

which is a Beta distribution with parameters α = 6 and β = x − 4. Hence the Bayes estimator, under squared-error loss, is the posterior mean

p* = α/(α + β) = 6/(x + 2).
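A short numerical confirmation of the posterior mean (a sketch; x = 12 is just an illustrative number of trials needed to reach the fifth success):

```python
# Sketch: uniform prior with a negative-binomial likelihood gives a Beta(6, x - 4) posterior (Exercise 18.11).
from scipy.integrate import quad

x = 12                                           # hypothetical number of trials
kernel = lambda p: p**5 * (1 - p) ** (x - 5)     # posterior is proportional to this on (0, 1)

norm, _ = quad(kernel, 0, 1)
post_mean, _ = quad(lambda p: p * kernel(p) / norm, 0, 1)
print(post_mean, 6 / (x + 2))                    # both ≈ 0.4286
```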