Chapter 9

One- and Two-Sample Estimation Problems

9.1 From Example 9.1 on page 271, we know that E(S²) = σ². Therefore, E(S′²) = E[(n − 1)S²/n] = [(n − 1)/n]σ² ≠ σ², so S′² is a biased estimator of σ².

9.2 (a) E(X) = np; E(P̂) = E(X/n) = E(X)/n = np/n = p.

(b) E(P̂′) = [E(X) + √n/2]/(n + √n) = (np + √n/2)/(n + √n) ≠ p, so P̂′ is biased. However,
lim_{n→∞} (np + √n/2)/(n + √n) = lim_{n→∞} (p + 1/(2√n))/(1 + 1/√n) = p,
so P̂′ is asymptotically unbiased.

9.4 n = 30, x̄ = 780, and σ = 40. Also, z_{0.02} = 2.054. So, a 96% confidence interval for the population mean is
780 − (2.054)(40/√30) < µ < 780 + (2.054)(40/√30), or 765 < µ < 795.
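The z-based intervals in Exercises 9.4 through 9.7 all have the form x̄ ± z_{α/2}·σ/√n. A minimal sketch of that computation, assuming numpy/scipy are available (the helper name z_interval is mine, not from the text):

from math import sqrt
from scipy.stats import norm

def z_interval(xbar, sigma, n, conf=0.95):
    """Confidence interval for a mean when sigma is known: xbar +/- z*sigma/sqrt(n)."""
    z = norm.ppf(1 - (1 - conf) / 2)   # critical value z_{alpha/2}
    half = z * sigma / sqrt(n)
    return xbar - half, xbar + half

# Exercise 9.4: n = 30, xbar = 780, sigma = 40, 96% confidence
print(z_interval(780, 40, 30, conf=0.96))   # roughly (765, 795)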

9.5 n = 75, x̄ = 0.310, σ = 0.0015, and z_{0.025} = 1.96. A 95% confidence interval for the population mean is
0.310 − (1.96)(0.0015/√75) < µ < 0.310 + (1.96)(0.0015/√75), or 0.3097 < µ < 0.3103.

9.6 n = 50, x̄ = 174.5, σ = 6.9, and z_{0.01} = 2.33.

(a) A 98% confidence interval for the population mean is
174.5 − (2.33)(6.9/√50) < µ < 174.5 + (2.33)(6.9/√50), or 172.23 < µ < 176.77.

(b) e < (2.33)(6.9)/√50 = 2.27.


9.7 n = 100, x̄ = 23,500, σ = 3900, and z_{0.005} = 2.575.

(a) A 99% confidence interval for the population mean is
23,500 − (2.575)(3900/10) < µ < 23,500 + (2.575)(3900/10), or 22,496 < µ < 24,504.

(b) e < (2.575)(3900/10) = 1004.

9.8 n = [(2.05)(40)/10]² = 68 when rounded up.

9.9 n = [(1.96)(0.0015)/0.0005]² = 35 when rounded up.

9.10 n = [(1.96)(40)/15]² = 28 when rounded up.

9.11 n = [(2.575)(5.8)/2]² = 56 when rounded up.
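Exercises 9.8 through 9.11 all apply the sample-size formula n = (z_{α/2}·σ/e)², rounded up. A small sketch under the same scipy assumption (exact z quantiles may differ slightly from the table values used above):

from math import ceil
from scipy.stats import norm

def n_for_mean(sigma, e, conf):
    """Smallest n such that z_{alpha/2}*sigma/sqrt(n) is at most e (sigma known)."""
    z = norm.ppf(1 - (1 - conf) / 2)
    return ceil((z * sigma / e) ** 2)

print(n_for_mean(40, 10, 0.96))          # Exercise 9.8: 68
print(n_for_mean(0.0015, 0.0005, 0.95))  # Exercise 9.9: 35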

9.12 n = 20, x̄ = 11.3, s = 2.45, and t_{0.025} = 2.093 with 19 degrees of freedom. A 95% confidence interval for the population mean is
11.3 − (2.093)(2.45/√20) < µ < 11.3 + (2.093)(2.45/√20), or 10.15 < µ < 12.45.
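When σ is unknown, Exercises 9.12 through 9.16 switch to x̄ ± t_{α/2, n−1}·s/√n. A sketch (again assuming scipy; t_interval is an illustrative name):

from math import sqrt
from scipy.stats import t

def t_interval(xbar, s, n, conf=0.95):
    """Confidence interval for a mean when sigma is unknown (n - 1 degrees of freedom)."""
    tcrit = t.ppf(1 - (1 - conf) / 2, df=n - 1)
    half = tcrit * s / sqrt(n)
    return xbar - half, xbar + half

# Exercise 9.12: n = 20, xbar = 11.3, s = 2.45
print(t_interval(11.3, 2.45, 20))   # roughly (10.15, 12.45)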

9.13 n = 9, x̄ = 1.0056, s = 0.0245, and t_{0.005} = 3.355 with 8 degrees of freedom. A 99% confidence interval for the population mean is
1.0056 − (3.355)(0.0245/3) < µ < 1.0056 + (3.355)(0.0245/3), or 0.978 < µ < 1.033.

9.14 n = 10, x̄ = 230, s = 15, and t_{0.005} = 3.25 with 9 degrees of freedom. A 99% confidence interval for the population mean is
230 − (3.25)(15/√10) < µ < 230 + (3.25)(15/√10), or 214.58 < µ < 245.42.

9.15 n = 12, x̄ = 48.50, s = 1.5, and t_{0.05} = 1.796 with 11 degrees of freedom. A 90% confidence interval for the population mean is
48.50 − (1.796)(1.5/√12) < µ < 48.50 + (1.796)(1.5/√12), or 47.722 < µ < 49.278.

9.16 n = 12, x̄ = 79.3, s = 7.8, and t_{0.025} = 2.201 with 11 degrees of freedom. A 95% confidence interval for the population mean is
79.3 − (2.201)(7.8/√12) < µ < 79.3 + (2.201)(7.8/√12), or 74.34 < µ < 84.26.


9.17 n = 25, x̄ = 325.05, s = 0.5, γ = 5%, and 1 − α = 90%, with k = 2.208. So, 325.05 ± (2.208)(0.5) yields (323.946, 326.154). Thus, we are 95% confident that this tolerance interval will contain 90% of the aspirin contents for this brand of buffered aspirin.

9.18 n = 15, x̄ = 3.7867, s = 0.9709, γ = 1%, and 1 − α = 95%, with k = 3.507. Calculating 3.7867 ± (3.507)(0.9709) gives (0.382, 7.192); we are 99% confident that this tolerance interval will contain 95% of the drying times.

9.19 n = 100, x̄ = 23,500, s = 3,900, 1 − α = 0.99, and γ = 0.01, with k = 3.096. The tolerance interval is 23,500 ± (3.096)(3,900), which yields (11,425.6, 35,574.4).

9.20 n = 12, x̄ = 48.50, s = 1.5, 1 − α = 0.90, and γ = 0.05, with k = 2.655. The tolerance interval is 48.50 ± (2.655)(1.5), which yields (44.52, 52.48).

9.21 By definition, MSE = E(Θ̂ − θ)², which can be expressed as
MSE = E[Θ̂ − E(Θ̂) + E(Θ̂) − θ]² = E[Θ̂ − E(Θ̂)]² + [E(Θ̂) − θ]² + 2[E(Θ̂) − θ]·E[Θ̂ − E(Θ̂)].
The third term on the right-hand side is zero since E[Θ̂ − E(Θ̂)] = E(Θ̂) − E(Θ̂) = 0. Hence MSE = Var(Θ̂) + [Bias(Θ̂)]², and the claim is valid.

9.22 (a) The bias is E(S′²) − σ² = [(n − 1)/n]σ² − σ² = −σ²/n.

(b) lim_{n→∞} Bias = lim_{n→∞} (−σ²/n) = 0.

9.23 Using Theorem 8.4, we know that X² = (n − 1)S²/σ² follows a chi-squared distribution with n − 1 degrees of freedom, whose variance is 2(n − 1). So,
Var(S²) = Var[σ²X²/(n − 1)] = [σ⁴/(n − 1)²]·2(n − 1) = 2σ⁴/(n − 1),
and Var(S′²) = Var[(n − 1)S²/n] = [(n − 1)/n]²·Var(S²) = 2(n − 1)σ⁴/n².
Therefore, the variance of S′² is smaller.

9.24 Using Exercises 9.21 and 9.23,
MSE(S²)/MSE(S′²) = [Var(S²) + (Bias(S²))²]/[Var(S′²) + (Bias(S′²))²] = [2σ⁴/(n − 1)]/[2(n − 1)σ⁴/n² + σ⁴/n²] = 2n²/[(n − 1)(2n − 1)],
which is always larger than 1 when n is larger than 1. Hence the MSE of S′² is usually smaller.
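The algebra in Exercises 9.21 through 9.24 can be checked by simulation: draw many normal samples, compute S² (divisor n − 1) and S′² (divisor n), and compare empirical mean squared errors with the formulas above. A rough Monte Carlo sketch, assuming normal data with σ² = 1:

import numpy as np

rng = np.random.default_rng(1)
n, sigma2, reps = 10, 1.0, 200_000
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

s2 = x.var(axis=1, ddof=1)        # S^2, divisor n - 1 (unbiased)
s2_prime = x.var(axis=1, ddof=0)  # S'^2, divisor n (biased)

mse = lambda est: np.mean((est - sigma2) ** 2)
print(mse(s2), mse(s2_prime))            # the second value should be smaller
print(2 * sigma2**2 / (n - 1),           # theoretical MSE(S^2)
      (2 * n - 1) * sigma2**2 / n**2)    # theoretical MSE(S'^2)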

9.25 n = 20, x̄ = 11.3, s = 2.45, and t_{0.025} = 2.093 with 19 degrees of freedom. A 95% prediction interval for a future observation is
11.3 ± (2.093)(2.45)√(1 + 1/20) = 11.3 ± 5.25, which yields (6.05, 16.55).


9.26 n = 12, x̄ = 79.3, s = 7.8, and t_{0.025} = 2.201 with 11 degrees of freedom. A 95% prediction interval for a future observation is
79.3 ± (2.201)(7.8)√(1 + 1/12) = 79.3 ± 17.87, which yields (61.43, 97.17).

9.27 n = 15, x̄ = 3.7867, s = 0.9709, and t_{0.025} = 2.145 with 14 degrees of freedom. A 95% prediction interval for a new observation is
3.7867 ± (2.145)(0.9709)√(1 + 1/15) = 3.7867 ± 2.1509, which yields (1.6358, 5.9376).
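The prediction intervals in Exercises 9.25 through 9.27 use x̄ ± t_{α/2, n−1}·s·√(1 + 1/n); the extra 1 under the root accounts for the variance of the single future observation. A sketch under the same scipy assumption:

from math import sqrt
from scipy.stats import t

def prediction_interval(xbar, s, n, conf=0.95):
    """Interval expected to contain one future observation from the same normal population."""
    tcrit = t.ppf(1 - (1 - conf) / 2, df=n - 1)
    half = tcrit * s * sqrt(1 + 1 / n)
    return xbar - half, xbar + half

# Exercise 9.27: n = 15, xbar = 3.7867, s = 0.9709
print(prediction_interval(3.7867, 0.9709, 15))   # roughly (1.64, 5.94)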

9.28 n = 9, x̄ = 1.0056, s = 0.0245, 1 − α = 0.95, and γ = 0.05, with k = 3.532. The tolerance interval is 1.0056 ± (3.532)(0.0245), which yields (0.919, 1.092).

9.29 n = 15, x̄ = 3.84, and s = 3.07. To calculate an upper 95% prediction limit, we obtain t_{0.05} = 1.761 with 14 degrees of freedom. So, the upper limit is 3.84 + (1.761)(3.07)√(1 + 1/15) = 3.84 + 5.58 = 9.42. This means that a new observation has a 95% chance of falling in the interval (−∞, 9.42). To obtain an upper 95% tolerance limit, using 1 − α = 0.95 and γ = 0.05, with k = 2.566, we get 3.84 + (2.566)(3.07) = 11.72. Hence, we are 95% confident that (−∞, 11.72) will contain 95% of the orthophosphorous measurements in the river.

9.30 n = 50, x̄ = 78.3, and s = 5.6. Since t_{0.05} = 1.677 with 49 degrees of freedom, the bound of a lower 95% prediction interval for a single new observation is 78.3 − (1.677)(5.6)√(1 + 1/50) = 68.82. So, the interval is (68.82, ∞). On the other hand, with 1 − α = 95% and γ = 0.01, the k value for a one-sided tolerance limit is 2.269 and the bound is 78.3 − (2.269)(5.6) = 65.59. So, the tolerance interval is (65.59, ∞).
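The one-sided tolerance factors k in Exercises 9.29 and 9.30 are read from Table A.7, but they can also be reproduced numerically: for confidence 1 − γ and content 1 − α, k = t′_{1−γ}(n − 1, δ)/√n, where t′ is a noncentral t quantile with noncentrality δ = z_{1−α}√n. A sketch under that assumption, using scipy's nct distribution:

from math import sqrt
from scipy.stats import norm, nct

def one_sided_k(n, content=0.95, confidence=0.95):
    """One-sided normal tolerance factor k via the noncentral t distribution."""
    delta = norm.ppf(content) * sqrt(n)              # noncentrality parameter
    return nct.ppf(confidence, df=n - 1, nc=delta) / sqrt(n)

print(one_sided_k(15, 0.95, 0.95))   # about 2.566, as in Exercise 9.29
print(one_sided_k(50, 0.95, 0.99))   # about 2.27, as in Exercise 9.30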

9.31 Since the manufacturer would be more interested in the tensile strength of future products, it is conceivable that a prediction interval or a tolerance interval would be more interesting than just a confidence interval on the mean.

9.32 This time 1 − α = 0.99 and γ = 0.05, with k = 3.126. So, the tolerance limit is 78.3 − (3.126)(5.6) = 60.79. Since 62 exceeds the lower bound of the interval, yes, this is a cause of concern.

9.33 In Exercise 9.27, a 95% prediction interval for a new observation is calculated as (1.6358, 5.9376). Since 6.9 falls outside the prediction interval, this new observation is likely to be an outlier.

9.34 n = 12, x̄ = 48.50, s = 1.5, 1 − α = 0.95, and γ = 0.05, with k = 2.815. The lower bound of the one-sided tolerance interval is 48.50 − (2.815)(1.5) = 44.278. Their claim is not necessarily correct.


9.35 n_1 = 25, n_2 = 36, x̄_1 = 80, x̄_2 = 75, σ_1 = 5, σ_2 = 3, and z_{0.03} = 1.88. So, a 94% confidence interval for µ_1 − µ_2 is
(80 − 75) − (1.88)√(25/25 + 9/36) < µ_1 − µ_2 < (80 − 75) + (1.88)√(25/25 + 9/36),
which yields 2.9 < µ_1 − µ_2 < 7.1.
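Exercises 9.35 through 9.37 use (x̄_1 − x̄_2) ± z_{α/2}√(σ_1²/n_1 + σ_2²/n_2), with sample variances standing in for the σ² when n is large. A sketch (scipy assumed; the helper name is illustrative):

from math import sqrt
from scipy.stats import norm

def two_sample_z(x1, x2, var1, var2, n1, n2, conf=0.95):
    """CI for mu1 - mu2 with known (or large-sample) variances."""
    z = norm.ppf(1 - (1 - conf) / 2)
    half = z * sqrt(var1 / n1 + var2 / n2)
    return (x1 - x2) - half, (x1 - x2) + half

# Exercise 9.35: 94% interval for mu1 - mu2
print(two_sample_z(80, 75, 25, 9, 25, 36, conf=0.94))   # roughly (2.9, 7.1)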

9.36 n_A = 50, n_B = 50, x̄_A = 78.3, x̄_B = 87.2, σ_A = 5.6, and σ_B = 6.3. It is known that z_{0.025} = 1.96. So, a 95% confidence interval for the difference of the population means is
(87.2 − 78.3) ± (1.96)√(5.6²/50 + 6.3²/50) = 8.9 ± 2.34, or 6.56 < µ_B − µ_A < 11.24.

9.37 n_1 = 100, n_2 = 200, x̄_1 = 12.2, x̄_2 = 9.1, s_1 = 1.1, and s_2 = 0.9. It is known that z_{0.01} = 2.327. So,
(12.2 − 9.1) ± (2.327)√(1.1²/100 + 0.9²/200) = 3.1 ± 0.30, or 2.80 < µ_1 − µ_2 < 3.40.
The treatment appears to reduce the mean amount of metal removed.

9.38 n_1 = 12, n_2 = 10, x̄_1 = 85, x̄_2 = 81, s_1 = 4, s_2 = 5, and s_p = 4.478, with t_{0.05} = 1.725 with 20 degrees of freedom. So,
(85 − 81) ± (1.725)(4.478)√(1/12 + 1/10) = 4 ± 3.31, which yields 0.69 < µ_1 − µ_2 < 7.31.
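Exercises 9.38 through 9.42 pool the variances, s_p² = [(n_1 − 1)s_1² + (n_2 − 1)s_2²]/(n_1 + n_2 − 2), and use (x̄_1 − x̄_2) ± t_{α/2, n_1+n_2−2}·s_p·√(1/n_1 + 1/n_2). A sketch under the same assumptions:

from math import sqrt
from scipy.stats import t

def pooled_t_interval(x1, s1, n1, x2, s2, n2, conf=0.90):
    """Equal-variance CI for mu1 - mu2 using the pooled standard deviation."""
    df = n1 + n2 - 2
    sp = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    half = t.ppf(1 - (1 - conf) / 2, df) * sp * sqrt(1 / n1 + 1 / n2)
    diff = x1 - x2
    return diff - half, diff + half

# Exercise 9.38: 90% interval for mu1 - mu2
print(pooled_t_interval(85, 4, 12, 81, 5, 10, conf=0.90))   # roughly (0.69, 7.31)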

9.39 n_1 = 12, n_2 = 18, x̄_1 = 84, x̄_2 = 77, s_1 = 4, s_2 = 6, and s_p = 5.305, with t_{0.005} = 2.763 with 28 degrees of freedom. So,
(84 − 77) ± (2.763)(5.305)√(1/12 + 1/18) = 7 ± 5.46, which yields 1.54 < µ_1 − µ_2 < 12.46.

9.40 n_1 = 10, n_2 = 10, x̄_1 = 0.399, x̄_2 = 0.565, s_1 = 0.07279, s_2 = 0.18674, and s_p = 0.14172, with t_{0.025} = 2.101 with 18 degrees of freedom. So,
(0.565 − 0.399) ± (2.101)(0.14172)√(1/10 + 1/10) = 0.166 ± 0.133, which yields 0.033 < µ_2 − µ_1 < 0.299.

9.41 n_1 = 14, n_2 = 16, x̄_1 = 17, x̄_2 = 19, s_1² = 1.5, s_2² = 1.8, and s_p = 1.289, with t_{0.005} = 2.763 with 28 degrees of freedom. So,
(19 − 17) ± (2.763)(1.289)√(1/14 + 1/16) = 2 ± 1.30, which yields 0.70 < µ_2 − µ_1 < 3.30.


9.42 n_1 = 12, n_2 = 10, x̄_1 = 16, x̄_2 = 11, s_1 = 1.0, s_2 = 0.8, and s_p = 0.915, with t_{0.05} = 1.725 with 20 degrees of freedom. So,
(16 − 11) ± (1.725)(0.915)√(1/12 + 1/10) = 5 ± 0.68, which yields 4.3 < µ_1 − µ_2 < 5.7.

9.43 n_A = n_B = 12, x̄_A = 36,300, x̄_B = 38,100, s_A = 5,000, s_B = 6,100, and
v = (5000²/12 + 6100²/12)² / {[(5000²/12)²/11] + [(6100²/12)²/11]} ≈ 21,
with t_{0.025} = 2.080 with 21 degrees of freedom. So,
(36,300 − 38,100) ± (2.080)√(5000²/12 + 6100²/12) = −1,800 ± 4,736,
which yields −6,536 < µ_A − µ_B < 2,936.

9.44 n = 8, d̄ = −1112.5, s_d = 1454, with t_{0.005} = 3.499 with 7 degrees of freedom. So,
−1112.5 ± (3.499)(1454/√8) = −1112.5 ± 1798.7, which yields −2911.2 < µ_D < 686.2.
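The paired-sample intervals in Exercises 9.44, 9.45, and 9.47 are one-sample t intervals applied to the differences: d̄ ± t_{α/2, n−1}·s_d/√n. A sketch, again assuming scipy:

from math import sqrt
from scipy.stats import t

def paired_interval(dbar, sd, n, conf=0.95):
    """CI for the mean difference mu_D based on n paired differences."""
    half = t.ppf(1 - (1 - conf) / 2, df=n - 1) * sd / sqrt(n)
    return dbar - half, dbar + half

# Exercise 9.44: n = 8, dbar = -1112.5, s_d = 1454, 99% confidence
print(paired_interval(-1112.5, 1454, 8, conf=0.99))   # roughly (-2911, 686)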

9.45 n = 9, d̄ = 2.778, s_d = 4.5765, with t_{0.025} = 2.306 with 8 degrees of freedom. So,
2.778 ± (2.306)(4.5765/√9) = 2.778 ± 3.518, which yields −0.74 < µ_D < 6.30.

9.46 n_I = 5, n_II = 7, x̄_I = 98.4, x̄_II = 110.7, s_I = 8.375, and s_II = 32.185, with
v = (8.375²/5 + 32.185²/7)² / {[(8.375²/5)²/4] + [(32.185²/7)²/6]} ≈ 7.
So, t_{0.05} = 1.895 with 7 degrees of freedom, and
(110.7 − 98.4) ± (1.895)√(8.375²/5 + 32.185²/7),
which yields −11.9 < µ_II − µ_I < 36.5.
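When the variances are not assumed equal (Exercises 9.43 and 9.46), the degrees of freedom come from the Satterthwaite formula and are rounded down to an integer before the t critical value is looked up. A sketch:

from math import floor, sqrt
from scipy.stats import t

def welch_interval(x1, s1, n1, x2, s2, n2, conf=0.90):
    """CI for mu1 - mu2 without assuming equal variances (Satterthwaite df)."""
    a, b = s1**2 / n1, s2**2 / n2
    v = floor((a + b) ** 2 / (a**2 / (n1 - 1) + b**2 / (n2 - 1)))
    half = t.ppf(1 - (1 - conf) / 2, df=v) * sqrt(a + b)
    diff = x1 - x2
    return v, (diff - half, diff + half)

# Exercise 9.46: 90% interval for mu_II - mu_I; v comes out to 7
print(welch_interval(110.7, 32.185, 7, 98.4, 8.375, 5, conf=0.90))   # roughly (-11.8, 36.4)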

9.47 n = 10, d̄ = 14.89%, and s_d = 30.4868, with t_{0.025} = 2.262 with 9 degrees of freedom. So,
14.89 ± (2.262)(30.4868/√10) = 14.89 ± 21.81, which yields −6.92 < µ_D < 36.70.


9.48 n_A = n_B = 20, x̄_A = 32.91, x̄_B = 30.47, s_A = 1.57, s_B = 1.74, and s_p = 1.657.

(a) t_{0.025} ≈ 2.042 with 38 degrees of freedom. So,
(32.91 − 30.47) ± (2.042)(1.657)√(1/20 + 1/20) = 2.44 ± 1.07, which yields 1.37 < µ_A − µ_B < 3.51.

(b) Since it is apparent that the type A battery has a longer life, it should be adopted.

9.49 n_A = n_B = 15, x̄_A = 3.82, x̄_B = 4.94, s_A = 0.7794, s_B = 0.7538, and s_p = 0.7667, with t_{0.025} = 2.048 with 28 degrees of freedom. So,
(4.94 − 3.82) ± (2.048)(0.7667)√(1/15 + 1/15) = 1.12 ± 0.57, which yields 0.55 < µ_B − µ_A < 1.69.

9.50 n_1 = 8, n_2 = 13, x̄_1 = 1.98, x̄_2 = 1.30, s_1 = 0.51, s_2 = 0.35, and s_p = 0.416, with t_{0.025} = 2.093 with 19 degrees of freedom. So,
(1.98 − 1.30) ± (2.093)(0.416)√(1/8 + 1/13) = 0.68 ± 0.39, which yields 0.29 < µ_1 − µ_2 < 1.07.

9.51 (a) n = 200, p̂ = 0.57, q̂ = 0.43, and z_{0.02} = 2.05. So,
0.57 ± (2.05)√((0.57)(0.43)/200) = 0.57 ± 0.072, which yields 0.498 < p < 0.642.

(b) Error ≤ (2.05)√((0.57)(0.43)/200) = 0.072.
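Exercises 9.51 through 9.57 all use the large-sample (Wald) interval p̂ ± z_{α/2}√(p̂q̂/n). A sketch, assuming scipy:

from math import sqrt
from scipy.stats import norm

def prop_interval(phat, n, conf=0.95):
    """Large-sample (Wald) confidence interval for a binomial proportion."""
    half = norm.ppf(1 - (1 - conf) / 2) * sqrt(phat * (1 - phat) / n)
    return phat - half, phat + half

# Exercise 9.51: phat = 0.57, n = 200, 96% confidence
print(prop_interval(0.57, 200, conf=0.96))   # roughly (0.498, 0.642)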

9.52 n = 500, p̂ = 485/500 = 0.97, q̂ = 0.03, and z_{0.05} = 1.645. So,
0.97 ± (1.645)√((0.97)(0.03)/500) = 0.97 ± 0.013, which yields 0.957 < p < 0.983.

9.53 n = 1000, p̂ = 228/1000 = 0.228, q̂ = 0.772, and z_{0.005} = 2.575. So,
0.228 ± (2.575)√((0.228)(0.772)/1000) = 0.228 ± 0.034, which yields 0.194 < p < 0.262.

9.54 n = 100, p̂ = 8/100 = 0.08, q̂ = 0.92, and z_{0.01} = 2.33. So,
0.08 ± (2.33)√((0.08)(0.92)/100) = 0.08 ± 0.063, which yields 0.017 < p < 0.143.


9.55 (a) n = 40, p̂ = 34/40 = 0.85, q̂ = 0.15, and z_{0.025} = 1.96. So,
0.85 ± (1.96)√((0.85)(0.15)/40) = 0.85 ± 0.111, which yields 0.739 < p < 0.961.

(b) Since p = 0.8 falls in the confidence interval, we cannot conclude that the new system is better.

9.56 n = 100, p̂ = 24/100 = 0.24, q̂ = 0.76, and z_{0.005} = 2.575.

(a) 0.24 ± (2.575)√((0.24)(0.76)/100) = 0.24 ± 0.110, which yields 0.130 < p < 0.350.

(b) Error ≤ (2.575)√((0.24)(0.76)/100) = 0.110.

9.57 n = 1600, p̂ = 2/3, q̂ = 1/3, and z_{0.025} = 1.96.

(a) 2/3 ± (1.96)√((2/3)(1/3)/1600) = 2/3 ± 0.023, which yields 0.644 < p < 0.690.

(b) Error ≤ (1.96)√((2/3)(1/3)/1600) = 0.023.

9.58 n = z² p̂q̂/(0.02)² = 2090 when rounded up.

9.59 n = (2.05)²(0.57)(0.43)/(0.02)² = 2576 when rounded up.

9.60 n = (2.575)²(0.228)(0.772)/(0.05)² = 467 when rounded up.

9.61 n = (2.33)²(0.08)(0.92)/(0.05)² = 160 when rounded up.

9.62 n = (1.96)²/[(4)(0.01)²] = 9604 when rounded up.

9.63 n = (2.575)²/[(4)(0.01)²] = 16,577 when rounded up.

9.64 n = (1.96)²/[(4)(0.04)²] = 601 when rounded up.
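The sample sizes in Exercises 9.58 through 9.64 follow n = z_{α/2}² p̂q̂/e², with p̂q̂ replaced by its worst-case value 1/4 when no estimate of p is available (9.62 through 9.64); in either case the result is rounded up. A sketch:

from math import ceil
from scipy.stats import norm

def n_for_proportion(e, conf, phat=None):
    """Sample size so the error in estimating p is at most e; phat=None uses the worst case 1/4."""
    z = norm.ppf(1 - (1 - conf) / 2)
    pq = 0.25 if phat is None else phat * (1 - phat)
    return ceil(z**2 * pq / e**2)

print(n_for_proportion(0.05, 0.98, phat=0.08))  # Exercise 9.61: 160
print(n_for_proportion(0.01, 0.95))             # Exercise 9.62: 9604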

9.65 n_M = n_F = 1000, p̂_M = 0.250, q̂_M = 0.750, p̂_F = 0.275, q̂_F = 0.725, and z_{0.025} = 1.96. So,
(0.275 − 0.250) ± (1.96)√((0.250)(0.750)/1000 + (0.275)(0.725)/1000) = 0.025 ± 0.0386,
which yields −0.0136 < p_F − p_M < 0.0636.
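Exercises 9.65 through 9.70 use (p̂_1 − p̂_2) ± z_{α/2}√(p̂_1q̂_1/n_1 + p̂_2q̂_2/n_2). A sketch under the same scipy assumption:

from math import sqrt
from scipy.stats import norm

def prop_diff_interval(p1, n1, p2, n2, conf=0.95):
    """Large-sample CI for the difference of two proportions, p1 - p2."""
    half = norm.ppf(1 - (1 - conf) / 2) * sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff - half, diff + half

# Exercise 9.65: p_F - p_M with n = 1000 in each group
print(prop_diff_interval(0.275, 1000, 0.250, 1000))   # roughly (-0.014, 0.064)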


9.66 n_1 = 250, n_2 = 175, p̂_1 = 80/250 = 0.32, p̂_2 = 40/175 = 0.2286, and z_{0.05} = 1.645. So,
(0.32 − 0.2286) ± (1.645)√((0.32)(0.68)/250 + (0.2286)(0.7714)/175) = 0.0914 ± 0.0713,
which yields 0.0201 < p_1 − p_2 < 0.1627. From this study we conclude that there is a significantly higher proportion of women in electrical engineering than there is in chemical engineering.

9.67 n_1 = n_2 = 500, p̂_1 = 120/500 = 0.24, p̂_2 = 98/500 = 0.196, and z_{0.05} = 1.645. So,
(0.24 − 0.196) ± (1.645)√((0.24)(0.76)/500 + (0.196)(0.804)/500) = 0.044 ± 0.0429,
which yields 0.0011 < p_1 − p_2 < 0.0869. Since 0 is not in this confidence interval, we conclude, at the 90% confidence level, that inoculation has an effect on the incidence of the disease.

9.68 n_{5°C} = n_{15°C} = 20, p̂_{5°C} = 0.50, p̂_{15°C} = 0.75, and z_{0.025} = 1.96. So,
(0.50 − 0.75) ± (1.96)√((0.50)(0.50)/20 + (0.75)(0.25)/20) = −0.25 ± 0.2899,
which yields −0.5399 < p_{5°C} − p_{15°C} < 0.0399. Since this interval includes 0, the significance of the difference cannot be shown at the 95% confidence level.

9.69 n_now = 1000, p̂_now = 0.2740, n_91 = 760, p̂_91 = 0.3158, and z_{0.025} = 1.96. So,
(0.2740 − 0.3158) ± (1.96)√((0.2740)(0.7260)/1000 + (0.3158)(0.6842)/760) = −0.0418 ± 0.0431,
which yields −0.0849 < p_now − p_91 < 0.0013. Hence, at the 95% confidence level, the significance of the difference cannot be shown.

9.70 n_90 = n_94 = 20, p̂_90 = 0.337, and p̂_94 = 0.362.

(a) n_90 p̂_90 = (20)(0.337) ≈ 7 and n_94 p̂_94 = (20)(0.362) ≈ 7.

(b) Since z_{0.025} = 1.96,
(0.337 − 0.362) ± (1.96)√((0.337)(0.663)/20 + (0.362)(0.638)/20) = −0.025 ± 0.295,
which yields −0.320 < p_90 − p_94 < 0.270. Hence there is no evidence, at the 95% confidence level, that there is a change in the proportions.

9.71 s² = 0.815 with v = 4 degrees of freedom. Also, χ²_{0.025} = 11.143 and χ²_{0.975} = 0.484. So,
(4)(0.815)/11.143 < σ² < (4)(0.815)/0.484, which yields 0.293 < σ² < 6.736.
Since this interval contains 1, the claim that σ² = 1 seems valid.
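Exercises 9.71 through 9.76 all use (n − 1)s²/χ²_{α/2} < σ² < (n − 1)s²/χ²_{1−α/2}, where the subscripts denote upper-tail areas as in the tables. A sketch (scipy's ppf takes lower-tail probabilities, hence the reversed arguments):

from scipy.stats import chi2

def variance_interval(s2, n, conf=0.95):
    """CI for sigma^2 based on (n - 1)s^2 divided by chi-squared quantiles."""
    df = n - 1
    lower = df * s2 / chi2.ppf(1 - (1 - conf) / 2, df)  # divide by the upper critical value
    upper = df * s2 / chi2.ppf((1 - conf) / 2, df)      # divide by the lower critical value
    return lower, upper

# Exercise 9.71: s^2 = 0.815, n = 5, 95% confidence
print(variance_interval(0.815, 5))   # roughly (0.29, 6.73)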


9.72 s² = 16 with v = 19 degrees of freedom. It is known that χ²_{0.01} = 36.191 and χ²_{0.99} = 7.633. Hence,
(19)(16)/36.191 < σ² < (19)(16)/7.633, which yields 8.40 < σ² < 39.83.

9.73 s² = 6.0025 with v = 19 degrees of freedom. Also, χ²_{0.025} = 32.852 and χ²_{0.975} = 8.907. Hence,
(19)(6.0025)/32.852 < σ² < (19)(6.0025)/8.907, which yields 3.472 < σ² < 12.804.

9.74 s² = 0.0006 with v = 8 degrees of freedom. Also, χ²_{0.005} = 21.955 and χ²_{0.995} = 1.344. Hence,
(8)(0.0006)/21.955 < σ² < (8)(0.0006)/1.344, which yields 0.00022 < σ² < 0.00357.

9.75 s² = 225 with v = 9 degrees of freedom. Also, χ²_{0.005} = 23.589 and χ²_{0.995} = 1.735. Hence,
(9)(225)/23.589 < σ² < (9)(225)/1.735, or 85.85 < σ² < 1167.15,
which yields 9.27 < σ < 34.16.

9.76 s² = 2.25 with v = 11 degrees of freedom. Also, χ²_{0.05} = 19.675 and χ²_{0.95} = 4.575. Hence,
(11)(2.25)/19.675 < σ² < (11)(2.25)/4.575, which yields 1.258 < σ² < 5.410.

9.77 s_1² = 1.00, s_2² = 0.64, f_{0.01}(11, 9) = 5.19, and f_{0.01}(9, 11) = 4.63. So,
(1.00/0.64)(1/5.19) < σ_1²/σ_2² < (1.00/0.64)(4.63), which yields 0.301 < σ_1²/σ_2² < 7.234, or 0.549 < σ_1/σ_2 < 2.690.

9.78 s_1² = 5000², s_2² = 6100², and f_{0.05}(11, 11) = 2.82. (Note: this value can be found by using "=finv(0.05,11,11)" in Microsoft Excel.) So,
(5000²/6100²)(1/2.82) < σ_1²/σ_2² < (5000²/6100²)(2.82), which yields 0.238 < σ_1²/σ_2² < 1.895.
Since the interval contains 1, it is reasonable to assume that σ_1² = σ_2².

9.79 s_I² = 76.3, s_II² = 1035.905, f_{0.05}(4, 6) = 4.53, and f_{0.05}(6, 4) = 6.16. So,
(76.3/1035.905)(1/4.53) < σ_I²/σ_II² < (76.3/1035.905)(6.16), which yields 0.016 < σ_I²/σ_II² < 0.454.
Since this interval does not contain 1, we may assume that σ_I² ≠ σ_II².


9.80 s_A = 0.7794, s_B = 0.7538, and f_{0.025}(14, 14) = 2.98. (Note: this value can be found by using "=finv(0.025,14,14)" in Microsoft Excel.) So,
(0.7794²/0.7538²)(1/2.98) < σ_A²/σ_B² < (0.7794²/0.7538²)(2.98), which yields 0.36 < σ_A²/σ_B² < 3.19.
Hence, it is reasonable to assume equality of the variances.
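Exercises 9.77 through 9.80 use (s_1²/s_2²)·[1/f_{α/2}(v_1, v_2)] < σ_1²/σ_2² < (s_1²/s_2²)·f_{α/2}(v_2, v_1). A sketch using scipy's F distribution in place of the table (or Excel's finv):

from scipy.stats import f

def variance_ratio_interval(s1sq, n1, s2sq, n2, conf=0.95):
    """CI for sigma1^2 / sigma2^2 using upper-tail F critical values."""
    alpha = 1 - conf
    f12 = f.ppf(1 - alpha / 2, n1 - 1, n2 - 1)  # f_{alpha/2}(v1, v2)
    f21 = f.ppf(1 - alpha / 2, n2 - 1, n1 - 1)  # f_{alpha/2}(v2, v1)
    ratio = s1sq / s2sq
    return ratio / f12, ratio * f21

# Exercise 9.80: s_A = 0.7794, s_B = 0.7538, n = 15 in each group
print(variance_ratio_interval(0.7794**2, 15, 0.7538**2, 15))   # roughly (0.36, 3.18)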

9.81 The likelihood function is
L(x_1, . . . , x_n) = ∏_{i=1}^{n} p^{x_i}(1 − p)^{1−x_i} = p^{n x̄}(1 − p)^{n(1−x̄)}.
Hence, ln L = n[x̄ ln(p) + (1 − x̄) ln(1 − p)]. Taking the derivative with respect to p and setting it to zero, we obtain
∂ ln L/∂p = n[x̄/p − (1 − x̄)/(1 − p)] = 0,
which yields x̄/p − (1 − x̄)/(1 − p) = 0. Therefore, p̂ = x̄.
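The closed form p̂ = x̄ from Exercise 9.81 can be verified numerically by maximizing the Bernoulli log-likelihood directly. A sketch with a made-up sample (the optimizer call and data are illustrative only):

import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([1, 0, 0, 1, 1, 1, 0, 1])   # hypothetical Bernoulli observations

def neg_log_lik(p):
    # negative of ln L = (sum x) ln p + (n - sum x) ln(1 - p)
    return -(x.sum() * np.log(p) + (len(x) - x.sum()) * np.log(1 - p))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(res.x, x.mean())   # both close to 0.625, the sample mean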

9.82 (a) The likelihood function is
L(x_1, . . . , x_n) = ∏_{i=1}^{n} f(x_i; α, β) = (αβ)ⁿ (∏_{i=1}^{n} x_i)^{β−1} e^{−α Σ_{i=1}^{n} x_i^β}.

(b) So, the log-likelihood can be expressed as
ln L = n[ln(α) + ln(β)] − α Σ_{i=1}^{n} x_i^β + (β − 1) Σ_{i=1}^{n} ln(x_i).
To solve for the maximum likelihood estimates, we need to solve the following two equations:
∂ ln L/∂α = n/α − Σ_{i=1}^{n} x_i^β = 0 and ∂ ln L/∂β = n/β − α Σ_{i=1}^{n} x_i^β ln(x_i) + Σ_{i=1}^{n} ln(x_i) = 0.

9.83 (a) The likelihood function is
L(x_1, . . . , x_n) = ∏_{i=1}^{n} f(x_i; µ, σ) = ∏_{i=1}^{n} [1/(√(2π) σ x_i)] e^{−[ln(x_i) − µ]²/(2σ²)}
= [1/((2π)^{n/2} σⁿ ∏_{i=1}^{n} x_i)] exp{−[1/(2σ²)] Σ_{i=1}^{n} [ln(x_i) − µ]²}.

(b) It is easy to obtain
ln L = −(n/2) ln(2π) − (n/2) ln(σ²) − Σ_{i=1}^{n} ln(x_i) − [1/(2σ²)] Σ_{i=1}^{n} [ln(x_i) − µ]².
Setting 0 = ∂ ln L/∂µ = (1/σ²) Σ_{i=1}^{n} [ln(x_i) − µ], we obtain µ̂ = (1/n) Σ_{i=1}^{n} ln(x_i); and setting 0 = ∂ ln L/∂σ² = −n/(2σ²) + [1/(2σ⁴)] Σ_{i=1}^{n} [ln(x_i) − µ]², we get σ̂² = (1/n) Σ_{i=1}^{n} [ln(x_i) − µ̂]².

9.84 (a) The likelihood function is

L(x_1, . . . , x_n) = ∏_{i=1}^{n} f(x_i; α, β) = [1/(β^{nα} Γⁿ(α))] (∏_{i=1}^{n} x_i)^{α−1} e^{−(1/β) Σ_{i=1}^{n} x_i}.

(b) Hence,
ln L = −nα ln(β) − n ln(Γ(α)) + (α − 1) Σ_{i=1}^{n} ln(x_i) − (1/β) Σ_{i=1}^{n} x_i.
Taking derivatives of ln L with respect to α and β, respectively, setting both equal to zero, and solving the resulting equations yields the maximum likelihood estimates.

9.85 L(x) = p^x(1 − p)^{1−x}, and ln L = x ln(p) + (1 − x) ln(1 − p). Setting ∂ ln L/∂p = x/p − (1 − x)/(1 − p) = 0, we obtain p̂ = x = 1.

9.86 From the density function b*(x; k, p) = (x−1 choose k−1) p^k (1 − p)^{x−k}, we obtain
ln L = ln (x−1 choose k−1) + k ln(p) + (x − k) ln(1 − p).
Setting ∂ ln L/∂p = k/p − (x − k)/(1 − p) = 0, we obtain p̂ = k/x.

9.87 For the estimator S², since (n − 1)S²/σ² follows a chi-squared distribution with n − 1 degrees of freedom,
Var(S²) = [1/(n − 1)²] Var[Σ_{i=1}^{n}(x_i − x̄)²] = [1/(n − 1)²] Var(σ²χ²_{n−1}) = 2σ⁴/(n − 1).
For the estimator σ̂² (the maximum likelihood estimator, which divides by n), we have
Var(σ̂²) = (1/n²) Var(σ²χ²_{n−1}) = 2σ⁴(n − 1)/n²,
so σ̂² has the smaller variance.


9.88 n = 7, d̄ = 3.557, s_d = 2.776, and t_{0.025} = 2.447 with 6 degrees of freedom. So,
3.557 ± (2.447)(2.776/√7) = 3.557 ± 2.567, which yields 0.99 < µ_D < 6.12. Since 0 is not in the interval, the claim appears valid.

9.89 n = 75, x = 28, hence p̂ = 28/75 = 0.3733. Since z_{0.025} = 1.96, a 95% confidence interval for p can be calculated as
0.3733 ± (1.96)√((0.3733)(0.6267)/75) = 0.3733 ± 0.1095, which yields 0.2638 < p < 0.4828. Since the interval contains 0.421, the claim made by the Roanoke Times seems reasonable.

9.90 n = 12, d̄ = 40.58, s_d = 15.791, and t_{0.025} = 2.201 with 11 degrees of freedom. So,
40.58 ± (2.201)(15.791/√12) = 40.58 ± 10.03, which yields 30.55 < µ_D < 50.61.

9.91 n = 6, d̄ = 1.5, s_d = 1.543, and t_{0.025} = 2.571 with 5 degrees of freedom. So,
1.5 ± (2.571)(1.543/√6) = 1.5 ± 1.62, which yields −0.12 < µ_D < 3.12.

9.92 n = 12, d̄ = 417.5, s_d = 1186.643, and t_{0.05} = 1.796 with 11 degrees of freedom. So,
417.5 ± (1.796)(1186.643/√12) = 417.5 ± 615.23, which yields −197.73 < µ_D < 1032.73.

9.93 n_p = n_u = 8, x̄_p = 86,250.000, x̄_u = 79,837.500, σ_p = σ_u = 4,000, and z_{0.025} = 1.96. So,
(86,250 − 79,837.5) ± (1.96)(4000)√(1/8 + 1/8) = 6412.5 ± 3920,
which yields 2,492.5 < µ_p − µ_u < 10,332.5. Hence, polishing does increase the average endurance limit.

9.94 n_A = 100, n_B = 120, p̂_A = 24/100 = 0.24, p̂_B = 36/120 = 0.30, and z_{0.025} = 1.96. So,
(0.30 − 0.24) ± (1.96)√((0.24)(0.76)/100 + (0.30)(0.70)/120) = 0.06 ± 0.117,
which yields −0.057 < p_B − p_A < 0.177.


9.95 n_N = n_O = 23, s_N² = 105.9271, s_O² = 77.4138, and f_{0.025}(22, 22) = 2.358. So,
(105.9271/77.4138)(1/2.358) < σ_N²/σ_O² < (105.9271/77.4138)(2.358), or 0.58 < σ_N²/σ_O² < 3.23.
For the ratio of the standard deviations, the 95% confidence interval is approximately 0.76 < σ_N/σ_O < 1.80.
Since the intervals contain 1, we will assume that the variability did not change with the local supplier.

9.96 n_A = n_B = 6, x̄_A = 0.1407, x̄_B = 0.1385, s_A = 0.002805, s_B = 0.002665, and s_p = 0.002736. Using a 90% confidence interval for the difference in the population means, with t_{0.05} = 1.812 and 10 degrees of freedom, we obtain
(0.1407 − 0.1385) ± (1.812)(0.002736)√(1/6 + 1/6) = 0.0022 ± 0.0029,
which yields −0.0007 < µ_A − µ_B < 0.0051. Since the 90% confidence interval contains 0, we conclude that wire A was not shown to be better than wire B, with 90% confidence.

9.97 To calculate the maximum likelihood estimator, we use the Poisson log-likelihood
ln L = −nµ + ln(µ) Σ_{i=1}^{n} x_i − Σ_{i=1}^{n} ln(x_i!).
Taking the derivative with respect to µ and setting it to zero, we obtain µ̂ = (1/n) Σ_{i=1}^{n} x_i = x̄.
On the other hand, using the method of moments, we also get µ̂ = x̄.

9.98 µ̂ = x̄ and σ̂² = (1/n) Σ_{i=1}^{n} (x_i − x̄)².

9.99 Equating x̄ = e^{µ+σ²/2} and s² = (e^{2µ+σ²})(e^{σ²} − 1), we get ln(x̄) = µ + σ²/2, or µ̂ = ln(x̄) − σ̂²/2. On the other hand, ln(s²) = 2µ + σ² + ln(e^{σ²} − 1). Plugging in the form of µ̂, we obtain σ̂² = ln(1 + s²/x̄²).

9.100 Setting x̄ = αβ and s² = αβ², we get α̂ = x̄²/s² and β̂ = s²/x̄.

9.101 n_1 = n_2 = 300, x̄_1 = 102,300, x̄_2 = 98,500, s_1 = 5,700, and s_2 = 3,800.

(a) z_{0.005} = 2.575. Hence,
(102,300 − 98,500) ± (2.575)√(5700²/300 + 3800²/300) = 3800 ± 1018.46,
which yields 2781.54 < µ_1 − µ_2 < 4818.46. There is a significant difference in salaries between the two regions.

(b) Since the sample sizes are large enough, it is not necessary to assume normality, due to the Central Limit Theorem.

(c) We assumed that the two variances are not equal. Here we obtain a 95% confidence interval for the ratio of the two variances. It is known that f_{0.025}(299, 299) = 1.255. So,
(5700²/3800²)(1/1.255) < σ_1²/σ_2² < (5700²/3800²)(1.255), or 1.79 < σ_1²/σ_2² < 2.82.
Since the confidence interval does not contain 1, the difference between the variances is significant.

9.102 The error in estimation, with 95% confidence, is (1.96)(4000)√(2/n). Equating this quantity to 1000, we obtain n = 2[(1.96)(4000)/1000]² = 123 when rounded up. Hence, the sample sizes in Review Exercise 9.101 are sufficient to produce a 95% confidence interval on µ_1 − µ_2 having a width of $1,000.

9.103 n = 300, x̄ = 6.5, and s = 2.5. Also, 1 − α = 0.99 and 1 − γ = 0.95. Using Table A.7, k = 2.522. So, the limit of the one-sided tolerance interval is 6.5 + (2.522)(2.5) = 12.805. Since this interval contains 10, the claim by the union leaders appears valid.

9.104 n = 30, x = 8, p̂ = 8/30 = 0.267, and z_{0.025} = 1.96. So,
0.267 ± (1.96)√((0.267)(0.733)/30) = 0.2667 ± 0.1583, which yields 0.108 < p < 0.425.

9.105 n = (1.96)²(4/15)(11/15)/(0.05)² = 301 when rounded up.

9.106 n_1 = n_2 = 100, p̂_1 = 0.1, and p̂_2 = 0.06.

(a) z_{0.025} = 1.96. So,
(0.1 − 0.06) ± (1.96)√((0.1)(0.9)/100 + (0.06)(0.94)/100) = 0.04 ± 0.075,
which yields −0.035 < p_1 − p_2 < 0.115.

(b) Since the confidence interval contains 0, it does not show sufficient evidence that p_1 > p_2.


9.107 n = 20 and s² = 0.045. It is known that χ²_{0.025} = 32.852 and χ²_{0.975} = 8.907 with 19 degrees of freedom. Hence the 95% confidence interval for σ² can be expressed as
(19)(0.045)/32.852 < σ² < (19)(0.045)/8.907, or 0.026 < σ² < 0.096.
Therefore, the 95% confidence interval for σ can be approximated as 0.161 < σ < 0.310.
Since 0.3 falls outside the confidence interval for σ², there is strong evidence that the process variability has been improved.

9.108 n_A = n_B = 15, ȳ_A = 87, s_A = 5.99, ȳ_B = 75, s_B = 4.85, s_p = 5.450, and t_{0.025} = 2.048 with 28 degrees of freedom. So,
(87 − 75) ± (2.048)(5.450)√(1/15 + 1/15) = 12 ± 4.076,
which yields 7.924 < µ_A − µ_B < 16.076. Apparently, the mean operating costs of type A engines are higher than those of type B engines.

9.109 Since the unbiased estimators of σ_1² and σ_2² are S_1² and S_2², respectively,
E(S_p²) = [(n_1 − 1)E(S_1²) + (n_2 − 1)E(S_2²)]/(n_1 + n_2 − 2) = [(n_1 − 1)σ_1² + (n_2 − 1)σ_2²]/(n_1 + n_2 − 2).
If we assume σ_1² = σ_2² = σ², the right-hand side of the above is σ², which means that S_p² is unbiased for σ².

9.110 n = 15, x̄ = 3.2, and s = 0.6.

(a) t_{0.01} = 2.624 with 14 degrees of freedom. So, a one-sided 99% confidence interval for the mean has an upper bound of 3.2 + (2.624)(0.6/√15) = 3.607 seconds. We assumed normality in the calculation.

(b) 3.2 + (2.624)(0.6)√(1 + 1/15) = 4.826 seconds. Again, we need to assume normality of the distribution.

(c) 1 − α = 0.99 and 1 − γ = 0.95. So, k = 3.520 with n = 15, and the upper bound is 3.2 + (3.520)(0.6) = 5.312. Hence, we are 95% confident that 99% of the pilots will have reaction times of less than 5.312 seconds.

9.111 n = 400, x = 17, so p̂ = 17/400 = 0.0425.

(a) z_{0.025} = 1.96. So,
0.0425 ± (1.96)√((0.0425)(0.9575)/400) = 0.0425 ± 0.0198, which yields 0.0227 < p < 0.0623.

(b) z_{0.05} = 1.645. So, the upper bound of a one-sided 95% confidence interval is
0.0425 + (1.645)√((0.0425)(0.9575)/400) = 0.0591.

(c) Using both intervals, we do not have evidence to dispute the supplier's claim.