
13.5 Single-Degree-of-Freedom Comparisons

  The analysis of variance in a one-way classification, or a one-factor experiment, as it is often called, merely indicates whether or not the hypothesis of equal treatment means can be rejected. Usually, an experimenter would prefer his or her analysis to probe deeper. For instance, in Example 13.1, by rejecting the null hypothesis we concluded that the means are not all equal, but we still do not know where the differences among the aggregates exist. The engineer might have the feeling a priori that aggregates 1 and 2 should have similar absorption properties and that the same is true for aggregates 3 and 5, so it is of interest to study the difference between the two groups. It would seem appropriate, then, to test the hypothesis

  H0: μ1 + μ2 − μ3 − μ5 = 0,

  H1: μ1 + μ2 − μ3 − μ5 ≠ 0.

  We notice that the hypothesis involves a linear function of the population means in which the coefficients sum to zero.

  Definition 13.1: Any linear function of the form

  ω = ∑_{i=1}^{k} c_i μ_i,  where ∑_{i=1}^{k} c_i = 0,

  is called a comparison or contrast in the treatment means.

  The experimenter can often make multiple comparisons by testing the significance of contrasts in the treatment means, that is, by testing a hypothesis of the following type:


  Hypothesis for a Contrast:

  H0: ∑_{i=1}^{k} c_i μ_i = 0,

  H1: ∑_{i=1}^{k} c_i μ_i ≠ 0.

  The test is conducted by first computing a similar contrast in the sample means,

  w = ∑_{i=1}^{k} c_i ȳ_i.

  Since Ȳ1., Ȳ2., . . . , Ȳk. are independent random variables having normal distributions with means μ1, μ2, . . . , μk and variances σ²/n1, σ²/n2, . . . , σ²/nk, respectively, Theorem 7.11 assures us that w is a value of the normal random variable W with

  mean μ_W = ∑_{i=1}^{k} c_i μ_i  and variance σ_W² = σ² ∑_{i=1}^{k} c_i²/n_i.

  Therefore, when H0 is true, μ_W = 0 and, by Example 7.5, the statistic

  W²/σ_W² = ( ∑_{i=1}^{k} c_i Ȳ_i. )² / ( σ² ∑_{i=1}^{k} c_i²/n_i )

  is distributed as a chi-squared random variable with 1 degree of freedom.

  Test Statistic for Testing a Contrast: Our hypothesis is tested at the α-level of significance by computing

  f = w² / ( s² ∑_{i=1}^{k} c_i²/n_i ) = SSw / s²,  where SSw = w² / ∑_{i=1}^{k} (c_i²/n_i).

  Here f is a value of the random variable F having the F-distribution with 1 and N − k degrees of freedom.

  When the sample sizes are all equal to n,

  SSw = ( ∑_{i=1}^{k} c_i T_i. )² / ( n ∑_{i=1}^{k} c_i² ),

  where T_i. is the total of the n observations for the ith treatment. The quantity SSw, called the contrast sum of squares, indicates the portion of SSA that is explained by the contrast in question.
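These formulas translate directly into code. The sketch below is illustrative rather than from the text (the function name and argument list are my own); it computes w, SSw, and f from the sample means, sample sizes, contrast coefficients, error mean square s², and error degrees of freedom N − k, and obtains the p-value from the F-distribution with 1 and N − k degrees of freedom:

```python
from scipy.stats import f as f_dist

def contrast_f_test(means, sizes, c, s2, df_error):
    """F-test for a single-degree-of-freedom contrast.

    means    : sample treatment means  ybar_1., ..., ybar_k.
    sizes    : sample sizes            n_1, ..., n_k
    c        : contrast coefficients   c_1, ..., c_k (must sum to zero)
    s2       : error mean square from the one-way ANOVA
    df_error : error degrees of freedom, N - k
    """
    assert abs(sum(c)) < 1e-9, "coefficients of a contrast must sum to zero"
    # Contrast in the sample means: w = sum c_i * ybar_i.
    w = sum(ci * m for ci, m in zip(c, means))
    # Contrast sum of squares: SSw = w^2 / sum(c_i^2 / n_i).
    ssw = w**2 / sum(ci**2 / n for ci, n in zip(c, sizes))
    f_value = ssw / s2
    # Upper-tail area of the F(1, N - k) distribution.
    p_value = f_dist.sf(f_value, 1, df_error)
    return ssw, f_value, p_value
```

For equal sample sizes n, the denominator reduces to (∑ c_i²)/n, matching the equal-n formula above.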

  Chapter 13 One-Factor Experiments: General

  This sum of squares will be used to test the hypothesis that the contrast is zero, that is, H0: ∑_{i=1}^{k} c_i μ_i = 0.

  It is often of interest to test multiple contrasts, particularly contrasts that are linearly independent or orthogonal. As a result, we need the following definition:

  Definition 13.2: The two contrasts

  ω1 = ∑_{i=1}^{k} b_i μ_i  and  ω2 = ∑_{i=1}^{k} c_i μ_i

  are said to be orthogonal if ∑_{i=1}^{k} b_i c_i / n_i = 0 or, when the n_i are all equal to n, if ∑_{i=1}^{k} b_i c_i = 0.

  If ω1 and ω2 are orthogonal, then the quantities SSw1 and SSw2 are components of SSA, each with a single degree of freedom. The treatment sum of squares with k − 1 degrees of freedom can be partitioned into at most k − 1 independent single-degree-of-freedom contrast sums of squares satisfying the identity

  SSA = SSw1 + SSw2 + · · · + SSw_{k−1},

  if the contrasts are orthogonal to each other.
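The partitioning identity is easy to illustrate numerically. The sketch below (with invented data, not the textbook's) builds a balanced one-way layout with k = 3 treatments, forms the two orthogonal contrasts ω1 = μ1 − μ2 and ω2 = μ1 + μ2 − 2μ3, and checks that their contrast sums of squares add exactly to SSA:

```python
import numpy as np

rng = np.random.default_rng(7)
k, n = 3, 6                                    # 3 treatments, 6 observations each
# Columns are treatments; each column has its own mean.
data = rng.normal(loc=[5.0, 6.0, 9.0], scale=1.0, size=(n, k))

totals = data.sum(axis=0)                      # treatment totals T_i.
grand = data.sum()                             # grand total
ssa = (totals**2).sum() / n - grand**2 / (n * k)   # treatment sum of squares SSA

def ss_contrast(c, totals, n):
    """Contrast sum of squares, equal sample sizes: (sum c_i T_i.)^2 / (n sum c_i^2)."""
    c = np.asarray(c, dtype=float)
    return (c @ totals) ** 2 / (n * (c**2).sum())

ssw1 = ss_contrast([1, -1, 0], totals, n)      # treatment 1 versus treatment 2
ssw2 = ss_contrast([1, 1, -2], totals, n)      # treatments (1, 2) versus treatment 3
assert np.isclose(ssa, ssw1 + ssw2)            # SSA = SSw1 + SSw2
```

The two coefficient vectors satisfy (1)(1) + (−1)(1) + (0)(−2) = 0, so they are orthogonal, and together they exhaust the k − 1 = 2 treatment degrees of freedom.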

  Example 13.4: Referring to Example 13.1, find the contrast sum of squares corresponding to the orthogonal contrasts

  ω1 = μ1 + μ2 − μ3 − μ5,

  ω2 = μ1 + μ2 + μ3 − 4μ4 + μ5,

  and carry out appropriate tests of significance. In this case, it is of interest a priori to compare the two groups (1, 2) and (3, 5). An important and independent contrast is the comparison between the set of aggregates (1, 2, 3, 5) and aggregate 4.

  Solution: It is obvious that the two contrasts are orthogonal, since

  (1)(1) + (1)(1) + (−1)(1) + (0)(−4) + (−1)(1) = 0.

  The second contrast indicates a comparison between aggregates (1, 2, 3, and 5) and aggregate 4. We can write two additional contrasts orthogonal to the first two, namely

  ω3 = μ1 − μ2 (aggregate 1 versus aggregate 2),

  ω4 = μ3 − μ5 (aggregate 3 versus aggregate 5).


  From the data of Table 13.1, we have

  SSw1 = (3320 + 3416 − 3663 − 3664)² / (6[(1)² + (1)² + (−1)² + (−1)²]) = 14,553,

  SSw2 = [3320 + 3416 + 3663 − (4)(2791) + 3664]² / (6[(1)² + (1)² + (1)² + (−4)² + (1)²]) = 70,035.
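This arithmetic is easy to check in a few lines of Python. Taking the treatment totals of Table 13.1 to be 3320, 3416, 3663, 2791, and 3664 with n = 6 samples per aggregate, the sketch below (the helper name ss_contrast is my own, not from the text) applies the equal-sample-size formula to both contrasts:

```python
# Treatment totals for the five aggregates (read from Table 13.1), n = 6.
totals = [3320, 3416, 3663, 2791, 3664]
n = 6

# Contrast coefficients.
c1 = [1, 1, -1, 0, -1]     # omega_1: (1, 2) versus (3, 5)
c2 = [1, 1, 1, -4, 1]      # omega_2: (1, 2, 3, 5) versus 4

# With equal sample sizes, orthogonality requires sum of b_i * c_i to be zero.
dot = sum(b * g for b, g in zip(c1, c2))

def ss_contrast(c, totals, n):
    """Contrast sum of squares for equal n: (sum c_i T_i.)^2 / (n sum c_i^2)."""
    num = sum(ci * ti for ci, ti in zip(c, totals)) ** 2
    return num / (n * sum(ci ** 2 for ci in c))

ssw1 = ss_contrast(c1, totals, n)   # (-591)^2 / 24   -> about 14,553
ssw2 = ss_contrast(c2, totals, n)   # (2899)^2 / 120  -> about 70,035
```

Dividing each contrast sum of squares by the error mean square then yields the f-values reported in Table 13.5.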

  A more extensive analysis-of-variance table is shown in Table 13.5. We note that the two contrast sums of squares account for nearly all of the aggregate sum of squares. There is a significant difference between aggregates in their absorption properties, and the contrast ω1 is marginally significant. However, the f-value of 14.12 for ω2 is highly significant, and the hypothesis

  H0: μ1 + μ2 + μ3 + μ5 = 4μ4

  is rejected.

  Table 13.5: Analysis of Variance Using Orthogonal Contrasts

  Source of Variation        Sum of Squares   Degrees of Freedom   Mean Square   Computed f
  Aggregates                       85,356              4               21,339        4.30
    ω1: (1, 2) vs. (3, 5)         (14,553)            (1)             14,553        2.93
    ω2: (1, 2, 3, 5) vs. 4        (70,035)            (1)             70,035       14.12
  Error                           124,021             25                4,961
  Total                           209,377             29

  Orthogonal contrasts allow the practitioner to partition the treatment variation into independent components. Normally, the experimenter would have certain contrasts that were of interest to him or her. Such was the case in our example, where a priori considerations suggested that aggregates (1, 2) and (3, 5) constituted distinct groups with different absorption properties, a postulation that was not strongly supported by the significance test. However, the second comparison supported the conclusion that aggregate 4 seemed to “stand out” from the rest. In this case, the complete partitioning of SSA was not necessary, since two of the four possible independent comparisons accounted for a majority of the variation in treatments.

  Figure 13.4 shows a SAS GLM procedure that displays a complete set of orthogonal contrasts. Note that the sums of squares for the four contrasts add to the aggregate sum of squares. Also, note that the latter two contrasts (1 versus 2, 3 versus 5) reveal insignificant comparisons.