Analysis of variance ppt @ bec doms

  1. Analysis of Variance
  2. Chapter Goals
     After completing this chapter, you should be able to:
     - Recognize situations in which to use analysis of variance
     - Understand different analysis of variance designs
     - Perform a single-factor hypothesis test and interpret the results
     - Conduct and interpret post-ANOVA pairwise comparison procedures
     - Set up and perform a randomized blocks analysis
     - Analyze and interpret a two-factor ANOVA test with replications
  3. Chapter Overview
     Analysis of Variance (ANOVA):
     - One-Way ANOVA: F-test; Tukey-Kramer test
     - Randomized Complete Block ANOVA: F-test; Fisher's Least Significant Difference test
     - Two-factor ANOVA with replication
  4. General ANOVA Setting
     - The investigator controls one or more independent variables
       - Called factors (or treatment variables)
       - Each factor contains two or more levels (or categories/classifications)
     - Observe effects on the dependent variable
       - The response to the levels of the independent variable
     - Experimental design: the plan used to test the hypothesis
  5. One-Way Analysis of Variance
     - Evaluates differences among the means of three or more populations
       - Examples: accident rates for 1st, 2nd, and 3rd shifts; expected mileage for five brands of tires
     - Assumptions:
       - Populations are normally distributed
       - Populations have equal variances
       - Samples are randomly and independently drawn
  6. Completely Randomized Design
     - Experimental units (subjects) are assigned randomly to treatments
     - Only one factor or independent variable
       - With two or more treatment levels
     - Analyzed by one-factor analysis of variance (one-way ANOVA)
     - Called a balanced design if all factor levels have equal sample sizes
  7. Hypotheses of One-Way ANOVA
     - H0: μ1 = μ2 = … = μk
       - All population means are equal
       - i.e., no treatment effect (no variation in means among groups)
     - HA: Not all population means are equal
       - At least one population mean is different
       - i.e., there is a treatment effect
       - Does not mean that all population means are different (some pairs may be the same)
  8. One-Factor ANOVA: all means are the same, so the null hypothesis is true (no treatment effect)
  9. One-Factor ANOVA (continued): at least one mean is different, so the null hypothesis is not true (a treatment effect is present)
  10. Partitioning the Variation
      - Total variation can be split into two parts: SST = SSB + SSW
        - SST = Total Sum of Squares
        - SSB = Sum of Squares Between
        - SSW = Sum of Squares Within
  11. Partitioning the Variation (continued)
      SST = SSB + SSW
      - Total variation (SST) = the aggregate dispersion of the individual data values across the various factor levels
      - Between-sample variation (SSB) = dispersion among the factor sample means
      - Within-sample variation (SSW) = dispersion that exists among the data values within a particular factor level
  12. Partition of Total Variation
      Total Variation (SST) = Variation Due to Factor (SSB) + Variation Due to Random Sampling (SSW)
      - SSB is commonly referred to as: Sum of Squares Between, Sum of Squares Among, Sum of Squares Explained, or Among-Groups Variation
      - SSW is commonly referred to as: Sum of Squares Within, Sum of Squares Error, Sum of Squares Unexplained, or Within-Groups Variation
  13. Total Sum of Squares
      SST = Σ_i Σ_j (x_ij − x̄)²    (SST = SSB + SSW)
      Where:
      - SST = total sum of squares
      - k = number of populations (levels or treatments)
      - n_i = sample size from population i
      - x_ij = jth measurement from population i
      - x̄ = grand mean (mean of all data values)
  14. Total Variation (continued)
  15. Sum of Squares Between
      SSB = Σ_i n_i (x̄_i − x̄)²    (SST = SSB + SSW)
      Where:
      - SSB = sum of squares between
      - k = number of populations
      - n_i = sample size from population i
      - x̄_i = sample mean from population i
      - x̄ = grand mean (mean of all data values)
  16. Between-Group Variation: variation due to differences among groups
      Mean Square Between: MSB = SSB / (k − 1)
  17. Between-Group Variation (continued)
  18. Sum of Squares Within
      SSW = Σ_i Σ_j (x_ij − x̄_i)²    (SST = SSB + SSW)
      Where:
      - SSW = sum of squares within
      - k = number of populations
      - n_i = sample size from population i
      - x̄_i = sample mean from population i
      - x_ij = jth measurement from population i
  19. Within-Group Variation: summing the variation within each group and then adding over all groups
      Mean Square Within: MSW = SSW / (N − k)
  20. Within-Group Variation (continued)
  21. One-Way ANOVA Table

      Source of Variation   df      SS                MS                  F ratio
      Between Samples       k − 1   SSB               MSB = SSB/(k − 1)   F = MSB/MSW
      Within Samples        N − k   SSW               MSW = SSW/(N − k)
      Total                 N − 1   SST = SSB + SSW

      k = number of populations; N = sum of the sample sizes from all populations; df = degrees of freedom
  22. One-Factor ANOVA F-Test Statistic
      H0: μ1 = μ2 = … = μk
      HA: At least two population means are different
      - Test statistic: F = MSB / MSW
        - MSB is the mean square between (between-groups variance estimate)
        - MSW is the mean square within (within-groups variance estimate)
      - Degrees of freedom:
        - df1 = k − 1 (k = number of populations)
        - df2 = N − k (N = sum of sample sizes from all populations)
  23. Interpreting the One-Factor ANOVA F Statistic
      - The F statistic is the ratio of the between estimate of variance to the within estimate of variance
        - The ratio must always be positive
        - df1 = k − 1 will typically be small
        - df2 = N − k will typically be large
      - The ratio should be close to 1 if H0: μ1 = μ2 = … = μk is true
      - The ratio will be larger than 1 if H0 is false
  24. One-Factor ANOVA F-Test Example
      You want to see if three different golf clubs yield different distances. You randomly select five measurements from trials on an automated driving machine for each club. At the .05 significance level, is there a difference in mean distance?

      Club 1   Club 2   Club 3
      254      234      200
      263      218      222
      241      235      197
      237      227      206
      251      216      204
  25. One-Factor ANOVA Example: Scatter Diagram
      [Scatter plot of distance (y-axis, 190 to 270) against club (x-axis, Clubs 1 to 3) for the data above]
  26. One-Factor ANOVA Example: Computations
      x̄1 = 249.2, x̄2 = 226.0, x̄3 = 205.8, grand mean x̄ = 227.0
      n1 = n2 = n3 = 5, N = 15, k = 3
      SSB = 5 [ (249.2 − 227)² + (226 − 227)² + (205.8 − 227)² ] = 4716.4
      SSW = (254 − 249.2)² + (263 − 249.2)² + … + (204 − 205.8)² = 1119.6
      MSB = 4716.4 / (3 − 1) = 2358.2
      MSW = 1119.6 / (15 − 3) = 93.3
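The computations above can be sketched in a few lines of NumPy. This implements the SST/SSB/SSW formulas from the earlier slides on the golf-club data; the variable names (`clubs`, `grand_mean`, and so on) are illustrative, not from the original slides.

```python
import numpy as np

clubs = [
    np.array([254, 263, 241, 237, 251]),  # Club 1
    np.array([234, 218, 235, 227, 216]),  # Club 2
    np.array([200, 222, 197, 206, 204]),  # Club 3
]

all_values = np.concatenate(clubs)
grand_mean = all_values.mean()   # 227.0
k = len(clubs)                   # number of groups (3)
N = all_values.size              # total observations (15)

# SSB = sum over groups of n_i * (group mean - grand mean)^2
ssb = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in clubs)

# SSW = sum over groups of the squared deviations from each group's own mean
ssw = sum(((g - g.mean()) ** 2).sum() for g in clubs)

msb = ssb / (k - 1)   # mean square between
msw = ssw / (N - k)   # mean square within
f_stat = msb / msw

print(ssb, ssw, round(f_stat, 3))   # 4716.4 1119.6 25.275
```

The printed values match the slide's hand computation, and `ssb + ssw` reproduces the total sum of squares (5836.0) shown in the Excel output later.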
  27. One-Factor ANOVA Example: Solution
      H0: μ1 = μ2 = μ3
      HA: the μi are not all equal
      α = .05, df1 = 2, df2 = 12
      Critical value: F.05 = 3.885
      Test statistic: F = MSB / MSW = 2358.2 / 93.3 = 25.275
      Decision: Reject H0 at α = .05, since F = 25.275 > F.05 = 3.885
      Conclusion: There is evidence that at least one μi differs from the rest
  28. ANOVA -- Single Factor: Excel Output
      EXCEL: Tools | Data Analysis | ANOVA: Single Factor

      SUMMARY
      Groups   Count   Sum    Average   Variance
      Club 1   5       1246   249.2     108.2
      Club 2   5       1130   226.0     77.5
      Club 3   5       1029   205.8     94.2

      ANOVA
      Source of Variation   SS       df   MS       F        P-value    F crit
      Between Groups        4716.4   2    2358.2   25.275   4.99E-05   3.885
      Within Groups         1119.6   12   93.3
      Total                 5836.0   14
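The same output can be cross-checked outside Excel. As a sketch, SciPy's standard one-way ANOVA routine `scipy.stats.f_oneway` reproduces the F statistic and p-value from the table above:

```python
from scipy import stats

club1 = [254, 263, 241, 237, 251]
club2 = [234, 218, 235, 227, 216]
club3 = [200, 222, 197, 206, 204]

f_stat, p_value = stats.f_oneway(club1, club2, club3)
print(round(f_stat, 3), p_value)   # F matches the Excel output (about 25.275)
```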
  29. The Tukey-Kramer Procedure
      - Tells which population means are significantly different
        - e.g.: μ1 = μ2 ≠ μ3
        - Done after rejecting equal means in ANOVA
      - Allows pairwise comparisons
        - Compare each absolute mean difference with a critical range
  30. Tukey-Kramer Critical Range
      Critical Range = q_α √[ (MSW / 2) (1/n_i + 1/n_j) ]
      where:
      - q_α = value from the standardized range table with k and N − k degrees of freedom for the desired level of α
      - MSW = mean square within
      - n_i and n_j = sample sizes from populations (levels) i and j
  31. The Tukey-Kramer Procedure: Example
      Using the golf-club data:
      1. Compute the absolute mean differences:
         |x̄1 − x̄2| = |249.2 − 226.0| = 23.2
         |x̄1 − x̄3| = |249.2 − 205.8| = 43.4
         |x̄2 − x̄3| = |226.0 − 205.8| = 20.2
      2. Find the q value from the table in Appendix J with k and N − k degrees of freedom for the desired level of α
  32. The Tukey-Kramer Procedure: Example (continued)
      3. Compute the critical range: Critical Range = q_α √[ (MSW / 2) (1/n_i + 1/n_j) ]
      4. Compare each absolute mean difference with the critical range
      5. All of the absolute mean differences are greater than the critical range. Therefore there is a significant difference between each pair of means at the 5% level of significance.
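A sketch of these steps for the golf-club example, taking the q value from SciPy's studentized-range distribution instead of a printed table (`scipy.stats.studentized_range` requires SciPy 1.7 or later):

```python
import itertools
import math
from scipy.stats import studentized_range

means = {"Club 1": 249.2, "Club 2": 226.0, "Club 3": 205.8}
n = 5          # observations per club
k = 3          # number of groups
N = 15         # total observations
msw = 93.3     # mean square within, from the ANOVA table
alpha = 0.05

q = studentized_range.ppf(1 - alpha, k, N - k)   # q_alpha for k groups, N - k df
critical_range = q * math.sqrt((msw / 2) * (1 / n + 1 / n))

for (a, ma), (c, mc) in itertools.combinations(means.items(), 2):
    diff = abs(ma - mc)
    verdict = ">" if diff > critical_range else "<="
    print(f"|{a} - {c}| = {diff:.1f}  {verdict}  critical range {critical_range:.2f}")
```

All three differences (23.2, 43.4, 20.2) exceed the critical range of roughly 16.3, matching the slide's conclusion that every pair of means differs at the 5% level.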
  33. Tukey-Kramer in PHStat
  34. Randomized Complete Block ANOVA
      - Like one-way ANOVA, we test for equal population means (for different factor levels, for example)...
      - ...but we want to control for possible variation from a second factor (with two or more levels)
      - Used when more than one factor may influence the value of the dependent variable, but only one is of key interest
      - Levels of the secondary factor are called blocks
  35. Partitioning the Variation
      - Total variation can now be split into three parts: SST = SSB + SSBL + SSW
        - SST = total sum of squares
        - SSB = sum of squares between factor levels
        - SSBL = sum of squares between blocks
        - SSW = sum of squares within levels
  36. Sum of Squares for Blocking
      SSBL = k Σ_j (x̄_j − x̄)²    (SST = SSB + SSBL + SSW)
      Where:
      - k = number of levels for this factor
      - b = number of blocks
      - x̄_j = sample mean from the jth block
      - x̄ = grand mean (mean of all data values)
  37. Partitioning the Variation (continued)
      SST = SSB + SSBL + SSW
      - SST and SSB are computed as they were in one-way ANOVA
      - SSW = SST − (SSB + SSBL)
  38. Mean Squares
      MSB = SSB / (k − 1)
      MSBL = SSBL / (b − 1)
      MSW = SSW / [(k − 1)(b − 1)]
  39. Randomized Block ANOVA Table

      Source of Variation   df               SS     MS     F ratio
      Between Blocks        b − 1            SSBL   MSBL   MSBL/MSW
      Between Samples       k − 1            SSB    MSB    MSB/MSW
      Within Samples        (k − 1)(b − 1)   SSW    MSW
      Total                 N − 1            SST

      k = number of populations; N = sum of the sample sizes from all populations; b = number of blocks; df = degrees of freedom
  40. Blocking Test
      F = MSBL / MSW
      df1 = b − 1, df2 = (k − 1)(b − 1)
      Reject H0 if F > F_α
  41. Main Factor Test
      F = MSB / MSW
      df1 = k − 1, df2 = (k − 1)(b − 1)
      Reject H0 if F > F_α
  42. Fisher's Least Significant Difference Test
      - Tests which population means are significantly different
        - e.g.: μ1 = μ2 ≠ μ3
        - Done after rejecting equal means in a randomized block ANOVA design
      - Allows pairwise comparisons
        - Compare each absolute mean difference with a critical range
  43. Fisher's Least Significant Difference (LSD) Test
      LSD = t_{α/2} √(MSW × 2 / b)
      where:
      - t_{α/2} = upper-tailed value from Student's t-distribution for α/2 and (k − 1)(b − 1) degrees of freedom
      - MSW = mean square within from the ANOVA table
      - b = number of blocks
      - k = number of levels of the main factor
  44. Fisher's Least Significant Difference (LSD) Test (continued)
      Compare: |x̄_i − x̄_j| to LSD
      If the absolute mean difference is greater than the LSD, then there is a significant difference between that pair of means at the chosen level of significance.
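A sketch of Fisher's LSD using SciPy's t-distribution. The MSW, k, b, and level means below are illustrative placeholders, not values from the slides:

```python
import itertools
import math
from scipy.stats import t

msw = 0.56   # mean square within, from a randomized block ANOVA table
k = 3        # number of factor levels
b = 4        # number of blocks
alpha = 0.05

df = (k - 1) * (b - 1)
t_crit = t.ppf(1 - alpha / 2, df)          # upper-tail t value for alpha/2
lsd = t_crit * math.sqrt(msw * 2 / b)      # LSD = t_{alpha/2} * sqrt(MSW * 2/b)

level_means = {"Level 1": 38.5, "Level 2": 34.0, "Level 3": 30.5}
for (a, ma), (c, mc) in itertools.combinations(level_means.items(), 2):
    diff = abs(ma - mc)
    print(f"|{a} - {c}| = {diff:.2f}  vs  LSD = {lsd:.2f}")
```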
  45. Two-Way ANOVA
      - Examines the effect of:
        - Two or more factors of interest on the dependent variable
          - e.g.: percent carbonation and line speed on a soft-drink bottling process
        - Interaction between the different levels of these two factors
          - e.g.: does the effect of one particular percentage of carbonation depend on the level at which the line speed is set?
  46. Two-Way ANOVA (continued)
      - Assumptions:
        - Populations are normally distributed
        - Populations have equal variances
        - Independent random samples are drawn
  47. Two-Way ANOVA Sources of Variation
      Two factors of interest: A and B
      - a = number of levels of factor A
      - b = number of levels of factor B
      - N = total number of observations in all cells
  48. Two-Way ANOVA Sources of Variation (continued)
      SST = SSA + SSB + SSAB + SSE

      Source                                SS     Degrees of Freedom
      Factor A                              SSA    a − 1
      Factor B                              SSB    b − 1
      Interaction between A and B           SSAB   (a − 1)(b − 1)
      Error (inherent variation)            SSE    N − ab
      Total                                 SST    N − 1
  49. Two-Factor ANOVA Equations
      Total sum of squares: SST = Σ_i Σ_j Σ_k (x_ijk − x̄)²
      Sum of squares, factor A: SSA = b·n′ Σ_i (x̄_i − x̄)²
      Sum of squares, factor B: SSB = a·n′ Σ_j (x̄_j − x̄)²
  50. Two-Factor ANOVA Equations (continued)
      Sum of squares, interaction between A and B: SSAB = n′ Σ_i Σ_j (x̄_ij − x̄_i − x̄_j + x̄)²
      Sum of squares, error: SSE = Σ_i Σ_j Σ_k (x_ijk − x̄_ij)²
  51. Two-Factor ANOVA Equations (continued)
      where:
      - x̄ = grand mean; x̄_i = mean of the ith level of factor A; x̄_j = mean of the jth level of factor B; x̄_ij = mean of the cell at level i of factor A and level j of factor B
      - a = number of levels of factor A
      - b = number of levels of factor B
      - n′ = number of replications in each cell
  52. Mean Square Calculations
      MS_A = SSA / (a − 1)
      MS_B = SSB / (b − 1)
      MS_AB = SSAB / [(a − 1)(b − 1)]
      MSE = SSE / (N − ab)
  53. Two-Way ANOVA: The F Test Statistics
      F test for factor A main effect:
      - H0: μA1 = μA2 = μA3 = …   HA: not all μAi are equal
      - F = MS_A / MSE; reject H0 if F > F_α
      F test for factor B main effect:
      - H0: μB1 = μB2 = μB3 = …   HA: not all μBi are equal
      - F = MS_B / MSE; reject H0 if F > F_α
      F test for interaction effect:
      - H0: factors A and B do not interact to affect the mean response; HA: factors A and B do interact
      - F = MS_AB / MSE; reject H0 if F > F_α
  54. Two-Way ANOVA Summary Table

      Source of Variation   Sum of Squares   Degrees of Freedom   Mean Squares                      F Statistic
      Factor A              SSA              a − 1                MS_A = SSA / (a − 1)              MS_A / MSE
      Factor B              SSB              b − 1                MS_B = SSB / (b − 1)              MS_B / MSE
      AB (Interaction)      SSAB             (a − 1)(b − 1)       MS_AB = SSAB / [(a − 1)(b − 1)]   MS_AB / MSE
      Error                 SSE              N − ab               MSE = SSE / (N − ab)
      Total                 SST              N − 1
  55. Features of the Two-Way ANOVA F Tests
      - Degrees of freedom always add up
        - N − 1 = (a − 1) + (b − 1) + (a − 1)(b − 1) + (N − ab)
        - Total = factor A + factor B + interaction + error
      - The denominator of each F test is always the same (MSE), but the numerator differs
      - The sums of squares always add up
        - SST = SSA + SSB + SSAB + SSE
        - Total = factor A + factor B + interaction + error
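The two-factor sums of squares and their additivity can be sketched on a tiny made-up 2×2 design with n′ = 2 replications per cell (the data values are illustrative only):

```python
import numpy as np

# data[i, j, r] = replication r at factor-A level i, factor-B level j
data = np.array([
    [[10.0, 12.0], [20.0, 22.0]],   # A level 1: (B level 1), (B level 2)
    [[14.0, 16.0], [28.0, 30.0]],   # A level 2: (B level 1), (B level 2)
])
a, b, n_rep = data.shape
N = data.size

grand = data.mean()
mean_a = data.mean(axis=(1, 2))   # one mean per level of factor A
mean_b = data.mean(axis=(0, 2))   # one mean per level of factor B
cell = data.mean(axis=2)          # one mean per (A, B) cell

sst = ((data - grand) ** 2).sum()
ssa = b * n_rep * ((mean_a - grand) ** 2).sum()
ssb = a * n_rep * ((mean_b - grand) ** 2).sum()
ssab = n_rep * ((cell - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
sse = ((data - cell[:, :, None]) ** 2).sum()

mse = sse / (N - a * b)
print("SST =", sst, " SSA + SSB + SSAB + SSE =", ssa + ssb + ssab + sse)
print("F_A =", (ssa / (a - 1)) / mse)   # factor A main-effect F statistic
```

The two printed sums agree, illustrating the additivity property above; the degrees of freedom add up the same way by construction.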
  56. Examples: Interaction vs. No Interaction
      [Two line plots of mean response against factor A levels (1 and 2), one line per factor B level (1 to 3): with no interaction the lines are parallel; when interaction is present the lines are not parallel]
