Transcript

  • 1. 1 Design of Experiments Simplified (DOES) Presented By: Juanito S. Chan, PIE
  • 2. 2 Rationale  In any manufacturing setting, processes are governed by different variables that can affect overall process performance. It is therefore necessary to identify these critical process variables and control them in order to ensure a smooth process and achieve optimal manufacturing operations.
  • 3. 3 Objectives  Identify and use the appropriate hypothesis test for comparing two sets of data  Identify and use the appropriate design for comparing more than two data sets  Design efficient experiments  Quantify the effects of factors in the experiment  Identify presence of interactions among the different process variables  Predict response as a function of the levels of the factors
  • 4. 4 Course Outline  Introduction  Learning Process  Role of Experimental Design  Guidelines for Designing Experiments  How to Use Statistical Techniques  Simple Comparative Experiments  Inferences about the Difference in Means  Inferences About the Variance  Inferences About Proportions  Contingency Tables and Test of Associations
  • 5. 5 Course Outline (Cont…)  Multiple Factor Experiments  Introduction to Factorial Designs  The 2^k Factorial Design  The General 2^(k-p) Fractional Factorial Designs
  • 6. 6 Improvement Horizons  (diagram: a ladder of improvement levels leading up to the Customers) Product Control (Detection) → Process Control (Prevention) → Process Optimization (Rectification) → Design Optimization (Elimination)
  • 7. 7 DOE Potential Accomplishments  (diagram) DOE can:  Enhance common cause variability management: maintain processes on target, reduce variability, increase yield  Enhance robustness of processes and products: less sensitive to environmental variation, to internal component variation, and to “piece-to-piece” variation  Identify complex special causes: applicable when special causes involve interactions of several variables
  • 8. 8 SPC/DOE Roles in Improvement Efforts  SPC: passive observation; informed observation of naturally occurring informative events; increased probability of observing key events when they naturally occur  DOE: participative observation; perturbing the process to invite the occurrence of informative events; increased probability of key events occurring
  • 9. 9 Learning Process  (diagram) Learning alternates between data (facts, phenomena) and hypothesis (conjecture, model, theory): induction carries data up to a hypothesis, deduction carries the hypothesis back down to new data, and the cycle repeats
  • 10. 10 How to Use Statistical Techniques  Use your non-statistical knowledge of the problem  Keep the design and analysis as simple as possible  Recognize the difference between practical and statistical significance  Experiments are usually iterative
  • 11. 11 Three R’s of DOE  Randomization  sequence of experiments and/or the assignment of specimens to various treatment combinations in a purely chance manner  Replication  implies two or more runs were conducted under the same test conditions, each run following a new set-up or resetting of the conditions  Repetition  obtaining more than one measurement or unit of output for each run
  • 12. 12  Obtain a clear statement of the problem  Identify the problem area in quantitative terms  Identify the response(s) to be measured, the factors that may be varied, the factors to be held constant, and the factors that cannot be controlled  Identify the ranges or limitations of the measurements and of the experimental factors  Collect available background information  Investigate all available sources of information  Tabulate data pertinent to planning the experimental program  Be quantitative Checklist for Planning Experiments
  • 13. 13 Checklist for Planning Experiments  Design the experimental program  Hold a conference of all parties concerned  State the propositions to be explored  Agree on magnitude of differences in the response considered worthwhile  Outline possible alternative outcomes  Choose the factors to be studied  Determine practical range of factors and specify levels  Choose the measurements and methods of measurement  Consider the effect of sampling variability and of precision of the measurement methods  Consider possible interrelationships of the factors  Determine influences of time, cost, materials, manpower, instrumentation, and other facilities and of extraneous conditions such as weather  Consider personnel and human relations requirements of the program
  • 14. 14  Design the experimental program in preliminary form  Prepare a systematic and inclusive schedule, which includes the randomization pattern  Provide for stepwise performance or adaptation of schedule if necessary  Eliminate effect of variables not under study by controlling, balancing, or randomization  Minimize the number of experimental runs consistent with objectives  Choose the method of statistical analysis  Arrange for orderly accumulation of data  Review the experimental design program with all concerned  Adjust the program as required  Spell out the steps to be followed in unmistakable terms Checklist for Planning Experiments
  • 15. 15  Plan and carry out the experimental work  Develop methods, materials, and equipment  Carry out the experimental design in some random order  Record ancillary data  Record any modifications of the experimental design  Take precautions in the collection and recording of data, especially data from extra experiments and missing experiments  Record progress of the program by date, run number, and other ancillary data  Analyze the data  Review the data with attention to recording errors, omissions, etc.  Use graphics: plot the data, plot averages, plot simple graphs  Apply appropriate statistical techniques Checklist for Planning Experiments
  • 16. 16  Interpret the results  Consider all the observed data  Confine initial conclusions to strict deductions from the experimental evidence at hand  Elucidate the analysis in both graphical and numerical terms  State results in terms of verifiable probabilities  Arrive at conclusions as to the technical meaning of results as well as their statistical significance  Point out implications of the findings for application and for further work  Account for any limitations imposed by the data or by the methods of analysis used Checklist for Planning Experiments
  • 17. 17  Prepare the report  Describe work clearly, giving background, pertinence of problems, meaning of results  Use tabular and graphic methods of presenting data, and consider their possible future use  Supply sufficient information to permit readers to verify results and to draw their own conclusions  Limit conclusion to objective summary of evidence Checklist for Planning Experiments
  • 18. 18 DOE Vocabulary  Factor  One of the independent variables under investigation that can be set to a desired value  k  the number of factors or variables, the effects of which are to be estimated in an experiment  Level  the numerical value or qualitative feature of a factor  Run  the act of operating the process with the factors at certain settings  Treatment  specific combination of the levels of all factors for a given test or run  Response  the numerical result of a run
  • 19. 19 DOE Vocabulary  Experimental error  the amount of variability that may be expected in the experimental environment just by chance without any changing of the factors being investigated  Main Effect  the average influence on the response as a variable changes levels  Interaction Effect  the average difference in the effect on a response of one variable dependent upon the settings of another variable  MSFE  Minimum Significant Factor Effect is the minimum absolute value of an effect which may be considered a significant result  Factorial experiment  designed to determine the effect of all possible combinations across all levels of the factors under study  Fractional Factorial  designed to examine k factors with a fraction of the runs required for a full factorial
  • 20. 20 DOE Vocabulary  Confounding  the consequences of conducting a fractional factorial design  Blocking  a strategy for designing experiments to provide the ability to eliminate from the experimental error a contributor of variability that is known but not under investigation  Robust  the quality of a process or output being little affected by environmental or internal component variation  Noise  refers to variability, frequently uncontrollable or random variability in experimental design work  ANOVA  a mathematical procedure testing for significant differences between or among groups
  • 21. 21 Strategies for Designing Experiments  Screening stage: highly fractionalized factorials; many potentially important variables; few tests  Knowledge Building stage: full factorials or high-resolution fractional factorials; a few important variables emerge  Optimization stage: replicated designs, full factorials, Evolutionary Operation, Response Surface Methodology; best settings are determined
  • 22. 22 Simple Comparative Experiments
  • 23. 23 Types of Errors  Type I Error (α risk)  Reject the hypothesis when it is true  Type II Error (β risk)  Accept the hypothesis when it is false
  • 24. 24 Steps in Hypothesis Testing  State the hypothesis  Choose the Type I error  Choose the test statistic for testing the hypothesis  Determine the acceptance region for the test  Obtain the sample of observations, compute the test statistic, and compare the value to the acceptance region to make a decision to accept or reject the hypothesis  Draw an engineering conclusion
  • 25. 25 Summary Table of Hypotheses Tests
    Test 1. The mean of a population is equal to µ0 (Ho: µ = µ0)
    Z = (x̄ - µ0) / (σ/√n): standard deviation of the population is known; normal distribution; X is distribution-free but should be continuous and have only one mode
    t = (x̄ - µ0) / (s/√n): standard deviation of the population is estimated by the sample s; X is normally distributed; t distribution with df = n - 1
  • 26. 26 Summary Table of Hypotheses Tests
    Test 2. The means of two populations are equal (Ho: µ1 = µ2)
    Z = (x̄1 - x̄2) / √(σ1²/n1 + σ2²/n2): standard deviations of the populations are known; normal distribution; X1 and X2 are distribution-free but should be continuous and have only one mode; if the populations are not normally distributed, sample sizes n1 and n2 should be large so that the sampling distribution of Z is approximately normal
    t = (x̄1 - x̄2) / (sp √(1/n1 + 1/n2)), with sp² = [(n1 - 1)s1² + (n2 - 1)s2²] / (n1 + n2 - 2): standard deviations of the populations are estimated by samples s1 and s2, assuming σ1 = σ2; X1 and X2 are normally distributed; t distribution with df = n1 + n2 - 2
  • 27. 27 Summary Table of Hypotheses Tests
    Test 2 (continued). The means of two populations are equal (Ho: µ1 = µ2)
    t = (x̄1 - x̄2) / √(s1²/n1 + s2²/n2): standard deviations of the populations are estimated by samples s1 and s2, with no assumption that σ1 = σ2; X1 and X2 are normally distributed; t distribution with df = min(n1 - 1, n2 - 1)
    t = d̄ / (sd/√n): data are taken in n pairs and the difference d within each pair is calculated; populations are normally distributed; t distribution with df = n - 1
  • 28. 28 Summary Table of Hypotheses Tests
    Test 3. The standard deviation of a population is equal to σ0 (Ho: σ = σ0)
    χ² = (n - 1)s² / σ0²: population is normally distributed; standard deviation of the population is estimated by the sample s; chi-square distribution with df = n - 1
  • 29. 29 Summary Table of Hypotheses Tests
    Test 4. The standard deviations of two populations are equal (Ho: σ1² = σ2²)
    F = s1² / s2²: populations are normally distributed; standard deviations of the populations are estimated by samples s1 and s2; F distribution with df1 = n1 - 1, df2 = n2 - 1
  • 30. 30 Summary Table of Hypotheses Tests
    Test 5. The proportion of a population exhibiting a certain characteristic is p0 (Ho: p = p0)
    Z = (X - n p0) / √(n p0 (1 - p0)): proportion of the population is estimated by the sample proportion; n > 100, only for large sample sizes; normal distribution
  • 31. 31 Summary Table of Hypotheses Tests
    Test 6. The proportions in two populations are equal (Ho: p1 = p2)
    Z = (X1/n1 - X2/n2) / √(p̂(1 - p̂)(1/n1 + 1/n2)), where p̂ = (X1 + X2) / (n1 + n2): proportions in the populations are estimated by the sample proportions; np > 5 for each population, and sample sizes n1 and n2 must be large so that the sampling distribution of Z is approximately normal; normal distribution
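To make Test 6 concrete, here is a minimal sketch in plain Python; the defect counts and sample sizes below are hypothetical illustrations, not data from the slides.

```python
# Hypothetical counts (illustrative only): X items with the characteristic out of n
x1, n1 = 40, 200    # sample from population 1
x2, n2 = 25, 200    # sample from population 2

p1, p2 = x1 / n1, x2 / n2
p_hat = (x1 + x2) / (n1 + n2)            # pooled proportion estimate

# Test 6 statistic: z = (p1 - p2) / sqrt(p_hat (1 - p_hat)(1/n1 + 1/n2))
se = (p_hat * (1 - p_hat) * (1 / n1 + 1 / n2)) ** 0.5
z0 = (p1 - p2) / se

reject = abs(z0) > 1.96                  # two-sided test at alpha = 0.05
print(f"z0 = {z0:.3f}, reject H0: {reject}")
```

With these illustrative counts z0 is about 2.03, just past the 1.96 cutoff, so the two proportions would be judged different at the 5 percent level.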
  • 32. 32 Example #1:  The shelf life of a carbonated beverage is of interest. Ten bottles are randomly selected and tested, and the following results are obtained: Days: 108, 124, 124, 106, 115, 138, 163, 159, 134, 139 Assume that the alternative hypothesis is that the mean shelf life is greater than 125 days. Can the null hypothesis H0: µ = 125 be rejected?
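As a check on this example, a minimal sketch of Test 1 (σ unknown) using only the Python standard library; the one-sided critical value t(0.05, 9) = 1.833 is taken from Table 1 later in the deck.

```python
import statistics

# Example #1 data: shelf life (days) of ten randomly selected bottles
days = [108, 124, 124, 106, 115, 138, 163, 159, 134, 139]

n = len(days)
xbar = statistics.mean(days)      # sample mean (131.0)
s = statistics.stdev(days)        # sample standard deviation

# Test 1 with sigma unknown: t = (xbar - mu0) / (s / sqrt(n)), df = n - 1
t0 = (xbar - 125) / (s / n ** 0.5)

reject = t0 > 1.833               # one-sided critical value t(0.05, 9)
print(f"t0 = {t0:.3f}, reject H0: {reject}")
```

Here t0 is roughly 0.97, well below 1.833, so H0: µ = 125 cannot be rejected in favor of µ > 125.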
  • 33. 33 Exercise #1:  The time to repair an electronic instrument is a normally distributed random variable measured in hours. The repair times for 16 such instruments chosen at random are as follows: Hours 159, 280, 101, 212, 224, 379, 179, 264 222, 362, 168, 250, 149, 260, 485, 170 Does it seem reasonable that the true mean repair time is greater than 225 hours?
  • 34. 34 Example #2  A chemical engineer is investigating the inherent variability of two types of test equipment that can be used to monitor the output of a production process. He suspects that the old equipment, type 1, has a larger variance than the new one. Thus, he wishes to test the hypothesis Ho: σ1² = σ2² against Ha: σ1² > σ2². Two random samples of n1 = 12 and n2 = 10 observations are taken, and the sample variances are s1² = 14.5 and s2² = 10.8
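A quick sketch of this F-test (Test 4); the critical value of about 3.10 for F(0.05; 11, 9) is an assumption obtained by interpolating between the v1 = 10 and v1 = 12 columns of Table 3.

```python
# Example #2: F-test for Ho: var1 = var2 vs Ha: var1 > var2 (Test 4)
s1_sq, n1 = 14.5, 12   # old equipment (type 1)
s2_sq, n2 = 10.8, 10   # new equipment (type 2)

f0 = s1_sq / s2_sq     # F statistic with df1 = n1 - 1 = 11, df2 = n2 - 1 = 9

# Approximate critical value F(0.05; 11, 9) ~ 3.10 (interpolated from Table 3)
reject = f0 > 3.10
print(f"f0 = {f0:.3f}, reject H0: {reject}")
```

The ratio is about 1.34, far below the critical value, so there is no evidence that the old equipment has a larger variance.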
  • 35. 35 Exercise #2  A machined engine part produced by a company is claimed to have a diameter variance no larger than .0002 in². A random sample of 10 parts gave a sample variance of .0003 in². Test at the 5 percent level Ho: σ² = .0002 against Ha: σ² > .0002.
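A sketch of the chi-square variance test (Test 3) for this exercise; the critical value χ²(0.05, 9) = 16.919 comes from Table 4.

```python
# Exercise #2: chi-square test for Ho: var = .0002 vs Ha: var > .0002 (Test 3)
n = 10
s_sq = 0.0003          # sample variance
sigma0_sq = 0.0002     # claimed (hypothesized) variance

# Test 3 statistic: chi2 = (n - 1) s^2 / sigma0^2, df = n - 1 = 9
chi2_0 = (n - 1) * s_sq / sigma0_sq

reject = chi2_0 > 16.919   # critical value chi2(0.05, 9) from Table 4
print(f"chi2 = {chi2_0:.2f}, reject H0: {reject}")
```

The statistic is 13.5, below 16.919, so the claim of variance no larger than .0002 cannot be rejected at the 5 percent level.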
  • 36. 36 Example #3  Two machines are used for filling plastic bottles with a net volume of 16.0 ounces. The filling processes can be assumed to be normal, with known standard deviations of σ1 = 0.015 and σ2 = 0.018. The quality engineering department suspects that both machines fill to the same net volume, whether or not this volume is 16.0 ounces. A random sample is taken from the output of each machine.
    Machine 1: 16.03 16.04 16.05 16.05 16.02 16.01 15.66 15.98 16.02 15.99
    Machine 2: 16.02 15.97 15.96 16.01 15.99 16.03 16.04 16.02 16.01 16.00
    Do you think the quality engineering department is correct?
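Since the process standard deviations are treated as known, this is the Z form of Test 2. A sketch using the data exactly as transcribed in the slide:

```python
import statistics

# Example #3 data, as transcribed
m1 = [16.03, 16.04, 16.05, 16.05, 16.02, 16.01, 15.66, 15.98, 16.02, 15.99]
m2 = [16.02, 15.97, 15.96, 16.01, 15.99, 16.03, 16.04, 16.02, 16.01, 16.00]
sigma1, sigma2 = 0.015, 0.018    # known process standard deviations

xbar1, xbar2 = statistics.mean(m1), statistics.mean(m2)

# Test 2 with sigmas known: z = (xbar1 - xbar2) / sqrt(s1^2/n1 + s2^2/n2)
se = (sigma1 ** 2 / len(m1) + sigma2 ** 2 / len(m2)) ** 0.5
z0 = (xbar1 - xbar2) / se

reject = abs(z0) > 1.96          # two-sided test at alpha = 0.05
print(f"z0 = {z0:.2f}, reject H0: {reject}")
```

With these transcribed values z0 is about -2.70, so H0: µ1 = µ2 is rejected at the 5 percent level.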
  • 37. 37 Exercise #3  An article in Solid State Technology, “Orthogonal Design for Process Optimization and Its Application to Plasma Etching” by G.Z. Yin and D.W. Jillie (May 1987), describes an experiment to determine the effect of the C2F6 flow rate on the uniformity of the etch on a silicon wafer used in integrated circuit manufacturing. Data for two flow rates are as follows:
    C2F6 Flow   Uniformity Observations
    125         2.7  4.6  2.6  3.0  3.2  3.8
    200         4.6  3.4  2.9  3.5  4.1  5.1
    Does the C2F6 flow rate affect the wafer-to-wafer variability and average etch uniformity?
  • 38. 38 Example #4  An article in the Journal of Strain Analysis (vol. 18, no. 2, 1983) compares several procedures for predicting the shear strength of steel plate girders. Data for nine girders in the form of the ratio of predicted to observed load for two of these procedures, the Karlsruhe and Lehigh methods, are as follows:
    Girder  1      2      3      4      5      6      7      8      9
    KM      1.186  1.151  1.322  1.339  1.200  1.402  1.365  1.537  1.559
    LM      1.061  0.992  1.063  1.062  1.065  1.178  1.037  1.086  1.052
    Is there any difference between the two methods?
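Because both methods are applied to the same nine girders, the paired t-test from the summary table applies. A sketch using the slide's data; the two-sided critical value t(0.025, 8) = 2.306 comes from Table 1.

```python
import statistics

# Example #4 data: predicted/observed load ratios for the same nine girders
karlsruhe = [1.186, 1.151, 1.322, 1.339, 1.200, 1.402, 1.365, 1.537, 1.559]
lehigh    = [1.061, 0.992, 1.063, 1.062, 1.065, 1.178, 1.037, 1.086, 1.052]

d = [k - l for k, l in zip(karlsruhe, lehigh)]   # within-girder differences
dbar = statistics.mean(d)
sd = statistics.stdev(d)
n = len(d)

# Paired t-test: t = dbar / (sd / sqrt(n)), df = n - 1 = 8
t0 = dbar / (sd / n ** 0.5)

reject = abs(t0) > 2.306          # two-sided critical value t(0.025, 8)
print(f"t0 = {t0:.2f}, reject H0: {reject}")
```

The statistic is about 6.08, far beyond 2.306, so the two methods give significantly different mean ratios.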
  • 39. 39 Exercise #4  The diameter of a ball bearing was measured by 12 inspectors, each using two different kinds of calipers. The results were Insp 1 2 3 4 5 6 7 8 9 10 11 12 Caliper 1 0.265 0.265 0.266 0.267 0.267 0.265 0.267 0.267 0.265 0.268 0.268 0.265 Caliper 2 0.264 0.265 0.264 0.266 0.267 0.268 0.264 0.265 0.265 0.267 0.268 0.269 Is there a significant difference between the means of the population of measurements represented by the two samples? Use α = .05
  • 40. 40 Statistical Tables
  • 41. 41 Table 1: t distribution  Critical values t(α, v) with α = 0.1, 0.05, 0.025, 0.01, 0.005, 0.0025 across the columns and v = 1 to 30 down the rows; selected entries: t(0.05, 9) = 1.833, t(0.025, 8) = 2.306, t(0.05, 15) = 1.753
  • 42. 42 Table 2: z distribution  Cumulative standard normal probabilities Φ(z) for z = 0.00 to 2.09 in steps of 0.01; e.g., Φ(1.96) = 0.97500
  • 43. 43 Table 2: z distribution (cont…)  Φ(z) for z = 2.10 to 4.09; e.g., Φ(2.58) = 0.99506
  • 44. 44 Table 3: f distribution  Upper 5% critical values F(0.05; v1, v2) with numerator df v1 = 1 to 120 across the columns and denominator df v2 = 1 to 20 down the rows; e.g., F(0.05; 2, 15) = 3.68
  • 45. 45 Table 3: f distribution (cont…)  F(0.05; v1, v2) for v2 = 21 to 120; e.g., F(0.05; 1, 30) = 4.17
  • 46. 46 Table 4: Chi-square Dist.  Critical values χ²(α, v) with α = 0.995, 0.99, 0.975, 0.95, 0.05, 0.025, 0.01, 0.005 across the columns and v = 1 to 30 down the rows; e.g., χ²(0.05, 9) = 16.919
  • 47. 47 B vs C Experiment
  • 48. 48  A quick and simple non-parametric method of comparing two samples and deciding whether one is better than the other  B is the code used for “Better” and C is the code for “Current” What is B vs. C?
  • 49. 49  Choose an acceptable level of risk (α)  Decide on Sample Sizes for B and C Tests  Randomize and Conduct the Tests  Rank Order the Results  Decision Rule  No Overlap  With Overlap (End-Count) Technique General Procedure: B vs. C
  • 50. 50  Choose an acceptable level of risk (α)  Decide on Sample Sizes for B and C Tests (based on the Table for the No Overlap Rule)  Randomize and Conduct the Tests  Rank Order the Results  Decision Rule  SIMPLE RULE: An entity is better than the other if all its readings outrank all readings of the other entity. B vs. C: No Overlap Rule
  • 51. 51 B vs. C: Table for No Overlap Rule
  • 52. 52  A process change on a tuning coil was expected to increase yield. It was decided to list yields of three C lots and four B lots. It was stated that the proposed change would only be put into effect if all B yields outranked all C yields  C: 93.6 93.8 92.5  B: 94.1 94.3 93.7 94.2 Exercise 1
  • 53. 53  Choose an acceptable level of risk (α)  Decide on Sample Sizes for B and C Tests  Tukey’s Overlap End-Count Technique:  Sample sizes are larger, generally 10 or more for each  If sample sizes are unequal, the ratio of nB:nC should be no more than 3:4  Randomize and Conduct the Tests  Rank Order the Results  Decision Rule: conclude a real difference if the end count meets or exceeds the critical value for the chosen risk (α-risk 0.10, 0.05, 0.01, 0.001 → End Count ≥ 6, 7, 10, 13). B vs. C: With Overlap Rule
  • 54. 54 What is an “END-COUNT”?  Rank all results from high to low, e.g. BBBBBB | CBBBCCCCBBCB | CCCCCCCC, where the middle region is the OVERLAP. Count the unbroken run at the “high” end and the unbroken run at the “low” end: END-COUNT = #high end + #low end = 6 + 8 = 14. B vs. C: With Overlap Rule
  • 55. 55 What is an “END-COUNT”?  More examples: for the ranking CCCCCC | BBBBCCCCBBCC | BBBBBBB, END-COUNT = 6 + 7 = 13; for the ranking CCCCCC | BBBBCBBCBBCCBBBCCCC, there is no unbroken run at the “low” end, so END-COUNT = 6 + 0 = 6. B vs. C: With Overlap Rule
  • 56. 56  In the fabrication of a 64K RAM, a B vs C test was run to see whether substrates produced in a high-oxygen atmosphere (B) would improve yields over standard substrates produced in a room atmosphere (C). Twelve C and thirteen B samples were selected and processed in random order  C: 95.6 92.5 98.5 94.6 95.8 88.3 94.1 90.5 97.5 94.9 93.7 90.0  B: 96.7 91.2 99.2 98.6 97.0 99.4 93.6 97.2 94.5 93.2 99.3 90.4 98.2 Exercise 2
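A sketch of the end-count calculation for this exercise, assuming B is expected to outrank C; ties and the no-overlap case are not handled in this simple version.

```python
def end_count(b, c):
    """Tukey end count, assuming B values are expected to be higher:
    count the unbroken run of B's at the high end and the unbroken
    run of C's at the low end of the combined ranking."""
    combined = sorted([(v, "B") for v in b] + [(v, "C") for v in c],
                      reverse=True)
    high = 0
    for _, label in combined:            # leading B's at the high end
        if label != "B":
            break
        high += 1
    low = 0
    for _, label in reversed(combined):  # trailing C's at the low end
        if label != "C":
            break
        low += 1
    return high + low

c = [95.6, 92.5, 98.5, 94.6, 95.8, 88.3, 94.1, 90.5, 97.5, 94.9, 93.7, 90.0]
b = [96.7, 91.2, 99.2, 98.6, 97.0, 99.4, 93.6, 97.2, 94.5, 93.2, 99.3, 90.4, 98.2]

ec = end_count(b, c)
print(ec)   # 4 B's at the high end + 2 C's at the low end = 6
```

An end count of 6 meets the critical value for α = 0.10 but not for α = 0.05, so the high-oxygen process looks better only at the 10 percent risk level.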
  • 57. 57 Factorial Experiments
  • 58. 58 One-level Factorial Designs  One-level factorial designs are experimental designs in which each of the k factors is set at a single level only for all the test combinations that are conducted.  They are similar to one-way ANOVA. They are very useful for simple comparisons of categorical treatments.
  • 59. 59 General one factor design problem  Three bowlers must compete for the last position in the national team. They bowl six games (see data on the next slide).  The captain knows better than to simply pick the bowler with the highest score. Maybe it’s a fluke that Mark scored highest and Pat’s score is low. He wants to know if the scores are significantly different, given the variability in individual scores.
  • 60. 60 Game Pat Mark Shari 1 160 165 166 2 150 180 158 3 140 170 145 4 167 185 161 5 157 195 151 6 148 175 156
  • 61. 61  The bowling captain does not care if averages differ by less than 10 pins, and his records provide a standard deviation of 5.  Which of the three bowlers should he pick for the national team?  Justify your conclusion using the Design Expert software.
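One way to check the conclusion without specialized software is a quick one-way ANOVA sketch in plain Python; the critical value F(0.05; 2, 15) = 3.68 comes from Table 3.

```python
import statistics

# Slide data: six games per bowler
scores = {
    "Pat":   [160, 150, 140, 167, 157, 148],
    "Mark":  [165, 180, 170, 185, 195, 175],
    "Shari": [166, 158, 145, 161, 151, 156],
}

all_scores = [x for g in scores.values() for x in g]
grand_mean = statistics.mean(all_scores)

# Between-bowler and within-bowler sums of squares
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                 for g in scores.values())
ss_within = sum((x - statistics.mean(g)) ** 2
                for g in scores.values() for x in g)

df_between = len(scores) - 1               # 2
df_within = len(all_scores) - len(scores)  # 15

f0 = (ss_between / df_between) / (ss_within / df_within)
print(f"F = {f0:.2f}, significant: {f0 > 3.68}")
```

F comes out around 12.6, well past 3.68, so the bowlers' averages really do differ; Mark's advantage is not a fluke.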
  • 62. 62 Two-level Factorial Designs  Two Level factorial designs are experimental designs in which  each of the k factors is set at one of two levels (high or low)  all 2^k test combinations of factor levels are conducted
    A 2^2 Full Factorial Design
    Run  A  B
    1    -  +
    2    +  -
    3    -  -
    4    +  +
  • 63. 63 Generalized 2^k Designs
  • 64. 64 Estimation of Main Effects  A main effect (or average main effect) of a factor is the difference between the response (or average of several responses) at the high level of the factor and the response (or average of several responses) at the low level of the factor  (plot: response Y at the - and + levels of the factor, with the main effect shown as the change in Y)
  • 65. 65 Calculation Matrix
    Run  X1  X2  X1X2
    1    -   -   +
    2    +   -   -
    3    -   +   -
    4    +   +   +
    (X1 and X2 form the design matrix; the interaction columns form the calculation matrix)
     List all possible interactions  Multiply the row entries of the respective columns associated with the variables that are listed in the interaction column heading  Calculation of interaction effect estimates is similar to the main effects calculation
  • 66. 66 Graphical Aids for Analysis  (plots: a One Factor Main Effect plot of Y at the - and + factor levels, and a Two Factor Interaction Effect plot of Y at the - and + levels of one factor, drawn separately for X2 (low) and X2 (high))
  • 67. 67 General Nature of Interactions  (interaction plots of response versus factor A at the low and high levels of factor B)  No Interaction: the B-low and B-high lines are parallel  Slight Interaction: the lines have noticeably different slopes  Strong Interaction (reversal): the lines cross  The interaction plot is a way to visualize the inter-dependency between factors  The Calculations are not difficult; the MEANING is critical
  • 68. 68 Steps in Judging Relative Importance of Effects  Estimate experimental error (s)  Determine the standard error of effects  State the precision for an effect as  a confidence interval about the effect estimates  a range of values centered at zero which would have happened by chance
  • 69. 69 Estimating Experimental Error for Location Effects  Replicated experiment: pool the replicate variances, sp² = Σ vi si² / Σ vi, and take the standard error of an effect as se = 2 sp / √N, where N is the total number of runs  Unreplicated experiment: estimate from the higher-order effects, se = √(Σ Ei² / N), or use a graphical estimate from Normal Probability Plotting (NPP); NPP is recommended for designs of 2^4 runs or larger
  • 70. 70 Graphical Assessment of Effects On the normal probability plot, effects that reflect only noise should be  Normally distributed  Centered on a true average of zero  Relatively small in variability
  • 71. 71 Generic First Order Models # of Variables: 2 → y = b0 + b1X1 + b2X2 + b12X1X2; 3 → y = b0 + b1X1 + b2X2 + b3X3 + b12X1X2 + … + b123X1X2X3 Where: b0 = average of all responses; bi = Ei/2 (half the corresponding effect estimate)
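The coefficient rule on this slide (b0 = grand mean, each other coefficient = half its effect estimate) can be sketched in Python for two coded variables; the responses are hypothetical:

```python
# Sketch: first-order model y_hat = b0 + b1*X1 + b2*X2 + b12*X1*X2
# built from effect estimates, per the slide's rule b0 = mean(y), bi = Ei/2.
x1 = [-1,  1, -1,  1]
x2 = [-1, -1,  1,  1]
y  = [10.0, 14.0, 12.0, 20.0]   # hypothetical responses

def effect(signs, y):
    return sum(s * v for s, v in zip(signs, y)) / (len(y) / 2)

b0  = sum(y) / len(y)                                  # grand mean
b1  = effect(x1, y) / 2
b2  = effect(x2, y) / 2
b12 = effect([a * b for a, b in zip(x1, x2)], y) / 2

def predict(v1, v2):
    return b0 + b1 * v1 + b2 * v2 + b12 * v1 * v2

residuals = [yi - predict(v1, v2) for v1, v2, yi in zip(x1, x2, y)]
print(residuals)   # a saturated model reproduces the four runs exactly
```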
  • 72. 72 Model Adequacy Checking  Residual Analysis  Normality Assumption  Construct a normal probability plot of the residuals to check for departure from normality  Plot of Residuals in Time Sequence  Check for positive or negative runs, which imply that the independence assumption on the errors has been violated  Check for change in the error variance over time  Plot of Residuals Versus Fitted Values ŷij  Check for nonconstant variance  Plot of Residuals Versus Other Variables  Check for patterns that might imply that the variable affects the response
  • 73. 73 Effect Estimates with 2^k Designs

# of Variables | Main | 2-Factor | 3-Factor | 4-Factor | 5-Factor | ... | Total Effect Estimates (2^k - 1) | Higher Order (3rd up)
 1 |  1 |    |     |     |     |     |    1 |   0
 2 |  2 |  1 |     |     |     |     |    3 |   0
 3 |  3 |  3 |   1 |     |     |     |    7 |   1
 4 |  4 |  6 |   4 |   1 |     |     |   15 |   5
 5 |  5 | 10 |  10 |   5 |   1 |     |   31 |  16
 6 |  6 | 15 |  20 |  15 |   6 | ... |   63 |  42
 7 |  7 | 21 |  35 |  35 |  21 | ... |  127 |  99
 8 |  8 | 28 |  56 |  70 |  56 | ... |  255 | 219
 9 |  9 | 36 |  84 | 126 | 126 | ... |  511 | 466
10 | 10 | 45 | 120 | 210 | 252 | ... | 1023 | 968

We don’t need to expend resources to estimate these higher-order interactions...
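The counts in this table are binomial coefficients; a short Python sketch (not from the slides) reproduces them:

```python
# Sketch: a 2^k design yields C(k, j) effects of order j, 2^k - 1 effect
# estimates in total, and the "higher order" column is everything beyond
# the main and two-factor effects.
from math import comb

def effect_counts(k):
    counts = [comb(k, j) for j in range(1, k + 1)]   # main, 2-factor, 3-factor, ...
    total = 2 ** k - 1
    higher_order = total - counts[0] - (counts[1] if k >= 2 else 0)
    return counts, total, higher_order

counts, total, higher = effect_counts(10)
print(counts[:2], total, higher)   # matches the k = 10 row of the table
```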
  • 74. 74 Factorial Design Problem  A 2^3 factorial design was employed to study the possible effects of temperature (deg C), time (hr), and stir rate (rpm) on the output (gallons/hr) of a chemical process. Run X1 X2 X3 Y 1 20 1 10 28 2 35 1 10 43 3 20 3 10 30 4 35 3 10 47 5 20 1 30 59 6 35 1 30 88 7 20 3 30 65 8 35 3 30 81
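A worked Python sketch for this problem (the data are the slide's; the helper function is illustrative), estimating the three main effects by differencing the high- and low-level averages:

```python
# Sketch: main effects for the chemical-process 2^3 design above.
runs = [  # (temperature deg C, time hr, stir rate rpm, output gal/hr)
    (20, 1, 10, 28), (35, 1, 10, 43), (20, 3, 10, 30), (35, 3, 10, 47),
    (20, 1, 30, 59), (35, 1, 30, 88), (20, 3, 30, 65), (35, 3, 30, 81),
]

def main_effect(col):
    """Average response at the high level minus average at the low level."""
    hi_level = max(r[col] for r in runs)
    hi = [r[3] for r in runs if r[col] == hi_level]
    lo = [r[3] for r in runs if r[col] != hi_level]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print(main_effect(0), main_effect(1), main_effect(2))
# prints 19.25 1.25 36.25 -- stir rate dominates, time barely matters
```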
  • 75. 75 Popcorn Problem  The owner has developed a new hybrid seed which may or may not use the same cooking process recommended for the current seed. One of the processes customers use to pop corn is the hot oil method, which is the one addressed in this situation.  The problem is to find the process factors that influence popcorn quality characteristics relative to customer requirements. Characteristics such as the number of unpopped kernels in a batch, the fluffiness or volume of the popped corn, the color, the taste, and the crispiness are considered.
  • 76. 76  The objective of the experiment is to find the process conditions that optimize the various quality characteristics to provide improved popping, fluffiness, color, taste, and texture.  The popcorn factors and levels are given in the next slide; use a full factorial design to solve the problem.
  • 77. 77 Factor Level 1 Level 2 A: Type of oil Corn oil Peanut oil B: Amount of oil Low High C: Amount of heat Low High D: Preheat No Yes E: Agitation No Yes F: Venting No Yes G: Pan material Aluminum Steel H: Pan shape Shallow Deep
  • 78. 78 Fractional Factorial Designs  Experimental designs requiring only a fraction of the total number of runs of a full factorial  Always accompanied by some degree of confounding  Denoted by 2^(k-p), where 2^-p = the fraction of the full design k = the number of variables to be investigated p = the number of extra variables introduced into a 2^(k-p) base design 2^(k-p) = the total number of runs required r = the resolution of the design
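A minimal Python sketch (not from the slides) of constructing the smallest fraction, a 2^(3-1) half-fraction, by generating the extra column C = AB (defining relation I = ABC):

```python
# Sketch: build a 2^(3-1) half-fraction from the full 2^2 design in A and B,
# aliasing the third factor with the AB interaction (resolution III).
from itertools import product

base = list(product([-1, 1], repeat=2))           # full 2^2 design in A, B
half_fraction = [(a, b, a * b) for a, b in base]  # generator: C = A*B

print(len(half_fraction))   # 4 runs instead of the 8 a full 2^3 would need
for run in half_fraction:
    print(run)
```

Because C is identical to the AB column, its main effect is confounded with the AB interaction, which is exactly the "degree of confounding" the slide warns about.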
  • 79. 79 Resolution of Fractional Factorial Designs

Resolution | Sum of Orders of Effects Confounded | Description of Confounding
I   | —                   | Main effects non-estimable
II  | 1 + 1               | Main effects confounded with main effects
III | 1 + 2               | Main effects confounded with two-factor interactions
IV  | 1 + 3, 2 + 2        | Main effects confounded with three-factor interactions; two-factor interactions confounded with two-factor interactions
V   | 1 + 4, 2 + 3        | No confounding of main effects or two-factor interactions with each other
VI  | 1 + 5, 2 + 4, 3 + 3 | Greater clarity of effects
  • 80. 80 Exercise Run X1 X2 X3 X4 Y 1 33 2 0.03 1.9 33.3 2 39 2 0.03 1.9 40.9 3 33 4 0.03 1.9 39.4 4 39 4 0.03 1.9 47.2 5 33 2 0.09 1.9 28.1 6 39 2 0.09 1.9 31.1 7 33 4 0.09 1.9 34.9 8 39 4 0.09 1.9 41 9 33 2 0.03 3.5 31.7 10 39 2 0.03 3.5 41.6 11 33 4 0.03 3.5 37.6 12 39 4 0.03 3.5 44.6 13 33 2 0.09 3.5 28.3 14 39 2 0.09 3.5 32.2 15 33 4 0.09 3.5 33.6 16 39 4 0.09 3.5 39.3 What are the estimates of all main and interaction effects? Which effects are significant? Write down the predictive model and calculate residuals. Perform a model adequacy check.
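A first-pass Python sketch for this exercise (the data are the slide's; helper names are illustrative): code each factor column to -1/+1 and estimate the four main effects. Interaction effects follow the same pattern using products of the coded columns.

```python
# Sketch: code the 2^4 exercise data and estimate the main effects.
runs = [  # (X1, X2, X3, X4, Y)
    (33, 2, 0.03, 1.9, 33.3), (39, 2, 0.03, 1.9, 40.9),
    (33, 4, 0.03, 1.9, 39.4), (39, 4, 0.03, 1.9, 47.2),
    (33, 2, 0.09, 1.9, 28.1), (39, 2, 0.09, 1.9, 31.1),
    (33, 4, 0.09, 1.9, 34.9), (39, 4, 0.09, 1.9, 41.0),
    (33, 2, 0.03, 3.5, 31.7), (39, 2, 0.03, 3.5, 41.6),
    (33, 4, 0.03, 3.5, 37.6), (39, 4, 0.03, 3.5, 44.6),
    (33, 2, 0.09, 3.5, 28.3), (39, 2, 0.09, 3.5, 32.2),
    (33, 4, 0.09, 3.5, 33.6), (39, 4, 0.09, 3.5, 39.3),
]

def coded(col):
    """Map a factor column to coded levels: +1 at its high setting, -1 at its low."""
    hi = max(r[col] for r in runs)
    return [1 if r[col] == hi else -1 for r in runs]

def effect(signs):
    return sum(s * r[4] for s, r in zip(signs, runs)) / (len(runs) / 2)

for col, name in enumerate(["X1", "X2", "X3", "X4"]):
    print(name, round(effect(coded(col)), 3))
# main effects: X1 6.375, X2 6.3, X3 -5.975, X4 -0.875
```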
  • 81. 81 Biker problem  Lance Legstrong has one week to fine-tune his bicycle before the early-bird Spring meet. He decides to test seven factors in only 8 runs around the quarter-mile track. The design is “saturated” with factors, that is, no more factors can be added for the given number of runs.  The data for the given problem is given in the next slide.
  • 82. 82 Factor Level 1 Level 2 Seat Up Down Tires (psi) 40 50 Handle bars Up Down Helmet Brand Atlas Windy Gear High Low Wheelcovers On Off Generator Off On
  • 83. 83  The times for the ¼-mile track for the 8 runs are given below Run Time, secs 1 77 2 74 3 82 4 47 5 72 6 77 7 48 8 83
  • 84. 84 Some Guidelines  DOE is a proactive tool. There is no such thing as a bad experiment - only poorly designed and executed ones.  Not every experiment will produce earth-shattering discoveries: − Something will always be learned. − New data prompts asking new questions and generates follow-on studies.  The best time to design an experiment is after the previous one is finished.  Don’t try to answer all the questions in one study. Rely on a sequence of studies.  Use two-level designs early.  Spend less than 25% of the budget on the first experiment.  Always verify results in a capping run.  Be ready for changes.  A final report is a requirement.  90% planning and 10% doing experiments ... Dr. Montgomery
  • 85. 85 FINAL NOTE  “When your television set misbehaves, you may discover that a kick in the right place fixes the problem at least temporarily. However, for a long-term solution the reason for the fault must be discovered. In general, problems can be fixed, or they may be solved. Experimental design catalyzes both fixing and solving.” -- Box, Hunter, and Hunter, “Statistics for Experimenters”, 2nd edition, 2005, John Wiley & Sons, Inc.