A basic introduction to Design of Experiments (DOE) explaining how it works, the benefits of using it, and how to use a common DOE application (Minitab)
This document provides an overview of how to conduct a design of experiments (DOE). It explains that a DOE tests multiple factors at once to reduce the number of tests compared to testing each factor individually. A factor is an input that can change the output when varied. Different types of test arrays are described, including full factorial arrays and fractional arrays. It is important to select the appropriate array based on the number of factors. The document also discusses analyzing the results using ANOVA to determine significant factors and interactions between factors. It emphasizes the importance of conducting a confirmation run to validate the results of the DOE.
Six Sigma Process Capability Study (PCS) Training Module Frank-G. Adler
The Process Capability Study (PCS) Training Module v3.0 includes:
1. MS PowerPoint Presentation including 98 slides covering Introduction to Six Sigma, Creating and analyzing a Histogram, Basic Statistics & Product Capability, Statistical Process Control for Variable Data, Definitions of Process Capability Indices, Confidence Interval Analysis for Capability Indices, Capability Study for Non-Normal Distributed Processes, and several Exercises.
2. MS Excel Confidence Interval Analysis Calculator making it really easy to calculate Confidence Intervals for Capability Indices and other Statistics.
This was presented at an ASQLA Section 700 monthly meeting in 2012.
This covers the basics of SPC and some of the prerequisites that need to be in place before SPC can be used effectively, such as a proper Gage R&R evaluation, properly derived specifications, and characterization of the process using Design of Experiments. Also covered are the main cultural barriers to implementation and some suggestions on how to proceed.
Also shown are some advanced charting methods, such as Delta from Target, which allow easier use of SPC by shop floor personnel and maintain the date/time sequence of product measurements when multiple products are run on a single machine.
Introduction & Basics of DoE
Terminologies
Key steps in DOE
Software used for DOE
Factorial Designs (Full and Fractional)
Mixture Designs
Response Surface Methodology
Central Composite Design
Box-Behnken Design
Conclusion
References
Experiments
A Quick History of Design of Experiments
Why We Use Experimental Designs
What is Design of Experiment
How Design of Experiment contributes
Terminology
Analysis of Variance (ANOVA)
Basic Principle of Design of Experiments
Some Experimental Designs
Here are the key IR frequencies identified in the sample that match the reference standard of propafenone:
- C-C stretch at 1186 cm⁻¹
- C=C stretch at 1651 cm⁻¹
- C-H stretch (symmetric) at 2939 cm⁻¹
- C-H bend at 1328 cm⁻¹
- CH₂ stretch (symmetric) at 1369 cm⁻¹
- CH₂ bend at 1485 cm⁻¹
- CH₃ bend at 1398 cm⁻¹
- C-O stretch at 1100 cm⁻¹
- C=O stretch at 1695 cm⁻¹
- N-H stretch at 3417 cm⁻¹
The IR spectrum
This document discusses quality by design (QbD) approaches for biopharmaceutical development. QbD focuses on designing quality into the product and process based on an understanding of critical quality attributes and critical process parameters. Key aspects of QbD include identifying critical attributes and parameters, using tools like design of experiments to understand their impact, defining a design space, and ensuring robustness through continuous monitoring and improvement. Statistical tools and multidisciplinary teams are important for successful QbD implementation.
Stability Indicating HPLC Method Development: A Review - ijtsrd
High performance liquid chromatography (HPLC) is an essential analytical tool for evaluating drug stability. HPLC methods must be able to isolate, detect, and quantify drug-related degradation products that may form during storage or production, and identify drug-related impurities that may form during synthesis. This article describes strategies and challenges for designing HPLC methods to demonstrate drug stability, aiming to deepen understanding of drugs and medicinal chemistry and to show advances in stability assessment from an analytical perspective. Several important chromatographic parameters were investigated to improve the detection of potentially related degradants. Suitable solvents and mobile phases must be found that provide sufficient stability and compatibility with each component and with potential impurities and degradants. The method must be able to distinguish between primary and secondary degradation products. Forced degradation studies of chemicals and new drugs are essential for the development and characterization of these stability-indicating methods. Practical guidance is provided at each stage of drug development for developing a forced degradation protocol and avoiding common issues that might impede data interpretation. Suraj Nagwanshi | Smita Aher | Rishikesh Bachhav, "Stability Indicating HPLC Method Development - A Review", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5, Issue-5, August 2021, URL: https://www.ijtsrd.com/papers/ijtsrd46310.pdf Paper URL: https://www.ijtsrd.com/pharmacy/other/46310/stability-indicating-hplc-method-development--a-review/suraj-nagwanshi
This document provides an introduction to statistical significance testing. It discusses why significance tests are used, how they work, key terminology like p-values and hypotheses, and examples of one-sample and two-sample significance tests for means, proportions, and categorical data. Specific tests covered include the z-test, t-test, and chi-square test. The goal of significance testing is to determine whether observed differences in sample data could plausibly be due to chance or represent real effects in the underlying population.
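The mechanics of one of the tests mentioned can be sketched in a few lines. A minimal, hypothetical example of a two-sided one-sample z-test (population standard deviation assumed known; the sample values are made up for illustration):

```python
import math

def one_sample_z_test(sample_mean, mu0, sigma, n):
    """Two-sided one-sample z-test; sigma is the known population std dev."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    # Standard normal CDF via the error function
    phi = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))
    p_value = 2 * (1 - phi)
    return z, p_value

# Hypothetical sample: mean 103 from n=36 observations, H0: mu = 100, sigma = 10
z, p = one_sample_z_test(sample_mean=103.0, mu0=100.0, sigma=10.0, n=36)
print(round(z, 2), round(p, 4))   # → 1.8 0.0719
```

At the usual 0.05 level this p-value would not reject the null hypothesis, illustrating how an apparent difference can plausibly be due to chance.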
- Response surface methodology (RSM) is used to optimize processes with multiple variables to maximize or minimize a response. It uses experimental design and regression analysis.
- The method of steepest ascent is used to sequentially move from an initial guess towards the optimum region using a first-order model. Additional experiments are conducted to fit higher-order models closer to the optimum.
- A second-order model that includes interaction and quadratic terms can identify if the stationary point is a maximum, minimum, or saddle point. Canonical analysis of the eigenvalues further characterizes the stationary point.
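The stationary-point analysis of a second-order model can be sketched numerically: write the fitted model as y = b0 + x'b + x'Bx, solve for the stationary point, and classify it from the eigenvalues of B. All coefficients below are hypothetical, not taken from the source:

```python
import numpy as np

# Hypothetical fitted second-order model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
b0, b1, b2 = 79.9, 0.99, 0.52
b11, b22, b12 = -1.38, -1.00, 0.25

b = np.array([b1, b2])
B = np.array([[b11, b12 / 2],
              [b12 / 2, b22]])          # symmetric quadratic-form matrix

# Stationary point: x_s = -0.5 * B^{-1} b
x_s = -0.5 * np.linalg.solve(B, b)

# Eigenvalues of B classify the stationary point (canonical analysis)
eigvals = np.linalg.eigvalsh(B)
if np.all(eigvals < 0):
    kind = "maximum"
elif np.all(eigvals > 0):
    kind = "minimum"
else:
    kind = "saddle point"
print(x_s, kind)
```

With both eigenvalues negative, the response surface has a maximum at the stationary point, which is the case a yield-optimization study typically hopes for.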
This document discusses statistical process control (SPC), which uses statistical methods to monitor and control processes to improve quality. SPC aims to ensure processes operate efficiently and produce specification-conforming products with less waste. Key SPC tools include control charts, histograms, cause-and-effect diagrams and check sheets. Control charts in particular plot process data over time to identify changes or variability. SPC provides benefits like reduced waste, lower costs, improved customer satisfaction and early problem detection and prevention.
Design Of Experiments (DOE) Applied To Pharmaceutical and Analytical QbD - SALMA RASHID SHAIKH
According to ICH Q8, quality should be built into the product.
Design of Experiments (DoE) generates knowledge about a product or process and establishes a mathematical relationship between dependent and independent variables.
The most common screening designs are two-level full factorial, fractional factorial, and Plackett-Burman designs.
Common optimization designs are three-level full factorial, central composite designs (CCD), and Box-Behnken designs.
Analysis of variance (ANOVA) is used in multiple regression analysis to evaluate regression significance, residual error, and lack of fit.
The coefficients of determination (R², R²-adj, and R²-pred) are also evaluated.
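How R² and adjusted R² fall out of a least-squares fit can be sketched in a few lines. The design matrix and responses below are hypothetical (a 2² design with two centre points):

```python
import numpy as np

# Hypothetical 2^2 design with centre points; column of ones is the intercept
X = np.array([[1, -1, -1],
              [1,  1, -1],
              [1, -1,  1],
              [1,  1,  1],
              [1,  0,  0],
              [1,  0,  0]], dtype=float)
y = np.array([45.0, 55.0, 50.0, 62.0, 53.0, 52.0])

# Least-squares regression fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

ss_res = float(np.sum((y - y_hat) ** 2))      # residual sum of squares
ss_tot = float(np.sum((y - y.mean()) ** 2))   # total sum of squares
r2 = 1 - ss_res / ss_tot

n, p = X.shape                                # p counts the intercept too
r2_adj = 1 - (1 - r2) * (n - 1) / (n - p)
print(round(r2, 3), round(r2_adj, 3))         # → 0.988 0.981
```

Adjusted R² penalizes extra model terms, which is why it is the quantity usually reported alongside plain R² when comparing candidate models.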
Quality By Design:
QbD is “a systematic approach to pharmaceutical development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management”
Goals Of Pharmaceutical QbD:
To achieve meaningful product quality specifications
To increase process capability and reduce product variability
To increase pharmaceutical development and manufacturing efficiencies and
To enhance cause-effect analysis and regulatory flexibility.
Technology Transfer From R and D to Pilot Plant to Plant for Non-Sterile Semi... - shiv
Technology transfer is important for successful progress of drug development from research to commercialization. This presentation discusses technology transfer from research and development to pilot plant and full-scale production of non-sterile semisolids. It covers the importance of pilot plants in scale-up, factors to consider in scaling up semisolids like mixing equipment and homogenization processes. SUPAC guidelines for scale-up and post-approval changes are also summarized. A case study demonstrates issues encountered like congealing during scale-up of a cream formulation containing diethylene glycol monoethyl ether from lab to pilot scale.
DESIGN OF EXPERIMENTS (DOE)
DOE was invented by Sir Ronald Fisher in the 1920s and 1930s.
The following designs of experiments are commonly used:
Completely randomised design (CRD)
Randomised complete block design (RCBD)
Latin square design (LSD)
Factorial design or experiment
Confounding
Split and strip plot design
FACTORIAL DESIGN
When several factors are investigated simultaneously in a single experiment, the experiment is known as a factorial experiment. A factorial experiment is not itself an experimental design; indeed, any of the designs above may be used for factorial experiments.
For example, the yield of a product depends on the particular type of synthetic substance used and also on the type of chemical used.
ADVANTAGES OF FACTORIAL DESIGN.
Factorial experiments make it possible to study the combined effect of two or more factors simultaneously and to analyse their interrelationships. Such experiments are economical and provide a great deal of relevant information about the phenomenon under study. They also increase the efficiency of the experiment.
Because a wide range of factor combinations is used, factorial experiments allow us to predict what will happen when two or more factors are used in combination.
DISADVANTAGES
The execution of the experiment and the statistical analysis become more complex when several treatment combinations or factors are involved simultaneously.
Factorial designs are also disadvantageous when we are not interested in certain treatment combinations but are forced to include them in the experiment. This leads to wastage of time and experimental material.
2² FACTORIAL EXPERIMENT
A special set of factorial experiments consists of experiments in which all factors have 2 levels; such experiments are generally referred to as 2ⁿ factorials.
If there are four factors, each at two levels, the experiment is known as a 2×2×2×2 or 2⁴ factorial experiment. On the other hand, if there are 2 factors, each with 3 levels, the experiment is known as a 3×3 or 3² factorial experiment. In general, if there are n factors, each with p levels, it is known as a pⁿ factorial experiment.
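These run counts follow directly from enumerating every combination of factor levels. A small sketch:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """All runs of a p^n (or mixed-level) full factorial design."""
    return list(product(*levels_per_factor))

# 2^4 design: four factors, each at coded levels -1 (low) and +1 (high)
runs_2_4 = full_factorial([[-1, 1]] * 4)
# 3^2 design: two factors, each at three levels
runs_3_2 = full_factorial([[-1, 0, 1]] * 2)
print(len(runs_2_4), len(runs_3_2))   # → 16 9
```

The run count grows as pⁿ, which is exactly why fractional designs become attractive as the number of factors increases.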
The calculation of the sums of squares is as follows:
Correction factor: CF = (GT)²/n
where GT = grand total and n = total number of observations.
Total sum of squares: TSS = Σx² − CF
Replication sum of squares: RSS = ((R1)² + (R2)² + … + (Rn)²)/n − CF, i.e.
RSS = (1/n)ΣR² − CF
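The formulas above can be computed directly. A minimal sketch with hypothetical yields from a 2² experiment run in two replicates (here the divisor for the replication sum of squares is the number of observations in each replicate):

```python
def correction_factor(obs):
    """CF = (grand total)^2 / (total number of observations)."""
    gt = sum(obs)
    return gt ** 2 / len(obs)

def total_ss(obs):
    """Total sum of squares: sum of x^2 minus CF."""
    return sum(x ** 2 for x in obs) - correction_factor(obs)

def replication_ss(replicate_totals, runs_per_replicate, cf):
    """Replication sum of squares from the replicate totals R1, R2, ..."""
    return sum(r ** 2 for r in replicate_totals) / runs_per_replicate - cf

# Hypothetical yields, two replicates of a 2^2 experiment
rep1 = [20.0, 30.0, 25.0, 35.0]
rep2 = [22.0, 28.0, 27.0, 33.0]
obs = rep1 + rep2

cf = correction_factor(obs)
print(round(cf, 2), round(total_ss(obs), 2),
      round(replication_ss([sum(rep1), sum(rep2)], len(rep1), cf), 2))
# → 6050.0 186.0 0.0
```

In this made-up data both replicates total 110, so the replication sum of squares is zero; all of the total variation is attributable to treatments and residual error.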
2×3 FACTORIAL DESIGN
In this type of design, one independent variable has 2 levels and the other independent variable has 3 levels.
Estimating the effect:
In a factorial design the main effect of an independent variable is its overall effect averaged across all levels of the other independent variables.
The effect of a factor A is the average of the runs where A is at the high level minus the average of the runs where A is at the low level.
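This definition translates directly into code. A sketch with hypothetical responses from a 2² design:

```python
# 2^2 factorial: coded levels of factors A and B, plus the observed response
runs = [
    # (A, B, response) — hypothetical yields
    (-1, -1, 20.0),
    ( 1, -1, 30.0),
    (-1,  1, 25.0),
    ( 1,  1, 35.0),
]

def main_effect(runs, factor_index):
    """Average response at the high level minus average at the low level."""
    high = [y for *levels, y in runs if levels[factor_index] == 1]
    low  = [y for *levels, y in runs if levels[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

effect_A = main_effect(runs, 0)   # (30+35)/2 - (20+25)/2 = 10.0
effect_B = main_effect(runs, 1)   # (25+35)/2 - (20+30)/2 = 5.0
print(effect_A, effect_B)         # → 10.0 5.0
```

In these made-up numbers factor A has twice the main effect of factor B; an ANOVA would then be used to judge whether either effect is statistically significant.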
The document describes using evolutionary operation (EVOP) to optimize biodiesel production yield through incremental changes to the methanol-to-oil ratio and sodium hydroxide catalyst concentration. It details conducting a first phase, cycle 1 experiment using a 2² factorial design at current operating conditions and high/low settings for each factor. Initial yields average 88.8%, with the maximum obtained at a methanol ratio of 0.34 and catalyst concentration of 0.95%.
The document discusses optimization in pharmaceutical formulation and processing. It defines optimization as choosing the best alternative from available options. Optimization in pharmacy involves formulating drug products using the best combination of ingredients and processing parameters. Experimental design techniques are used to optimize multiple variables. Response surface methodology and central composite designs are commonly used to model quadratic relationships between variables. The document outlines different types of experimental designs and their applications in pharmaceutical optimization.
Six Sigma - A Presentation by Akshay Anand
Six Sigma is a quality management system that aims for near perfection. It uses statistical methods and process improvement tools to identify and remove defects. There are two main methodologies - DMAIC which improves existing processes, and DMADV which designs new processes. DMAIC involves defining a problem, measuring metrics, analyzing causes, improving the process, and controlling future performance. DMADV defines goals, measures customer needs, analyzes design options, designs an improved process, and verifies it meets requirements. Many major companies use Six Sigma to reduce costs and errors through projects led by Green Belts and Black Belts trained in its methods.
Introduction to Design of Experiments by Teck Nam Ang (University of Malaya)
This set of slides explains in a simple manner the purpose of experiment, various strategies of experiment, how to plan and design experiment, and the handling of experimental data.
This presentation discusses statistical control charts which are tools used in pharmaceutical manufacturing to determine if a process is in statistical control. It defines control charts and explains that they provide a visual representation to monitor a process and identify instances where the process may be going out of control. The presentation covers the objectives, principles, types of control charts including variable and attribute charts, their characteristics and benefits such as improving quality, productivity and reducing defects. It also discusses using control charts to evaluate process capabilities.
This document discusses process capability analysis, which relates a production process's variability to customer specifications to determine if the process is capable of meeting requirements. It defines key terms like critical-to-quality characteristics, control charts, process capability indices Cp and Cpk. Cp measures a process's potential capability if centered on target, while Cpk considers deviation of the mean. For a process to be capable, its natural variation (control limits) must be narrower than specifications. If Cpk=1 the process is barely capable, and if Cpk<1 the process is incapable and requires improvement. Process capability analysis assumes an in-control, stable production process.
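The Cp and Cpk definitions above can be sketched as follows (process values and specification limits here are hypothetical):

```python
def cp_cpk(mean, std, lsl, usl):
    """Process capability indices from process mean/std and spec limits."""
    cp = (usl - lsl) / (6 * std)                    # potential capability, assumes centering
    cpk = min(usl - mean, mean - lsl) / (3 * std)   # penalizes an off-center mean
    return cp, cpk

# Hypothetical process: mean 10.2, std 0.1, specs 9.7 to 10.3
cp, cpk = cp_cpk(mean=10.2, std=0.1, lsl=9.7, usl=10.3)
print(round(cp, 2), round(cpk, 2))   # → 1.0 0.33
```

This illustrates the distinction the summary draws: the process has the potential to be barely capable (Cp = 1), but because its mean sits near the upper limit, Cpk < 1 and improvement is required.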
This document discusses deviations and out-of-specification/out-of-trend results in the pharmaceutical industry. It defines deviations as unwanted events that differ from approved processes or standards. Deviations are classified as major, critical, or minor depending on their impact. Out-of-specification results occur when test results fall outside predetermined specifications, requiring investigation. Out-of-trend results differ from historical results but are still within specifications, also necessitating investigation. The document provides examples of planned and unplanned deviations as well as approaches to minimize out-of-specification results through good practices.
The drug or drug combination may not be official in any pharmacopoeias.
A proper analytical procedure for the drug may not be available in the literature due to patent regulations.
Analytical methods may not be available for the drug in the form of a formulation due to the interference caused by the formulation excipients.
Analytical methods for the quantitation of the drug in biological fluids may not be available.
Analytical methods for a drug in combination with other drugs may not be available.
The existing analytical procedures may require expensive reagents and solvents. It may also involve cumbersome extraction and separation procedures and these may not be reliable.
The document discusses validation of analytical procedures. It defines validation as establishing by laboratory studies that an analytical procedure meets requirements for its intended use. It describes the typical steps in validating identification tests, quantitative impurity tests, and assays. Key validation characteristics discussed include specificity, linearity, range, accuracy, precision, detection and quantitation limits, robustness, and ruggedness. The guidelines provide details on establishing each characteristic to help ensure analytical methods are suitable for their intended pharmaceutical applications.
This document provides an overview of the methodology used in clinical trials. It defines key terms like randomized controlled trials, control groups, randomization, and blinding. It describes the various phases of clinical trials including phases 1-3 and post-marketing studies. Phase 1 trials test safety in healthy volunteers while phases 2 and 3 test efficacy in larger patient populations. The goals of each phase are explained as well as important demographic information. The document also outlines the drug development process from preclinical research through regulatory approval and commercialization.
Experimental methods are widely used in industrial settings and in research. In industrial settings, the main goal is to extract the maximum amount of unbiased information about the factors affecting a production process from a few observations, whereas in research, ANOVA techniques are used to draw valid conclusions from the data. Drawing inferences from experimental results is an important step in the design process of a product. Therefore, proper planning of the experimentation is the precondition for accurate conclusions drawn from the experimental findings. Design of experiments is a powerful statistical tool introduced by R.A. Fisher in England in the early 1920s to study the effect of different parameters on the mean and variance of a process performance characteristic.
Taguchi's orthogonal arrays are highly fractional orthogonal designs. These designs can be used to estimate main effects using only a few experimental runs.
Consider the L4 array shown in the next Figure. The L4 array is denoted as L4(2^3).
L4 means the array requires 4 runs; 2^3 indicates that the design estimates up to three main effects, each at 2 levels. The L4 array can be used to estimate three main effects in four runs, provided that the two-factor and three-factor interactions can be ignored.
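The L4(2^3) array and its use can be sketched directly; the responses below are hypothetical, and the orthogonality check confirms that every pair of columns is balanced:

```python
# Taguchi L4(2^3) orthogonal array, levels coded -1/+1
L4 = [
    (-1, -1, -1),
    (-1,  1,  1),
    ( 1, -1,  1),
    ( 1,  1, -1),
]

# Orthogonality check: every pair of columns has zero dot product
for i in range(3):
    for j in range(i + 1, 3):
        assert sum(row[i] * row[j] for row in L4) == 0

# Main effects from hypothetical responses, one per run
y = [12.0, 15.0, 20.0, 17.0]
effects = []
for col in range(3):
    high = [y[r] for r in range(4) if L4[r][col] == 1]
    low  = [y[r] for r in range(4) if L4[r][col] == -1]
    effects.append(sum(high) / 2 - sum(low) / 2)
print(effects)   # → [5.0, 0.0, 3.0]
```

Four runs yield three main-effect estimates, which is the economy Taguchi arrays offer; the price, as noted above, is that interactions are confounded with the main effects.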
This document provides information about a case study on the blending process for a pharmaceutical formulation. It includes:
1) Details of the formulation and factors being studied (mixing time, magnesium stearate concentration, and talc concentration) to evaluate blend uniformity.
2) Descriptions of key concepts for experimental design including treatments, experimental units, responses, and interactions.
3) Discussion of blocking as a technique to reduce nuisance factors like different batches of active ingredients being studied.
The document discusses design of experiments (DOE) and provides details about:
1) DOE is a process optimization technique that relies on planned experimentation and statistical analysis to study multiple factors and their interactions.
2) Traditional experimentation methods study one factor at a time and ignore interactions, while DOE allows studying multiple factors and interactions using fewer experiments.
3) Steps for DOE include defining objectives, factors, responses, levels, and designing the experiment using full or fractional factorial designs such as orthogonal arrays.
2003 work climate and organizational effectiveness - the application of data en... - Henry Sumampau
This document discusses using data envelopment analysis (DEA) to measure organizational effectiveness when multiple inputs and outputs are involved. DEA calculates a single efficiency measure for each organization being analyzed without requiring the researcher to assign subjective weightings to inputs and outputs. The document argues that previous research on the relationship between organizational climate and effectiveness has been limited by analyzing multiple separate measures of effectiveness. It suggests that using DEA to generate a single efficiency measure for each organization can help overcome these limitations and provide a more comprehensive analysis of the climate-effectiveness relationship when multiple dimensions of effectiveness are involved. An example application of DEA to measure the relative efficiencies of branch offices in a retail banking network is described.
The document discusses various location strategy considerations for operations management. It covers factors that affect location decisions such as labor productivity, exchange rates, political risks, and proximity to markets/suppliers. Methods for evaluating location alternatives are described, including the factor-rating method, locational break-even analysis, and center-of-gravity method. Specific location strategies for different industries like hotels, call centers, and how companies use geographic information systems are also summarized.
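Of the evaluation methods listed, the center-of-gravity method is the most mechanical: it places a facility at the volume-weighted centroid of the demand points. A minimal sketch with hypothetical coordinates and shipping volumes:

```python
def center_of_gravity(locations):
    """Weighted centroid of (x, y, volume) demand points."""
    total = sum(w for _, _, w in locations)
    cx = sum(x * w for x, _, w in locations) / total
    cy = sum(y * w for _, y, w in locations) / total
    return cx, cy

# Hypothetical demand points: (map x, map y, shipping volume)
sites = [(2.0, 3.0, 100), (8.0, 1.0, 300), (5.0, 9.0, 100)]
cx, cy = center_of_gravity(sites)
print(round(cx, 2), round(cy, 2))   # → 6.2 3.0
```

Note how the result is pulled toward the high-volume point at (8.0, 1.0); the method weights distance by shipment volume rather than treating all demand points equally.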
This document provides an introduction to statistical significance testing. It discusses why significance tests are used, how they work, key terminology like p-values and hypotheses, and examples of one-sample and two-sample significance tests for means, proportions, and categorical data. Specific tests covered include the z-test, t-test, and chi-square test. The goal of significance testing is to determine whether observed differences in sample data could plausibly be due to chance or represent real effects in the underlying population.
- Response surface methodology (RSM) is used to optimize processes with multiple variables to maximize or minimize a response. It uses experimental design and regression analysis.
- The method of steepest ascent is used to sequentially move from an initial guess towards the optimum region using a first-order model. Additional experiments are conducted to fit higher-order models closer to the optimum.
- A second-order model that includes interaction and quadratic terms can identify if the stationary point is a maximum, minimum, or saddle point. Canonical analysis of the eigenvalues further characterizes the stationary point.
This document discusses statistical process control (SPC), which uses statistical methods to monitor and control processes to improve quality. SPC aims to ensure processes operate efficiently and produce specification-conforming products with less waste. Key SPC tools include control charts, histograms, cause-and-effect diagrams and check sheets. Control charts in particular plot process data over time to identify changes or variability. SPC provides benefits like reduced waste, lower costs, improved customer satisfaction and early problem detection and prevention.
Design Of Experiments (DOE) Applied To Pharmaceutical and Analytical QbD.SALMA RASHID SHAIKH
According to ICH Q8 Quality should be built into the product.
Design of Experiments (DoE) generate knowledge about a product or process and established a Mathematical relationship of dependent variables and independent variables.
The most common screening designs, such as two-level full factorial, fractionate factorial, and Plackett- Burman designs.
Optimization designs, such as three-level full factorial, central composite designs (CCD), and Box-Behnken designs.
Analysis of variance (ANOVA) used in multiple regression analysis to evaluate regression significance, residual error, and lack-of-fit adjustment.
Determination coefficients (R2, R2 -adj, and R2 -pred) is also evaluated.
Quality By Design:
QbD is “a systematic approach to pharmaceutical development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management”
Goals Of Pharmaceutical QbD:
To achieve meaningful product quality specifications
To increase process capability and reduce product variability
To increase pharmaceutical development and manufacturing efficiencies and
To enhance cause-effect analysis and regulatory flexibility.
Technology Transfer From R and D to Pilot Plant to Plant for Non-Sterile Semi...shiv
Technology transfer is important for successful progress of drug development from research to commercialization. This presentation discusses technology transfer from research and development to pilot plant and full-scale production of non-sterile semisolids. It covers the importance of pilot plants in scale-up, factors to consider in scaling up semisolids like mixing equipment and homogenization processes. SUPAC guidelines for scale-up and post-approval changes are also summarized. A case study demonstrates issues encountered like congealing during scale-up of a cream formulation containing diethylene glycol monoethyl ether from lab to pilot scale.
DESIGN OF EXPERIMENTS (DOE)
DOE is invented by Sir Ronald Fisher in 1920’s and 1930’s.
The following designs of experiments will be usually followed:
Completely randomised design(CRD)
Randomised complete block design(RCBD)
Latin square design(LSD)
Factorial design or experiment
Confounding
Split and strip plot design
FACTORIAL DESIGN
When a several factors are investigated simultaneously in a single experiment such experiments are known as factorial experiments. Though it is not an experimental design, indeed any of the designs may be used for factorial experiments.
For example, the yield of a product depends on the particular type of synthetic substance used and also on the type of chemical used.
ADVANTAGES OF FACTORIAL DESIGN.
Factorial experiments are advantageous for studying the combined effect of two or more factors simultaneously and analyzing their interrelationships. Such experiments are economical and provide a great deal of relevant information about the phenomenon under study. They also increase the efficiency of the experiment.
A further advantage is that a wide range of factor combinations is used, which gives an idea of what will happen when two or more factors are used in combination.
DISADVANTAGES
It is disadvantageous because the execution of the experiment and the statistical analysis become more complex when several treatment combinations or factors are involved simultaneously.
It is also disadvantageous in cases where we may not be interested in certain treatment combinations but are forced to include them in the experiment. This leads to wastage of time and experimental material.
2^n FACTORIAL EXPERIMENT
A special set of factorial experiments consists of experiments in which all factors have 2 levels; such experiments are referred to generally as 2^n factorials.
If there are four factors each at two levels, the experiment is known as a 2x2x2x2 or 2^4 factorial experiment. On the other hand, if there are 2 factors each with 3 levels, the experiment is known as a 3x3 or 3^2 factorial experiment. In general, if there are n factors each with p levels, it is known as a p^n factorial experiment.
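The run count of a p^n design can be enumerated directly. A minimal sketch of this, where the factor names and level labels are placeholders chosen for illustration:

```python
from itertools import product

# Enumerating the runs of a full factorial design with itertools.
# A 2^4 design (four factors, two levels each) always has
# 2**4 = 16 treatment combinations.
factors = {
    "A": ["low", "high"],
    "B": ["low", "high"],
    "C": ["low", "high"],
    "D": ["low", "high"],
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 16
```

Swapping in three levels for any factor changes the count accordingly, e.g. two 3-level factors give 3**2 = 9 runs for a 3^2 experiment.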
The calculation of the sum of squares is as follows:
Correction factor: CF = (GT)^2 / n
where GT = grand total and n = total number of observations.
Total sum of squares: TSS = Σx^2 − CF
Replication sum of squares: RSS = (R1^2 + R2^2 + … + Rn^2)/n − CF, i.e. RSS = (1/n) ΣR^2 − CF
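A worked illustration of these formulas, using made-up yield data (the values and the 2-replication layout are assumptions for the sketch; as is conventional, each replication total is squared and the sum divided by the number of observations per replication):

```python
# Worked example of the sums-of-squares formulas above, using made-up
# yield data: 2 replications of 4 treatment combinations (8 observations).
data = [
    [45.0, 52.0, 48.0, 55.0],  # replication 1, total 200
    [47.0, 50.0, 49.0, 53.0],  # replication 2, total 199
]

observations = [x for rep in data for x in rep]
n = len(observations)             # total number of observations
grand_total = sum(observations)   # GT

cf = grand_total ** 2 / n         # correction factor, CF = (GT)^2 / n
total_ss = sum(x ** 2 for x in observations) - cf  # TSS = sum(x^2) - CF

# Replication sum of squares: squared replication totals, divided by the
# number of observations per replication, minus CF.
obs_per_rep = len(data[0])
rss = sum(sum(rep) ** 2 for rep in data) / obs_per_rep - cf

print(cf, total_ss, rss)  # 19900.125 76.875 0.125
```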
2^3 FACTORIAL DESIGN
In this type of design, there are three independent variables (factors), each at 2 levels, giving 2^3 = 8 treatment combinations.
Estimating the effect:
In a factorial design, the main effect of an independent variable is its overall effect averaged across all levels of the other independent variables.
The effect of a factor A is the average of the runs where A is at the high level minus the average of the runs where A is at the low level.
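This high-minus-low averaging can be sketched for a 2^2 design; the response values below are invented purely for illustration:

```python
# Main-effect calculation for a 2^2 design: factors A and B coded
# -1 (low) / +1 (high), with one made-up response y per run.
runs = [
    {"A": -1, "B": -1, "y": 20.0},
    {"A": +1, "B": -1, "y": 30.0},
    {"A": -1, "B": +1, "y": 25.0},
    {"A": +1, "B": +1, "y": 37.0},
]

def main_effect(factor):
    # average of runs at the high level minus average at the low level
    high = [r["y"] for r in runs if r[factor] == +1]
    low = [r["y"] for r in runs if r[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

print(main_effect("A"))  # (30 + 37)/2 - (20 + 25)/2 = 11.0
```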
The document describes using evolutionary operation (EVOP) to optimize biodiesel production yield through incremental changes to methanol to oil ratio and sodium hydroxide catalyst concentration. It details conducting a first phase, cycle 1 experiment using a 2^2 factorial design at current operating conditions and high/low settings for each factor. Initial yields average 88.8%, with the maximum obtained at a methanol ratio of 0.34 and catalyst concentration of 0.95%.
The document discusses optimization in pharmaceutical formulation and processing. It defines optimization as choosing the best alternative from available options. Optimization in pharmacy involves formulating drug products using the best combination of ingredients and processing parameters. Experimental design techniques are used to optimize multiple variables. Response surface methodology and central composite designs are commonly used to model quadratic relationships between variables. The document outlines different types of experimental designs and their applications in pharmaceutical optimization.
Six Sigma - A Presentation by Akshay Anand
Six Sigma is a quality management system that aims for near perfection. It uses statistical methods and process improvement tools to identify and remove defects. There are two main methodologies - DMAIC which improves existing processes, and DMADV which designs new processes. DMAIC involves defining a problem, measuring metrics, analyzing causes, improving the process, and controlling future performance. DMADV defines goals, measures customer needs, analyzes design options, designs an improved process, and verifies it meets requirements. Many major companies use Six Sigma to reduce costs and errors through projects led by Green Belts and Black Belts trained in its methods.
Introduction to Design of Experiments by Teck Nam Ang (University of Malaya)
This set of slides explains in a simple manner the purpose of experiment, various strategies of experiment, how to plan and design experiment, and the handling of experimental data.
This presentation discusses statistical control charts which are tools used in pharmaceutical manufacturing to determine if a process is in statistical control. It defines control charts and explains that they provide a visual representation to monitor a process and identify instances where the process may be going out of control. The presentation covers the objectives, principles, types of control charts including variable and attribute charts, their characteristics and benefits such as improving quality, productivity and reducing defects. It also discusses using control charts to evaluate process capabilities.
This document discusses process capability analysis, which relates a production process's variability to customer specifications to determine if the process is capable of meeting requirements. It defines key terms like critical-to-quality characteristics, control charts, process capability indices Cp and Cpk. Cp measures a process's potential capability if centered on target, while Cpk considers deviation of the mean. For a process to be capable, its natural variation (control limits) must be narrower than specifications. If Cpk=1 the process is barely capable, and if Cpk<1 the process is incapable and requires improvement. Process capability analysis assumes an in-control, stable production process.
This document discusses deviations and out-of-specification/out-of-trend results in the pharmaceutical industry. It defines deviations as unwanted events that differ from approved processes or standards. Deviations are classified as major, critical, or minor depending on their impact. Out-of-specification results occur when test results fall outside predetermined specifications, requiring investigation. Out-of-trend results differ from historical results but are still within specifications, also necessitating investigation. The document provides examples of planned and unplanned deviations as well as approaches to minimize out-of-specification results through good practices.
The drug or drug combination may not be official in any pharmacopoeias.
A proper analytical procedure for the drug may not be available in the literature due to patent regulations.
Analytical methods may not be available for the drug in the form of a formulation due to the interference caused by the formulation excipients.
Analytical methods for the quantitation of the drug in biological fluids may not be available.
Analytical methods for a drug in combination with other drugs may not be available.
The existing analytical procedures may require expensive reagents and solvents. It may also involve cumbersome extraction and separation procedures and these may not be reliable.
The document discusses validation of analytical procedures. It defines validation as establishing by laboratory studies that an analytical procedure meets requirements for its intended use. It describes the typical steps in validating identification tests, quantitative impurity tests, and assays. Key validation characteristics discussed include specificity, linearity, range, accuracy, precision, detection and quantitation limits, robustness, and ruggedness. The guidelines provide details on establishing each characteristic to help ensure analytical methods are suitable for their intended pharmaceutical applications.
This document provides an overview of the methodology used in clinical trials. It defines key terms like randomized controlled trials, control groups, randomization, and blinding. It describes the various phases of clinical trials including phases 1-3 and post-marketing studies. Phase 1 trials test safety in healthy volunteers while phases 2 and 3 test efficacy in larger patient populations. The goals of each phase are explained as well as important demographic information. The document also outlines the drug development process from preclinical research through regulatory approval and commercialization.
Experimental methods are widely used in industrial settings and research activities. In industrial settings, the main goal is to extract the maximum amount of unbiased information regarding the factors affecting the production process from a few observations, whereas in research, ANOVA techniques are used to reveal the reality. Drawing inferences from the experimental results is an important step in the design process of a product. Therefore, proper planning of experimentation is the precondition for accurate conclusions drawn from the experimental findings. Design of experiments is a powerful statistical tool introduced by R. A. Fisher in England in the early 1920s to study the effect of different parameters on the mean and variance of a process performance characteristic.
Taguchi's orthogonal arrays are highly fractional orthogonal designs. These designs can be used to estimate main effects using only a few experimental runs.
Consider the L4 array shown in the next figure. The L4 array is denoted as L4(2^3).
L4 means the array requires 4 runs. 2^3 indicates that the design estimates up to three main effects at 2 levels each. The L4 array can be used to estimate three main effects using four runs, provided that the two-factor and three-factor interactions can be ignored.
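A minimal sketch of the L4(2^3) array and the main-effect estimates it supports; the four response values are invented for illustration:

```python
# The L4(2^3) orthogonal array: 4 runs covering 3 two-level factors,
# coded -1 (low) / +1 (high). Any two columns are orthogonal, so the
# three main effects can be estimated independently from only 4 runs
# (assuming two- and three-factor interactions are negligible).
L4 = [
    (-1, -1, -1),
    (-1, +1, +1),
    (+1, -1, +1),
    (+1, +1, -1),
]

y = [12.0, 16.0, 20.0, 22.0]  # made-up responses, one per run

def main_effect(col):
    # average response at the high level minus average at the low level
    high = [y[i] for i, row in enumerate(L4) if row[col] == +1]
    low = [y[i] for i, row in enumerate(L4) if row[col] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

print([main_effect(c) for c in range(3)])  # [7.0, 3.0, 1.0]
```

Note that a full 2^3 factorial would need 8 runs; the L4 array halves that at the cost of confounding interactions with main effects.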
This document provides information about a case study on the blending process for a pharmaceutical formulation. It includes:
1) Details of the formulation and factors being studied (mixing time, magnesium stearate concentration, and talc concentration) to evaluate blend uniformity.
2) Descriptions of key concepts for experimental design including treatments, experimental units, responses, and interactions.
3) Discussion of blocking as a technique to reduce nuisance factors like different batches of active ingredients being studied.
The document discusses design of experiments (DOE) and provides details about:
1) DOE is a process optimization technique that relies on planned experimentation and statistical analysis to study multiple factors and their interactions.
2) Traditional experimentation methods study one factor at a time and ignore interactions, while DOE allows studying multiple factors and interactions using fewer experiments.
3) Steps for DOE include defining objectives, factors, responses, levels, and designing the experiment using full or fractional factorial designs such as orthogonal arrays.
2003 work climate and organizational effectiveness - the application of data en... - Henry Sumampau
This document discusses using data envelopment analysis (DEA) to measure organizational effectiveness when multiple inputs and outputs are involved. DEA calculates a single efficiency measure for each organization being analyzed without requiring the researcher to assign subjective weightings to inputs and outputs. The document argues that previous research on the relationship between organizational climate and effectiveness has been limited by analyzing multiple separate measures of effectiveness. It suggests that using DEA to generate a single efficiency measure for each organization can help overcome these limitations and provide a more comprehensive analysis of the climate-effectiveness relationship when multiple dimensions of effectiveness are involved. An example application of DEA to measure the relative efficiencies of branch offices in a retail banking network is described.
The document discusses various location strategy considerations for operations management. It covers factors that affect location decisions such as labor productivity, exchange rates, political risks, and proximity to markets/suppliers. Methods for evaluating location alternatives are described, including the factor-rating method, locational break-even analysis, and center-of-gravity method. Specific location strategies for different industries like hotels, call centers, and how companies use geographic information systems are also summarized.
Practical Sustainability
Threat
Opportunity
Why should Businesses care?
ETS
Levies
Trade Barriers
Legislation
Environmental Watchdogs
Spills
Global Warming
Threats
LOHAS
Competitive Advantage
Clean Green NZ
100% Pure
Opportunities
Complexity
Multiple Stakeholders
Management Attitude
A system for planning, implementing, reviewing and improving the actions an organization takes to meet its environmental obligations.
Environmental Management System
Practical Guide #1
Be Holistic
Include all aspects of your business in the system
Practical Guide #2
Be Systematic
Break the system into manageable segments
Practical Guide #3
Be Inclusive
Delegate responsibility of each segment to its main stakeholders
Practical Guide #4
Manage Projects
Treat each segment as a project and break it down to manageable objectives
Practical Guide #5
Be Transparent
Communicate to all stakeholders regularly and provide them with access to all the information required
Practical Guide #6
Match International Standards
Work towards and acquire international standards that matter to your industry and customers
Practical Guide #7
Tell everyone
Communicate your programme objectives and your achievements to all your stakeholders
Some Tools
https://www.youtube.com/watch?v=MdZwuR0daso
https://www.youtube.com/watch?v=cYOC8_jJcII
Step 1: Goal Definition & Scope (ISO 14040)
Step 2: Inventory Analysis (ISO 14041)
Step 3: Impact Assessment (ISO 14042)
Step 4: Improvement Assessment / Interpretation (ISO 14043)
Life Cycle Assessment (LCA)
Domestic Coffee Maker Example
Source: http://home.howstuffworks.com/coffee-maker.htm
Step 1: Goal Definition & Scope
Establish purpose & goal
Define decision criteria, function & functional unit
Define system boundaries
Life cycle stages
Time
Place
Determine required data quality
Step 1: Coffee Maker
Purpose of LCA?
Determine how to improve the environmental performance of a coffee maker
Decision criteria?
Total energy consumed, equivalent CO2 produced, eco-indicator 99 score
Function of coffee maker? Functional units?
Cups of coffee poured, Time coffee is warmed
System boundaries?
Five years of use, Europe, production, use & end-of-life stages
Difficulties & Limitations of Step 1
How do you compare different products that provide similar functions or services?
How do you compare similar products that provide multiple functions or services?
How do you define more abstract functional units such as entertainment from toys or higher self-esteem?
Where do you stop drawing the bounds to your system?
Step 2: Inventory Analysis
Make process tree or flow chart classifying events in a product’s life cycle
Determine all mass and energy inputs and outputs
Collect relevant data
Make assumptions for missing data
Establish (correct) material and energy balances.
Lean & Agile Project Management: Its Leadership Considerations - David Rico
The document provides an overview of lean and agile project management and its leadership considerations. It begins with introducing the author's background and credentials in agile project management. It then discusses the need for agile project management by highlighting the high failure rates of traditional projects. The rest of the document outlines an agenda for covering topics including an introduction to agile project management, the model of agile project management, the phases of agile project management, scaling agile project management, and metrics for agile project management.
Identifying rebound effects in product-service systems: actors, mechanisms, t... - Daniel Guzzo
Paper title: Identifying rebound effects in product-service systems: actors, mechanisms, triggers and drivers
Authors: Daniel Guzzo, Daniela C.A. Pigosso
Abstract:
The implementation of product-service systems (PSS) is prone to the occurrence of rebound effects (RE). This research aims to systematically identify the rebound mechanisms in a PSS context. Through the case study of a use-oriented PSS offer, we showcase a structured way to address RE that led to a comprehensive mapping of 23 mechanisms. The analysis demonstrates an approach to mapping rebound triggers, drivers, and mechanisms within the actors’ realms that designers can apply to ensure the potential sustainability gains of PSS offers.
DESIGN 2024 Conference presentation
This study uses an integrated assessment model to analyze climate policy approaches based on different ethical frameworks like utilitarianism, prioritarianism, and compromise. The model was run under multiple scenarios and the outcomes evaluated based on emissions control, costs, temperature change, and damages. The results showed that prioritarian frameworks led to steadier emissions reductions than Nordhaus or Stern, but all approaches exceeded the 2 degree warming limit. Prioritarianism presented the best balance of outcomes and aligned more with ethical considerations, but relied heavily on how damages were modeled. Further research is needed on prioritarian modeling and statistical analysis of the policy approaches.
This document contains information about an assignment for a Level 2 BTEC First Certificate in Applied Science course. The assignment requires students to:
1) Conduct a practical investigation into the physical properties of copper, glass, and brick and explain how these properties determine their uses.
2) Identify if samples of chemicals are solids, liquids, or gases at room temperature and pressure.
3) Explain why bricks are used for construction, copper wire for electricity, and glass for windows based on their physical properties.
The assignment aims to demonstrate how the physical properties of materials determine their applications. Students must provide evidence that meets criteria on practical investigations and explaining material properties and uses.
Corenet montreal 2_13_08_nanotech_materials - Ahmad Rashwan
The document discusses how nanotechnology and nanomaterials can be used in sustainable construction. It begins by providing context on nanotechnology and its potential applications in construction materials like steel, concrete, glass, drywall, fabrics, coatings, and insulation. It then discusses a case study of using a sound-dampening nanogel material called ISOPods to provide acoustic privacy in open-plan offices without walls. The document concludes by giving an example of using aerogel, a transparent insulating nanomaterial, for skylight panels and insulation.
This document outlines an agenda and presentation on advanced project schedule risk analysis. It discusses how Monte Carlo simulation can be used to model schedule risk by assigning probability distributions to activity durations, accounting for risk along sequential paths and at merge points where parallel paths converge. It highlights the "merge bias" effect where having multiple parallel paths can increase overall schedule risk due to the increased likelihood of delay on at least one path. The document provides examples of applying duration probability distributions and simulation to simple schedules to demonstrate these concepts.
2017 GRESB Real Estate Results presentation for Canada, presented on 5 October in Toronto, hosted by Oxford Properties, with Industry Partners REALPAC and CaGBC, and sponsored by GRESB Global Partner Delos
Slide 36: WSP
Slide 45: Delos
Side 72: Quinn & Partners
The document summarizes Catherine Michelle Rose's PhD thesis from Stanford University on formulating product end-of-life strategies. It discusses her research on design for environment and the hierarchy of end-of-life strategies from reuse to recycling to disposal. The document also explains Philips Consumer Electronics' process for environmental impact analysis of products, which involves life cycle assessment tools to examine impacts across a product's entire lifecycle.
This document provides guidance on completing an Initial Study for a proposed project under the California Environmental Quality Act (CEQA). It discusses the purpose and contents of an Initial Study, including the project description, environmental setting, and checklist questions. It also covers the processes for adopting a Negative Declaration or Mitigated Negative Declaration, including public review requirements. Key points include substantiating all conclusions in an Initial Study with substantial evidence, and including adequate mitigation measures in a Mitigated Negative Declaration to reduce impacts to less-than-significant levels.
Readiness of Indian Export Sector and Strategy to deal with Climate Change Po... - mehtavd
Globally there has been an acceptance ( albeit of varying degrees) that human
induced climate change needs to be addressed. In the absence of a global
agreement which is not yet in sight, various countries, regions, and private
companies have taken commitments and are at different levels of implementing
them. Europe has been the leader from a policy point of view followed by some other
countries and states in the USA. There have been unilateral policy initiatives by
private players like Walmart, GE etc. where they are asking for disclosures on
Carbon and other environmental parameters from their suppliers.
The Indian scenario has been one of taking small steps to address climate change mitigation and there are no requirements to report or improve on carbon footprint on
organizations. The Indian export sector which is dependent largely on the western
markets has grown more than 25% annually and is thus the first to be affected by
any climate change related policy initiative by the governments or buyer companies
in the west.
This dissertation studies the current policy initiatives from publicly available data both
in the west and India. As part of the research a survey was conducted using a
questionnaire to assess levels of Awareness, Action taken, perception of Risk and
Opportunity and Market Behaviour. There is a detailed analysis of the data collected
in the survey which points to the fact that a lot needs to be done to remain
competitive in a carbon constrained world order. Based on the analysis there is a
strategy proposed for the Indian export sector to retain their competitive advantage
with regards to the upcoming policy initiatives around climate change.
Mr. Sanket Chordiya presented on optimization techniques like factorial design and fractional factorial design. He introduced key terminology used in design of experiments like factors, levels, responses, effects and interactions. Full factorial design involves studying all possible factors and levels, while fractional factorial design is used when there are many factors to reduce the number of experiments. Software like Design-Expert can be used to design factorial experiments and analyze results. Factorial designs find applications in formulation, processing, and studying pharmacokinetic parameters. A case study on sustained release metformin tablets was presented to illustrate a 2^3 factorial design.
This document discusses risk evaluation and management in exploration and production projects. It emphasizes the importance of integrated data management, analysis, and visualization in reducing risks. Key aspects of risk include reservoir, trap and hydrocarbon risks, which depend on understanding geological processes. The document outlines the typical workflow in E&P projects, including data collection, mapping, interpretation, modeling, and risk analysis. It argues that integrating tools like seismic inversion, well log analysis, and basin modeling at different stages can help transform data into useful knowledge and reduce project risks.
The document discusses tools and methods for researchers to identify quality journals for publication, avoid predatory journals, and properly evaluate research impact. It covers identifying good journals using tools like UGC Care, Scimago, and DOAJ. It also discusses how to identify predatory journals and fake impact factors. The document reviews citation analysis methods like the h-index and i10-index and how to calculate journal impact factors. It concludes with information on publication ethics and avoiding plagiarism.
Introduction to DI Engineering Explorer for Oilfield Services - Drillinginfo
DI Engineering Explorer is a proprietary tool that allows users to visualize completion and production data from over 130,000 wells across 14 states. It saves customers over $800,000 and a year of man-hours by providing well-level analysis and insights. The tool finds correlations between completion techniques and production metrics and identifies the specific materials used in the highest producing wells. It helps users find the most profitable opportunities by understanding what completion strategies are most effective.
Wellbriefing: Creating a building brief that helps clients articulate and pri... - Atkins
Wellbeing is starting to be recognised as an important metric for citizens, employees and, increasingly, building occupants. Caroline Paradise, head of design research at Atkins, gave a 'Wellbriefing' at Vision 2015 in a seminar on Building Health and Wellness, where she talked about creating a building brief that helps clients articulate and prioritise wellbeing.
This presentation was first delivered in June 2015 at Vision 2015 in London.
Predictably Improve Your B2B Tech Company's Performance by Leveraging Data - Kiwi Creative
Harness the power of AI-backed reports, benchmarking and data analysis to predict trends and detect anomalies in your marketing efforts.
Peter Caputa, CEO at Databox, reveals how you can discover the strategies and tools to increase your growth rate (and margins!).
From metrics to track to data habits to pick up, enhance your reporting for powerful insights to improve your B2B tech company's marketing.
- - -
This is the webinar recording from the June 2024 HubSpot User Group (HUG) for B2B Technology USA.
Watch the video recording at https://youtu.be/5vjwGfPN9lw
Sign up for future HUG events at https://events.hubspot.com/b2b-technology-usa/
Codeless Generative AI Pipelines
(GenAI with Milvus)
https://ml.dssconf.pl/user.html#!/lecture/DSSML24-041a/rate
Discover the potential of real-time streaming in the context of GenAI as we delve into the intricacies of Apache NiFi and its capabilities. Learn how this tool can significantly simplify the data engineering workflow for GenAI applications, allowing you to focus on the creative aspects rather than the technical complexities. I will guide you through practical examples and use cases, showing the impact of automation on prompt building. From data ingestion to transformation and delivery, witness how Apache NiFi streamlines the entire pipeline, ensuring a smooth and hassle-free experience.
Timothy Spann
https://www.youtube.com/@FLaNK-Stack
https://medium.com/@tspann
https://www.datainmotion.dev/
milvus, unstructured data, vector database, zilliz, cloud, vectors, python, deep learning, generative ai, genai, nifi, kafka, flink, streaming, iot, edge
4th Modern Marketing Reckoner by MMA Global India & Group M: 60+ experts on W... - Social Samosa
The Modern Marketing Reckoner (MMR) is a comprehensive resource packed with POVs from 60+ industry leaders on how AI is transforming the 4 key pillars of marketing – product, place, price and promotions.
STATATHON: Unleashing the Power of Statistics in a 48-Hour Knowledge Extravag... - sameer shah
"Join us for STATATHON, a dynamic 2-day event dedicated to exploring statistical knowledge and its real-world applications. From theory to practice, participants engage in intensive learning sessions, workshops, and challenges, fostering a deeper understanding of statistical methodologies and their significance in various fields."
End-to-end pipeline agility - Berlin Buzzwords 2024 - Lars Albertsson
We describe how we achieve high change agility in data engineering by eliminating the fear of breaking downstream data pipelines through end-to-end pipeline testing, and by using schema metaprogramming to safely eliminate boilerplate involved in changes that affect whole pipelines.
A quick poll on agility in changing pipelines from end to end indicated a huge span in capabilities. For the question "How long time does it take for all downstream pipelines to be adapted to an upstream change," the median response was 6 months, but some respondents could do it in less than a day. When quantitative data engineering differences between the best and worst are measured, the span is often 100x-1000x, sometimes even more.
A long time ago, we suffered at Spotify from fear of changing pipelines due to not knowing what the impact might be downstream. We made plans for a technical solution to test pipelines end-to-end to mitigate that fear, but the effort failed for cultural reasons. We eventually solved this challenge, but in a different context. In this presentation we will describe how we test full pipelines effectively by manipulating workflow orchestration, which enables us to make changes in pipelines without fear of breaking downstream.
Making schema changes that affect many jobs also involves a lot of toil and boilerplate. Using schema-on-read mitigates some of it, but has drawbacks since it makes it more difficult to detect errors early. We will describe how we have rejected this tradeoff by applying schema metaprogramming, eliminating boilerplate but keeping the protection of static typing, thereby further improving agility to quickly modify data pipelines without fear.
ViewShift: Hassle-free Dynamic Policy Enforcement for Every Data Lake - Walaa Eldin Moustafa
Dynamic policy enforcement is becoming an increasingly important topic in today’s world where data privacy and compliance is a top priority for companies, individuals, and regulators alike. In these slides, we discuss how LinkedIn implements a powerful dynamic policy enforcement engine, called ViewShift, and integrates it within its data lake. We show the query engine architecture and how catalog implementations can automatically route table resolutions to compliance-enforcing SQL views. Such views have a set of very interesting properties: (1) They are auto-generated from declarative data annotations. (2) They respect user-level consent and preferences (3) They are context-aware, encoding a different set of transformations for different use cases (4) They are portable; while the SQL logic is only implemented in one SQL dialect, it is accessible in all engines.
#SQL #Views #Privacy #Compliance #DataLake
2. What is a Designed Experiment?
A systematic procedure carried out under controlled conditions in order to discover an unknown effect, to test or establish a hypothesis, or to illustrate a known effect.
7/10/2020 Author: Mark D. Harrison 2
3. Why use DOE?
Reduce time to design/develop new products and processes
Compare Alternatives
Identify significant Inputs (Factors) affecting an Output (Response), also known as "Separating the Vital Few from the Trivial Many"
Reduce Variability
Establish a Minimum, Maximum or Target of a Response
Achieve Product and Process Robustness
Balance tradeoffs
4. A Brief History of DOE
1918 – 1940s
R. A. Fisher and co-workers
Used extensively in Agricultural science
Factorial Design and ANOVA
1950s to late 1970s
1st Industrial era
Box & Wilson, Response Surfaces
Chemical and Process Industries
1970s – 1990s
2nd Industrial era
Quality improvement initiatives in many industries
CQI and TQM became company goals
5. Approaches to Experimentation
Trial and Error
Change One Separate Factor at a Time (C.O.S.T.)
Design of Experiments (DOE)
9. pH of Solution
pH is incremented by 0.5 units
You can see the highest yield is at about a pH of 4.5
10. Plot the Data
Point #10 is what we think is the highest yield
11. The Real Process
Our experiment location is on the edge of the highest yield possible
12. What DOE shows us
Our C.O.S.T. optimum is not the process optimum
Which direction to experiment next
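The trap illustrated on these slides can be sketched numerically. The yield surface below is an invented two-factor function of pH and temperature with an interaction term; the function, factor ranges, and step sizes are illustrative assumptions, not the process from the slides. Sweeping pH first and then temperature (C.O.S.T.) stalls short of the best combination on the same grid:

```python
# Hypothetical yield surface with a pH/temperature interaction. Because the
# ridge of high yield runs diagonally, changing one factor at a time from a
# fixed starting point can stall at a local best.
def yield_pct(ph, temp):
    return 90 - (ph - 4.0) ** 2 - (temp - 80) ** 2 + 0.9 * (ph - 4.0) * (temp - 80)

phs = [2.0 + 0.5 * i for i in range(11)]      # pH 2.0 .. 7.0 in 0.5 steps
temps = [60.0 + 5.0 * i for i in range(9)]    # 60 .. 100 degrees in 5-degree steps

# C.O.S.T.: hold temperature at 70, sweep pH; then fix that pH and sweep temperature.
best_ph = max(phs, key=lambda p: yield_pct(p, 70))
best_temp = max(temps, key=lambda t: yield_pct(best_ph, t))
cost_yield = yield_pct(best_ph, best_temp)

# Exhaustive grid over the same levels (a fitted DOE model would point to
# this region with far fewer runs):
doe_yield = max(yield_pct(p, t) for p in phs for t in temps)

print(cost_yield, doe_yield)  # the C.O.S.T. result falls short of the grid optimum
```

With these made-up numbers the one-factor-at-a-time search ends at a lower yield than the true optimum on the very same grid, which is exactly the "edge of the highest yield possible" picture above.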
13. DOE vs C.O.S.T.
Better approach to experimenting
DOE suggests the # of runs, usually fewer than C.O.S.T.
DOE provides a model for the direction to follow
Many factors can be used, not just two
Benefits of DOE
An organized approach that connects experiments in a
rational manner
The influence of and interactions between all factors can be
estimated
More precise information is acquired in fewer experiments
Results are evaluated in the light of variability
Support for decision-making: map of the system (response
contour plot)
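A rough sketch of the run-count claim: a two-level full factorial needs 2**k runs for k factors, while a two-level fractional (screening) design can get by with the smallest power of two that still has at least k + 1 runs. These are the standard textbook counts; the specific k values in the loop are arbitrary examples.

```python
import math

# Two-level full factorial: every combination of k factors -> 2**k runs.
def full_factorial_runs(k):
    return 2 ** k

# Two-level fractional (screening) design: smallest 2**n with at least
# k + 1 runs, so there are enough columns for k factors plus the mean.
def fractional_runs(k):
    return 2 ** math.ceil(math.log2(k + 1))

for k in (3, 7, 15):
    print(f"{k} factors: full factorial {full_factorial_runs(k)} runs, "
          f"fractional {fractional_runs(k)} runs")
```

Note the trade-off: the fractional design screens many factors cheaply, but some interactions become confounded with main effects, which is why a confirmation run matters.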
14. Components of DOE
Factors (Controllable Variables)
Variables you can change (Time, Temperature, etc.)
Levels
Where Factors are set (2 minutes, 100 degrees, etc.)
Responses
Outputs of your process (yield, dimension, weight, etc.)
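The three components map directly to a run list. A minimal sketch, using the slide's example factors and levels (the specific names and values are illustrative):

```python
from itertools import product

# Factors and their levels, echoing the slide's examples (illustrative values).
factors = {
    "Time":        ["2 min", "4 min"],
    "Temperature": ["100 deg", "150 deg"],
}

# Full factorial run list: one run per combination of levels.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for i, run in enumerate(runs, start=1):
    print(i, run)   # the Response (e.g. yield) would be recorded for each run
```

Two factors at two levels gives 2 x 2 = 4 runs; each Response measured against these runs is what the analysis later models.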
15. The DOE Process
Define the Objective
Define the Process and select Factors to be studied
Select a Response and Measurement system
Select an Experimental Design
Execute Experiments accurately
Check results for any issues
Model data
Verify predicted results with confirmation
experiments to validate model
20. The Minitab Tutorial
Purpose
Investigate two factors that might decrease the time needed
to prepare an order for shipment: the order-processing
system and the packing procedure.
Factors
Order Processing System
Packing Procedures
Levels
Current/Proposed
A / B
Responses
Time to prepare order for shipment
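A rough sense of what Minitab computes for this 2x2 design can be worked by hand. The shipment times below are invented for illustration only; they are not the tutorial's dataset. Each (system, procedure) cell has two replicate runs:

```python
import statistics

# Invented replicate times (hours) for each (order-processing system,
# packing procedure) combination -- NOT Minitab's tutorial data.
data = {
    ("Current",  "A"): [14.7, 15.3],
    ("Current",  "B"): [12.1, 11.9],
    ("Proposed", "A"): [13.2, 12.8],
    ("Proposed", "B"): [9.0, 9.4],
}
m = {cell: statistics.mean(times) for cell, times in data.items()}

# Main effect: average response at one level minus the average at the other.
system_effect = ((m["Proposed", "A"] + m["Proposed", "B"]) / 2
                 - (m["Current", "A"] + m["Current", "B"]) / 2)
packing_effect = ((m["Current", "B"] + m["Proposed", "B"]) / 2
                  - (m["Current", "A"] + m["Proposed", "A"]) / 2)
# Interaction: does the packing effect change between the two systems?
interaction = ((m["Proposed", "B"] - m["Proposed", "A"])
               - (m["Current", "B"] - m["Current", "A"])) / 2

print(round(system_effect, 2), round(packing_effect, 2), round(interaction, 2))
```

Negative effects mean shorter shipment time, so with these made-up numbers both the proposed system and procedure B reduce the response; Minitab's ANOVA output then tells you whether such effects are statistically significant.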
44. DOE Software Providers
Minitab
Minitab 18 Data Analysis
Minitab Quality Companion Project Management
SAS JMP
SAS JMP Data Analysis
Statgraphics
Centurion 18
These companies provide free 30-day trials of their software
45. DOE References
One Factor at a Time vs DOE – Veronica Czitrom
Design of Experiments – Moresteam
DOE – Upendra Kartik
Minitab Blog
DOE – Umetrics