This webinar looks at answering this question not by going deeply into the various designed-experiment types, but from a process improvement perspective. It progresses from a definition of a designed experiment to: Why and when do I need a designed experiment? What's the concept (and why can't I do a "one-factor-at-a-time" series of experiments)? Will this tool solve real-world problems?
2. ASQ Reliability Division
English Webinar Series
One of the monthly webinars on topics of interest to reliability engineers.
To view recorded webinars (available to ASQ Reliability Division members only) visit asq.org/reliability
To sign up for the free live webinars (open to anyone) visit reliabilitycalendar.org and select English Webinars to find links to register for upcoming events:
http://reliabilitycalendar.org/The_Reliability_Calendar/Webinars_-_English/Webinars_-_English.html
3. Why do a Designed
Experiment?
- What is it?
- Why do I need it?
- How do I do it?
- Can it solve my problem?
9/8/2011 Jim Breneman
4. What is Design of Experiments?
• A designed experiment is a test or series of tests in which purposeful changes are made to the input variables of a process or system so that we may observe and identify the reasons for changes in the output response…
Doug Montgomery
5. Why do I need a Designed experiment?
General Model of a Process or System
[Diagram: inputs x1, x2, …, xp enter the process (the transformation vehicle) and emerge as the output y. Controllable factors w1, w2, …, wp act on the process from one side; uncontrollable (noise) factors z1, z2, …, zq act from the other.]
We often want to minimize, maximize, or reduce variability in the output (y).
6. Where does a Designed Experiment fit in?
Let's look at Deming's PDCA process and then the DMAIC process.
[Diagram: Deming's Plan–Do–Check–Act cycle shown beside the Define–Measure–Analyze–Improve–Control cycle; the major DOE use is marked in the Measure & Analyze / Improve portion.]
7. Step 1: What's the objective of Your Experiment?
• Comparative objective:
– Primary goal is to make a conclusion about one a-priori important factor.
– i.e., is this factor "significant" (and possibly what level maximizes or minimizes the response)?
• Process Improvement objective (Sequential experimentation):
– Step 1: Screening… the primary purpose of the 1st experiment is to select or screen out the few important main effects from the many less important ones.
– Step 2: Followed up with experiment(s) to define the important 2-factor interactions (and any 3-factor interactions that may be important based on experience).
• Response Surface objective:
– The experiment is designed to allow us to estimate interaction (and even quadratic) effects, and to optimize the response or responses.
– Each factor is usually at 3 levels.
8. Step 2: What level of Evidence will I accept?
I. Controlled Trials with complete randomization. (DOE)
II. Empirical Evidence.
a) Controlled Trials without complete randomization.
b) Case directed studies: carefully observing cases as they occur.
c) Multiple Time Sequence Studies (looking back through data files for patterns and drastic changes).
"Scientific studies have shown…"
III. Delphi (agreement between a group of knowledgeable "experts").
"8 out of 10 doctors recommend…"
IV. Personal Anecdote ("In my experience…")
V. Personal bullying ("I think we should do it this way")
9. Step 3: The DOE Roadmap
D: Brainstorm → Choose variables & levels → Design experiment
M: Run experiment & collect data
A: Analyze data / Interpret results
I: Run confirmation test
C: Incorporate into design or process
(Sequential experimentation: loop back through the design–run–analyze steps as needed.)
10. The Strategy of Experimentation
1. Screening experiments to find the mountain range.
2. Factorial/fractional factorial experiments to get close to the peak.
3. Response Surface Modeling to "climb" it.
[Figure: a region of interest inside the larger region of operability.]
11. Review of Basics
Language
• Factor: An independent variable. This is what we control and change in an experiment. A factor is often generically referred to as xi.
Examples: Reaction Temperature, Bake Time, Fuel Flow, Stress
• Factor Setting or Level: A particular value for a factor.
For example, the factor Bake Temperature might have a setting of 275° F. Bake Temperature is the factor; 275° F is one of the levels.
• Experimental Run: A particular combination of factor settings.
For example, one run in an experimental design (for, say, a composite piece) might call for a Bake Temperature of 275° F, a Bake Time of 30 minutes, and a Bake Pressure of 5 atm.
12. Review of Basics
Language
• Experimental Design: The complete set of runs that we plan to do. It is sometimes called the Design Matrix. Experimental design in general is often referred to as DOE or DOX (Design of Experiments).
• Response: A dependent variable. The level of the response is measured rather than controlled like a factor. It is referred to as a response because we think its level will change in response to changes in the factor settings. One of the goals of experimental design is to relate changes in the factor levels to measured changes in the response values.
Examples of Typical Response Variables: Tensile Strength, Elongation, Thrust
• Factor Effect: The way in which changes in the level of a factor translate to changes in the response level.
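The design-matrix idea above can be sketched in a few lines of Python. The factor names and levels below are illustrative (borrowed from the composite-bake example on this slide); a full-factorial design matrix is simply every combination of the factor levels:

```python
from itertools import product

# Hypothetical factors and levels, borrowed from the bake example above
factors = {
    "bake_temp_F": [250, 275],      # Bake Temperature at two levels
    "bake_time_min": [30, 45],      # Bake Time at two levels
    "bake_pressure_atm": [5, 10],   # Bake Pressure at two levels
}

# The design matrix is the complete set of planned runs: here, every
# combination of factor settings (a 2^3 full factorial -> 8 runs)
design_matrix = [dict(zip(factors, combo)) for combo in product(*factors.values())]

for run_number, run in enumerate(design_matrix, start=1):
    print(run_number, run)
```

Each printed line is one experimental run in the vocabulary above: a particular combination of factor settings.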
13. Review of Basics
Language
• Interaction: When the effect of a factor depends on the level of another factor, the two factors are said to interact: Life = f(Stress, Temp)
[Figure: Life (hrs) vs. Stress, plotted at two temperatures, Temp 1 and Temp 2.]
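The "effect depends on the level of another factor" idea can be quantified directly: the interaction is a difference of differences. A minimal sketch with invented life data (all numbers are illustrative, not from the slide's figure):

```python
# Hypothetical life data (hrs) at two stress levels and two temperatures
life = {
    ("low_stress", "temp1"): 900, ("high_stress", "temp1"): 700,
    ("low_stress", "temp2"): 500, ("high_stress", "temp2"): 100,
}

# Effect of raising stress, computed separately at each temperature
stress_effect_t1 = life[("high_stress", "temp1")] - life[("low_stress", "temp1")]  # -200 hrs
stress_effect_t2 = life[("high_stress", "temp2")] - life[("low_stress", "temp2")]  # -400 hrs

# Interaction: does the stress effect change with temperature?
# Nonzero => Stress and Temp interact (the curves in the figure are not parallel)
interaction = stress_effect_t2 - stress_effect_t1  # -200 hrs
print(stress_effect_t1, stress_effect_t2, interaction)
```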
14. General Observations of DOEs
In general:
1. Several factors (or main effects) will be significant.
2. Some two-factor interactions will be significant.
3. Very few (if any) three-factor and higher-order interactions will be significant.
Concentrate on main effects and 2-factor interactions in your experiments. However, if a three-factor interaction is perceived to exist, then include it in the experiment!
15. Why Do Statistically Designed Experiments?
• Statistically designed experiments can detect and describe factor-factor interactions. Experiments that vary only one factor at a time and trial-and-error experiments cannot.
• Statistically designed experiments offer more precise estimates of factor effects for the same number of runs compared to a one-factor-at-a-time (ladder) study. This is because DOEs use "hidden replication" and the power of averaging to see through noise.
Let me illustrate this with an engineering example.
16. DOE vs One-Factor-at-a-Time (OFAT)
An engineer performed an experiment on a new piece of equipment as a function of three factors:
• Cooling Temp (°F)
• Air Temperature (°F)
• Metal Temperature (°F)
The objective was to maximize the response (y variable); in this case, part life.
The engineer performed the experiment one-factor-at-a-time for three factors in 15 runs:

Cooling Temp (°F)  Air Temp (°F)  Metal Temp (°F)
 600   2500   1900
 700   2500   1900
 800   2500   1900
 900   2500   1900
1000   2500   1900
 800   2300   1900
 800   2400   1900
 800   2500   1900
 800   2600   1900
 800   2700   1900
 800   2500   1700
 800   2500   1800
 800   2500   1900
 800   2500   2000
 800   2500   2100
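To make the structure of this OFAT plan explicit, the 15 runs can be generated as three one-factor sweeps around the baseline (800, 2500, 1900); the code below merely restates the table above:

```python
# Baseline settings and the one-factor-at-a-time sweeps from the table above
baseline = {"cooling_F": 800, "air_F": 2500, "metal_F": 1900}
sweeps = {
    "cooling_F": [600, 700, 800, 900, 1000],
    "air_F": [2300, 2400, 2500, 2600, 2700],
    "metal_F": [1700, 1800, 1900, 2000, 2100],
}

# Each run changes exactly one factor away from the baseline
runs = [{**baseline, factor: level}
        for factor, levels in sweeps.items()
        for level in levels]
print(len(runs))
```

Note that the baseline combination appears three times (once per sweep), and every run lies on one of three axes through the baseline, which is why this plan cannot estimate interactions.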
17. DOE vs One-Factor-at-a-Time
Illustrating this, we see that we can estimate each single factor's effect, both linear & quadratic; however, we cannot estimate interactions.
[Figure: the 15 OFAT runs plotted in the factor space, with Cooling Temp (°F) from 600 to 1000, Air Temp (°F) from 2300 to 2700, and Metal Temp (°F) on the third axis; the points lie along three axes through the baseline.]
18. DOE beats OFAT – Round 1
The Box-Behnken designed experiment shown here and in
the accompanying figure could have been performed
instead. Both the one-factor-at-a-time and the designed
experiment have 15 runs (if three center points are used
in the Box-Behnken design to make the design rotatable
and to provide an estimate of natural variability).
And the Box-Behnken estimates interactions
and their importance!
[Figure: the 15-run Box-Behnken design plotted on the
Cooling Temp, Air Temp, and Metal Temp axes]
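For three factors, the Box-Behnken construction is simple enough to sketch: for each pair of factors, run all four ±1 combinations with the third factor held at its center (12 runs), then add center points. A minimal sketch in coded units:

```python
from itertools import combinations, product

def box_behnken_3(n_center=3):
    """3-factor Box-Behnken design in coded units: +/-1 on each pair of
    factors with the remaining factor held at its center (0)."""
    runs = []
    for pair in combinations(range(3), 2):          # (0,1), (0,2), (1,2)
        for levels in product((-1, 1), repeat=2):   # the 4 corner settings
            run = [0, 0, 0]
            run[pair[0]], run[pair[1]] = levels
            runs.append(tuple(run))
    # Center points give rotatability and a pure-error (noise) estimate
    runs += [(0, 0, 0)] * n_center
    return runs

design = box_behnken_3()
print(len(design))  # → 15, matching the 15 OFAT runs it replaces
```

Mapping coded units back to the slide's factors (e.g. Cooling: 0 → 800 °F, ±1 → ±200 °F) recovers the actual run sheet.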
19. OFAT vs DOE – Round 2
DOE's "hidden replication" beats OFAT.
[Figure: the same axes as before (Cooling Temp 600–1000 °F,
Air Temp 2300–2700 °F, Metal Temp). In the OFAT study each
comparison rests on 1 point per factor setting; in the
designed experiment each effect estimate averages 4 points.]
20. DOE Review
Planning Carefully
• DOE provides a useful framework for applied
experimentation. There is no magic involved,
however, and one of its advantages is that it
forces some rigorous thinking before an
experiment is started.
21. DOE Review
Planning Carefully
1. What do we want to have accomplished when the experiment is
finished?
2. What responses are in my job objectives?
3. How well can we measure these responses?
4. What factors are likely to cause these responses to vary?
5. How many factors can I reasonably investigate in a single
experiment?
6. Over what range should they be varied?
22. Review
Planning Carefully
How will I manage the noise?
1. Consider making the noise factor into an experimental factor
for study.
2. Hold the noise factor as constant as possible during the
experiment.
3. Randomize the experiment.
4. Measure the noise factor levels for future analysis
(covariates).
5. Ignore it.
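Strategy 3 (randomize the experiment) is the one that costs almost nothing to apply. Shuffling the planned run order breaks any link between slowly drifting noise (warm-up, tool wear, ambient conditions) and the factor settings. A minimal sketch; the function name and the example 2^2 run list are my own:

```python
import random

def randomized_order(runs, seed=None):
    """Return a shuffled copy of the run list so that time-ordered noise
    is not confounded with the factor settings."""
    order = list(runs)
    random.Random(seed).shuffle(order)
    return order

planned = [(-1, -1), (1, -1), (-1, 1), (1, 1)]  # a 2^2 factorial, coded units
print(randomized_order(planned, seed=1))
```

Passing a seed makes the order reproducible for the lab notebook; omitting it gives a fresh randomization each time.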
23. Review
Noise Management
• Noise makes the effects of controllable factors more
difficult to see.
• If it happens that noise variables interact with controllable
variables, the conclusions we draw from an experiment will
only be valid at the noise variable settings experienced
during the experiment.
• As a result, the experiment may not repeat at a later time;
i.e., there may be an indication that noise variables are
present with larger effects than the controllable variables!
25. Can DOE solve my problem?
Responses:
1. Performance
2. $
3. Safety

Model Assumptions
The model: Y = b1X1 + b2X2 + b12X1X2 + Noise
1. Each factor can be varied independently of the others.
2. All points in the experimental region are feasible.
(There's a difference between "infeasible" experimental runs and those
that are merely expected to give "bad" results.)
3. Factors are continuous (so a center point makes some sense).
4. The experimental noise is consistent across the design space.
5. All factors can be run across the other factors.
6. Runs can be done in a random order.
7. There is no curvature in the model.
8. There are no CONSTRAINTS on resources.
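Under these assumptions, the model Y = b1X1 + b2X2 + b12X1X2 + Noise is fitted directly from a 2^2 factorial: because the coded ±1 columns are mutually orthogonal, each coefficient is just a contrast average. A minimal sketch with made-up response values:

```python
# 2^2 factorial in coded units, with hypothetical responses
runs = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
y    = [10.0, 14.0, 12.0, 20.0]   # illustrative data, not from the slides

def coef(column, y):
    """Least-squares coefficient for an orthogonal +/-1 column."""
    return sum(c * yi for c, yi in zip(column, y)) / len(y)

x1  = [r[0] for r in runs]
x2  = [r[1] for r in runs]
x12 = [a * b for a, b in zip(x1, x2)]  # the interaction column a factorial CAN estimate

b1, b2, b12 = coef(x1, y), coef(x2, y), coef(x12, y)
print(b1, b2, b12)  # → 3.0 2.0 1.0
```

Note that the interaction column x12 is a genuine ±1 contrast here, unlike in the OFAT plan where it was identically zero.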
26. A DOE Path Forward

#  Assumption                                                     If no, then what?
1  Each factor can be varied independently of the others.         Mixture or D-Optimal
2  All points in the experimental region are feasible.            D-Optimal
3  Factors are continuous (so a center point makes some sense).   Categorical Factorials
4  The experimental noise is consistent across the design space.  Blocking
5  All factors can be run across the other factors.               Nested Designs
6  Runs can be done in a random order.                            Split Plots
7  There is no curvature in the model.                            RSM (Central Composite Designs)
8  There are no CONSTRAINTS on resources.                         Fractional Factorials
28. But you could start by doing 8 experiments with these 7
variables to find which are MOST important
Taguchi L8 orthogonal array (or Plackett-Burman
8-run screening design)

Trial        Variable
 No.    A  B  C  D  E  F  G
  1     1  1  1  1  1  1  1
  2     1  1  1  2  2  2  2
  3     1  2  2  1  1  2  2
  4     1  2  2  2  2  1  1
  5     2  1  2  1  2  1  2
  6     2  1  2  2  1  2  1
  7     2  2  1  1  2  2  1
  8     2  2  1  2  1  1  2
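What makes this array "orthogonal" is checkable: for every pair of columns, each of the four level combinations (1,1), (1,2), (2,1), (2,2) appears equally often, so each factor's effect is estimated free of the others. A short verification of the table above:

```python
from itertools import combinations

# The 8-run, 7-factor two-level screening array from the slide
L8 = [
    (1, 1, 1, 1, 1, 1, 1),
    (1, 1, 1, 2, 2, 2, 2),
    (1, 2, 2, 1, 1, 2, 2),
    (1, 2, 2, 2, 2, 1, 1),
    (2, 1, 2, 1, 2, 1, 2),
    (2, 1, 2, 2, 1, 2, 1),
    (2, 2, 1, 1, 2, 2, 1),
    (2, 2, 1, 2, 1, 1, 2),
]

def is_orthogonal(array):
    """True if every pair of columns shows each level pair equally often."""
    cols = list(zip(*array))
    for c1, c2 in combinations(cols, 2):
        counts = {}
        for pair in zip(c1, c2):
            counts[pair] = counts.get(pair, 0) + 1
        if set(counts.values()) != {len(array) // 4}:
            return False
    return True

print(is_orthogonal(L8))  # → True
```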
29. Before we proceed, one more "truism:"
After completing an experiment and analyzing
the significant factors, remember to recognize
the difference between practical and
statistical significance.

                                 Practical Importance
                              Yes                No
Statistical       Yes    Move forward      Stop (document
Significance                               lessons learned)
                  No     Take more data?   Stop

(Why not think about this at the beginning?)
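The 2x2 grid above is effectively a decision rule, which can be written down directly. A minimal sketch; the function name and return strings are my own paraphrase of the grid's cells:

```python
def next_step(statistically_significant, practically_important):
    """Encode the practical-vs-statistical significance decision grid."""
    if statistically_significant and practically_important:
        return "move forward"
    if statistically_significant and not practically_important:
        return "stop; document lessons learned"
    if practically_important:
        # important but not (yet) significant: evidence may be too thin
        return "take more data? (why not think about this at the beginning?)"
    return "stop"

print(next_step(True, True))  # → move forward
```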
30. Summary
• What is it?
A designed experiment is a test or series of tests in which purposeful changes are
made to the input variables of a process or system so that we may observe and
identify the reasons for changes in the output response…
• Why do I need it?
Efficiency with hidden replication and interaction detection.
• How do I do it?
Multi-factor, sequential experimentation.
• Can it solve my problem?
Yes, very often!