Measurement System Analysis is the first step of the Measure Phase of an improvement project. Before you can pass judgment on the process, you need to ensure that your measurement system is accurate, precise, capable and in control.
This presentation covers basic statistical techniques and how statistics can be used effectively in quality control and process control. It also introduces the statistical package Minitab (version 16) and some of its applications in the field of statistical process control.
It gives a detailed illustration of MSA procedures for both variable and attribute data, analysis of results, and complete guidance for planning and implementing an MSA.
It is an illustration of Measurement System Analysis (MSA) as a path to excellence in dimensional integrity: a complete journey through the process, with explanations for implementation.
Dear All, I have prepared this presentation to give a better understanding of Statistical Process Control (SPC). It is an informative presentation covering the history of SPC, the basics of SPC, the PDCA approach, the benefits of SPC, and the application of the 7 QC tools for problem-solving. You can follow these techniques in your day-to-day business to solve problems. Thank you.
This presentation is about Measurement System Analysis. It is very useful for people working in industry. It also covers the Six Sigma approach to effective measurement; repeatability and reproducibility are well explained.
A Measurement System Analysis (MSA) course is essential for successful Six Sigma DMAIC and DFSS projects. It is also key to the implementation of SQC and efficient process management.
Reliable measurement processes are critical to the success of any effort that depends on measurement data and process analysis, including Six Sigma DMAIC improvement projects, DFSS projects, SPC, SQC, supplier quality, and business process management and continuous improvement. Without validation that measurements are accurate, repeatable across multiple measurements by the same person, and reproducible from person to person (gage Repeatability and Reproducibility, or gage R&R), all conclusions are suspect, and process management is therefore fragile and ineffective.
Organizations typically focus on measurement accuracy and calibration, but this course also emphasizes the essential elements of reliable measurement procedures.
Statistical process control (SPC) is a method of quality control which uses statistical methods. SPC is applied in order to monitor and control a process. Monitoring and controlling the process ensures that it operates at its full potential. At its full potential, the process can make as much conforming product as possible with a minimum (if not an elimination) of waste (rework or scrap). SPC can be applied to any process where the "conforming product" (product meeting specifications) output can be measured. Key tools used in SPC include control charts; a focus on continuous improvement; and the design of experiments. An example of a process where SPC is applied is manufacturing lines.
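As a minimal sketch of the control-chart idea described above (the subgroup data and subgroup size are assumed for illustration), X-bar chart control limits can be computed from rational subgroups using the standard A2 constant:

```python
# Minimal SPC sketch (hypothetical data): compute X-bar chart control limits
# from rational subgroups using the A2 constant (0.577 for subgroup size 5).
def xbar_chart_limits(subgroups, a2=0.577):
    xbars = [sum(s) / len(s) for s in subgroups]   # subgroup means
    ranges = [max(s) - min(s) for s in subgroups]  # subgroup ranges
    grand_mean = sum(xbars) / len(xbars)           # chart center line
    mean_range = sum(ranges) / len(ranges)
    ucl = grand_mean + a2 * mean_range             # upper control limit
    lcl = grand_mean - a2 * mean_range             # lower control limit
    return lcl, grand_mean, ucl
```

With real data, points falling outside these limits signal special-cause variation worth investigating.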
Measurement Systems Analysis – Variable Gage R&R Study Metrics and Applications (Gabor Szabo, CQE)
This presentation walks you through the components of variation and the various metrics used in Variable Gage R&R Study. It also talks about the different root causes associated with a failing study, and how to perform root cause analysis using statistical tools.
Quality Journey – Measurement System Analysis (NileshJajoo2)
A measurement systems analysis (MSA) is a thorough assessment of a measurement process, and typically includes a specially designed experiment that seeks to identify the components of variation in that measurement process.
If there are errors in our measurement system we will be making decisions based on incorrect data. We could be making incorrect decisions or producing non-conforming parts.
A properly planned and executed Measurement System Analysis (MSA) can help build a strong foundation for any data based decision making process.
A measurement systems analysis considers the following:
Selecting the correct measurement and approach
Assessing the measuring device
Assessing procedures and operators
Assessing any measurement interactions
Calculating the measurement uncertainty of individual measurement devices and/or measurement systems
Common tools and techniques of measurement systems analysis include: calibration, Gage R&R, ATA/RTR, and ANOVA.
Calibration Requirement
Alignment
ATA – Audit the Auditor (RTR – Review the Reviewer): the name itself describes the check process for the auditor.
1. The QL (Quality Lead) re-audits the transactions audited by the Quality Specialist within 48 hrs of monitoring.
2. The QL tracks and publishes the variance report.
3. Both overall and parameter-wise variance are calculated.
4. Any variance >5% (as defined in the process) should be documented, and another call should be re-audited within 72 hrs.
5. This should be a weekly activity for TLs, captured in a tracker.
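The QL vs. Quality Specialist variance check above can be sketched as follows. The parameter names, score scale, and function shape are assumptions for illustration; the 5% threshold comes from the process definition above:

```python
# Hypothetical sketch of the ATA/RTR variance check: compare the Quality
# Specialist's parameter scores against the Quality Lead's re-audit scores,
# report parameter-wise variance, and flag anything above the 5% threshold.
def audit_variance(qs_scores, ql_scores, threshold=5.0):
    report = {}
    for param in qs_scores:
        variance = abs(qs_scores[param] - ql_scores[param])
        # variance strictly greater than the threshold triggers a re-audit
        report[param] = {"variance_pct": variance, "reaudit": variance > threshold}
    overall = sum(r["variance_pct"] for r in report.values()) / len(report)
    return overall, report
```

The overall figure feeds the published variance report; the per-parameter flags drive the 72-hour re-audit rule.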
Gage repeatability and reproducibility (GR&R) is defined as the process used to evaluate a gauging instrument's precision by confirming that its measurements are repeatable and reproducible. The process includes taking a series of measurements to certify that the output is the same value as the input, and that the same measurements are obtained under the same operating conditions over a set duration.
Standard GR&R
Expanded GR&R
Actual Process Variation
Measurement Variation
Accuracy and Precision
Accuracy: the closeness of the average of measurements to the true value.
Precision: the closeness of repeated measurements to one another.
Linearity: the change in bias over the normal operating range of the gage.
Stability: the change in bias over time.
Gage R & R for Continuous Data
X Bar R Method
Typically used in the automobile industry
Extreme values affect the method
Short & Long Method
Short Method does not measure operator and equipment variability separately
Long method measures operator and equipment variability separately
ANOVA Method
Measures operator & equipment variability separately as well as combined effect of operator & parts
More effective when extreme values are present
Most tedious to perform with manual calculations
Analyzing Gage R&R Results
R&R less than 10% – measurement system acceptable.
R&R 10% to 30% – may be acceptable; decide based on the classification of the characteristic, the application, customer input, etc.
R&R over 30% – not acceptable; find the problem, revisit the Fishbone Diagram, and remove root causes.
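Minitab reports %R&R directly; as a small illustrative sketch, the acceptance rule above maps onto a simple classifier:

```python
# Sketch of the Gage R&R acceptance rule: classify a measurement system by
# its %R&R (gage variation as a percentage of total variation).
def classify_grr(pct_rr):
    if pct_rr < 10:
        return "acceptable"
    elif pct_rr <= 30:
        # judge by characteristic classification, application, customer input
        return "conditionally acceptable"
    else:
        # investigate root causes (e.g. via a Fishbone Diagram)
        return "not acceptable"
```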
Bias
Stability
Linearity
Repeatability
Accuracy: the average of multiple measurements of an event is equal to the true value.
Precision: there is little variation in repeated measurements of the same event.
1. Operational Excellence
Measurement System Analysis
Operational Excellence
Introduction
1/28/2017 Ronald Morgan Shewchuk 1
• Measurement System Analysis (MSA) is the first step of the Measure phase along the DMAIC pathway to improvement.
• You will be basing the success of your improvement project on key performance indicators that are tied to your measurement system.
• Consequently, before you begin tracking metrics you will need to complete an MSA to validate the measurement system.
• A comprehensive MSA typically consists of six parts: Instrument Detection Limit, Method Detection Limit, Accuracy, Linearity, Gage R&R and Long-Term Stability.
• If you want to expand measurement capacity or qualify another instrument you must extend the MSA to include Metrology Correlation and Matching.
• A poor measurement system can make data meaningless and process
improvement impossible.
• Large measurement error will prevent assessment of process stability and
capability, confound Root Cause Analysis and hamper continuous improvement
activities in manufacturing operations.
• Measurement error has a direct impact on assessing the stability and capability of
a process.
• Poor metrology can make a stable process appear unstable and make a capable
process appear incapable.
• Measurement System Analysis quantifies the effect of measurement error on the
total variation of a unit operation.
• The sources of this variation may be visualized as in Figure 7.1 and the elements of
a measurement system as in Figure 7.2.
Observed Process Variation breaks down as follows:
• Actual Process Variation
– Long-term Process Variation
– Short-term Process Variation
– Variation within Sample
• Measurement Variation
– Variation due to Gage: Repeatability, Calibration, Stability, Linearity
– Variation due to Operators
Figure 7.1 Sources of Variation
Figure 7.2 Measurement System Elements
• Equipment: Hardware, Software, Setup
• Environment: Cleanliness, Humidity & Temperature, Vibration, Lighting, Power Source
• Procedures: Calibration Frequency, Calibration Technique, Sample Preparation, Operator Procedure, Data Entry, Calculations
Together these elements determine measurement system Performance.
• Operators are often skeptical of measurement systems, especially those that
provide them with false feedback causing them to “over-steer” their process.
• This skepticism is well founded since many measurement systems are not capable
of accurately or precisely measuring the process.
• Accuracy refers to the average of individual measurements compared with the
known, true value.
• Precision refers to the grouping of the individual measurements - the tighter the
grouping, the higher the precision.
• The bull’s eye targets of Figure 7.3 best illustrate the difference between accuracy
and precision.
Figure 7.3 Accuracy vs Precision – The Center of the Target is the Objective
Good Accuracy / Bad Precision
Bad Accuracy / Good Precision
Good Accuracy / Good Precision
Bad Accuracy / Bad Precision
• Accuracy is influenced by resolution, bias, linearity and stability whereas precision
is influenced by repeatability and reproducibility of the measurement system.
• Repeatability is the variation which occurs when the same operator repeatedly
measures the same sample on the same instrument under the same conditions.
• Reproducibility is the variation which occurs between two or more instruments or
operators measuring the same sample with the same measurement method in a
stable environment.
• The total variance in a quality characteristic of a process is described by Eqn 7.1
and Eqn 7.2.
• The percent contribution of the measurement system to the total variance may be
calculated from Eqn 7.3.
• We want to be able to measure true variations in product quality and not variations in the measurement system, so it is desired to minimize σ²measurement.
• We will review the steps in a typical measurement system analysis by way of
example, first for the case of variables data and then for the case of attribute data.
σ²total = σ²product + σ²measurement (Eqn 7.1)
where σ²total = total variance
σ²product = variance due to product
σ²measurement = variance due to measurement system

σ²measurement = σ²repeatability + σ²reproducibility (Eqn 7.2)
where σ²repeatability = variance within operator/device combination
σ²reproducibility = variance between operators

% Contribution = (σ²repeatability + σ²reproducibility) / σ²total × 100 (Eqn 7.3)
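Eqn 7.1 through Eqn 7.3 can be checked numerically; the variance values in the example below are assumed for illustration:

```python
# Numerical sketch of Eqn 7.1-7.3: combine the variance components and
# compute the measurement system's percent contribution to total variance.
def msa_percent_contribution(var_repeatability, var_reproducibility, var_product):
    var_measurement = var_repeatability + var_reproducibility  # Eqn 7.2
    var_total = var_product + var_measurement                  # Eqn 7.1
    return 100.0 * var_measurement / var_total                 # Eqn 7.3
```

For example, with σ²repeatability = σ²reproducibility = 1 and σ²product = 8, the measurement system contributes 20% of the total variance.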
Instrument Detection Limit (IDL)
• Today’s measurement devices are an order of magnitude more complex than the
“gages” for which the Automotive Industry Action Group (AIAG) first developed
Gage Repeatability and Reproducibility (Gage R&R) studies.
• Typically they are electromechanical devices with internal microprocessors having
inherent signal to noise ratios.
• The Instrument Detection Limit (IDL) should be calculated from the baseline noise
of the instrument.
• Let us examine the case where a gas chromatograph (GC) is being used to measure
the concentration of some analyte of interest. Refer to Figure 7.4.
Figure 7.4 Gas Chromatogram
• The chromatogram has a baseline with peaks at different column retention times
for hydrogen, argon, oxygen, nitrogen, methane and carbon monoxide.
• Let’s say we wanted to calculate the IDL for nitrogen at retention time 5.2 min.
• We would purge and evacuate the column to make sure it is clean then
successively inject seven blanks of the carrier gas (helium).
• The baseline noise peak at retention time 5.2 min is integrated for each of the
blank injections and converted to concentration units of Nitrogen.
• The standard deviation of these concentrations is multiplied by the Student’s t
statistic for n-1 degrees of freedom at a 99% confidence interval (3.143) to
calculate the IDL.
• This is the EPA protocol as defined in 40 CFR Part 136: Guidelines Establishing Test
Procedures for the Analysis of Pollutants, Appendix B.
• Refer to Figure 7.5 below for the calculation summary.
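The IDL calculation described above (standard deviation of seven blank injections, multiplied by the Student's t statistic of 3.143 for n-1 = 6 degrees of freedom at 99% confidence) can be sketched as follows; any blank concentrations passed in are hypothetical:

```python
# Sketch of the EPA-style IDL calculation (40 CFR Part 136, Appendix B):
# IDL = t * s, where s is the sample standard deviation of the blank
# injections and t = 3.143 for seven blanks (6 df, 99% confidence).
import statistics

def instrument_detection_limit(blank_concs, t_stat=3.143):
    return t_stat * statistics.stdev(blank_concs)  # stdev uses n-1 df
```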
Method Detection Limit (MDL)
• Method detection limit (MDL) is defined as the minimum concentration of a
substance that can be measured and reported with 99% confidence that the
analyte concentration is greater than zero as determined from analysis of a sample
in a given matrix containing the analyte.
• MDL is calculated in a similar way to IDL with the exception that the same sample
is measured on the instrument with n=7 trials and the sample is disconnected and
reconnected to the measurement apparatus between trials.
• This is called dynamic repeatability analysis.
• An estimate is made of the MDL and a sample prepared at or near this MDL
concentration.
• The seven trials are then measured on the instrument and the MDL calculated as
in Figure 7.6.
• MDL divided by the mean of the seven trials should be within 10-100%.
• If this is not the case, repeat the MDL analysis with a starting sample concentration
closer to the calculated MDL.
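The MDL procedure above differs from the IDL only in using seven replicate measurements of a real sample (disconnected and reconnected between trials), followed by the 10-100% ratio check. A sketch, with hypothetical trial values:

```python
# Sketch of the MDL calculation: MDL = t * s over seven dynamic-repeatability
# trials, then verify that MDL / mean is within 10-100% so the sample
# concentration was close enough to the true detection limit.
import statistics

def method_detection_limit(trials, t_stat=3.143):
    mdl = t_stat * statistics.stdev(trials)
    ratio = mdl / statistics.mean(trials)
    # False means the analysis should be repeated with a sample
    # concentration closer to the calculated MDL
    return mdl, 0.10 <= ratio <= 1.00
```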
Measurement System Analysis – Variables Data
• A properly conducted measurement system analysis (MSA) can yield a treasure
trove of information about your measurement system.
• Repeatability, reproducibility, resolution, bias, and precision to tolerance ratio are
all deliverables of the MSA and can be used to identify areas for improvement in
your measurement system.
• It is important to conduct the MSA in the current state since this is your present
feedback mechanism for your process.
• Resist the temptation to dust off the Standard Operating Procedure and brief the
operators on the correct way to measure the parts.
• Resist the temptation to replace the NIST1-traceable standard, which looks like it has been kicked around the metrology laboratory a few times.
1 National Institute of Standards and Technology
• To prepare for an MSA you must collect samples from the process that span the
specification range of the measurement in question.
• Include out-of-spec high samples and out-of-spec low samples.
• Avoid creating samples artificially in the laboratory.
• There may be complicating factors in the commercial process which influence your
measurement system.
• Include all Operators in the MSA who routinely measure the product.
• The number of samples times the number of Operators should be greater than or
equal to fifteen, with three trials for each sample.
• If this is not practical, increase the number of trials as per Figure 7.7.
• Code the samples such that the coding gives no indication of the expected measurement value – this is called blind sample coding.
• Have each sample measured by an outside laboratory.
• These measurements will serve as your reference values.
• Ask each Operator to measure each sample three times in random sequence.
• Ensure that the Operators do not “compare notes”.
• We will utilize Minitab to analyze the measurement system described in Case Study III.
Samples × Operators      Trials
S × O ≥ 15               3
8 ≤ S × O < 15           4
5 ≤ S × O < 8            5
S × O < 5                6
Figure 7.7 Measurement System Analysis Design
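The design table in Figure 7.7 can be expressed as a small lookup; this is purely an illustrative sketch of the rule, not something Minitab requires:

```python
# Figure 7.7 as a lookup: trials required for a given samples x operators
# product (S x O). Fewer sample/operator combinations demand more trials.
def required_trials(samples, operators):
    sxo = samples * operators
    if sxo >= 15:
        return 3
    elif sxo >= 8:
        return 4
    elif sxo >= 5:
        return 5
    else:
        return 6
```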
Case Study III: Minnesota Polymer Co.
Minnesota Polymer Co. supplies a special grade of resin to ABC Molding Co. which includes a silica modifier to improve
dimensional stability. The product code is POMBLK-15 and the silica concentration specification by weight is 15 ± 2%. Silica
concentration is determined by taking a sample of the powdered resin and pressing it into a 4 cm disk using a 25-ton hydraulic
press. The sample disk is then analyzed by x-ray fluorescence energy dispersive spectroscopy (XRF-EDS) to measure the silica
content. Manufacturing POMBLK-15 is difficult. The silica is light and fluffy and sometimes gets stuck in the auger used to feed
the mixing tank. A new process engineer, Penelope Banks, has been hired by Minnesota Polymer. One of her first assignments
is to improve POMBLK-15 process control. SPC analysis of historical batch silica concentration results have indicated out-of-
control symptoms and poor Cpk. Before Penny makes any changes to the process she prudently decides to conduct a
measurement system analysis to find out the contribution of the measurement system to the process variation.
Minnesota Polymer is a firm believer in process ownership. The same operator who charges the raw materials, runs the
manufacturing process, collects the quality control sample, presses the sample disk and then runs the silica analysis on the XRF-
EDS instrument. The operator uses the silica concentration analysis results to adjust the silica charge on the succeeding batch.
POMBLK-15 is typically run over a five-day period in the three-shift, 24/7 operation.
Penny has collected five powder samples from POMBLK-15 process retains which span the silica specification range and included
two out-of-specification samples pulled from quarantine lots. She has asked each of the three shift operators to randomly
analyze three samples from each powder bag for silica content according to her sampling plan. Penny has sent a portion of each
sample powder to the Company’s R&D Headquarters in Hong Kong for silica analysis. These results will serve as reference
values for each sample. The following table summarizes the silica concentration measurements and Figure 7.8 captures the
screen shots of the MSA steps for Case Study III.
Bag #  Reference1  Operator 1 (Trials 1-3)  Operator 2 (Trials 1-3)  Operator 3 (Trials 1-3)
1      17.3        18.2  17.9  18.2         18.1  18.0  18.0         17.8  17.8  18.2
2      14.0        14.4  14.9  14.8         14.8  14.6  14.8         14.4  14.4  14.5
3      13.3        14.0  13.9  13.8         13.9  14.2  14.0         13.8  13.7  13.8
4      16.7        17.2  17.2  17.4         17.4  17.3  17.5         17.4  17.5  17.5
5      12.0        12.9  12.8  12.5         12.5  12.9  12.8         12.9  12.5  12.6
1 As reported by Hong Kong R&D Center
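A crossed Gage R&R analysis expects this data in long format, one row per part/operator/trial. As an illustrative sketch (the nested-list layout mirrors the table above; variable names are assumptions), the table can be reshaped in Python:

```python
# Reshape the Case Study III silica measurements into long format
# (bag, operator, trial, measurement): data[bag] = [op1, op2, op3 trials].
data = {
    1: [[18.2, 17.9, 18.2], [18.1, 18.0, 18.0], [17.8, 17.8, 18.2]],
    2: [[14.4, 14.9, 14.8], [14.8, 14.6, 14.8], [14.4, 14.4, 14.5]],
    3: [[14.0, 13.9, 13.8], [13.9, 14.2, 14.0], [13.8, 13.7, 13.8]],
    4: [[17.2, 17.2, 17.4], [17.4, 17.3, 17.5], [17.4, 17.5, 17.5]],
    5: [[12.9, 12.8, 12.5], [12.5, 12.9, 12.8], [12.9, 12.5, 12.6]],
}

long_rows = [
    (bag, op + 1, trial + 1, value)
    for bag, ops in data.items()
    for op, trials in enumerate(ops)
    for trial, value in enumerate(trials)
]
```

The 45 rows (5 bags × 3 operators × 3 trials) correspond to the worksheet Minitab builds in the steps that follow.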
Figure 7.8 Measurement System Analysis Steps – Variable Data
Open a new worksheet. Click on Stat > Quality Tools > Gage Study > Create Gage R&R Study Worksheet on the top menu.
Enter the Number of Operators, the Number of Replicates and the Number of Parts in the dialogue box. Click OK.
The worksheet is modified to include a randomized run order of the samples.
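For readers curious about what this worksheet step produces, the randomized run order can be sketched in a few lines of Python. This is an illustrative sketch only, not Minitab's actual algorithm (Minitab can, for example, randomize within operator); the function name and column layout are our own.

```python
import itertools
import random

def gage_rr_worksheet(n_operators, n_replicates, n_parts, seed=None):
    """Build a Gage R&R worksheet: every operator measures every part
    once per replicate, with the overall run sequence randomized."""
    runs = [(op, part) for op, part, _rep in itertools.product(
        range(1, n_operators + 1),
        range(1, n_parts + 1),
        range(1, n_replicates + 1))]
    random.Random(seed).shuffle(runs)
    # Columns mirror the worksheet: RunOrder, Parts, Operators
    return [(i + 1, part, op) for i, (op, part) in enumerate(runs)]

# Case Study III dimensions: 3 operators, 3 replicates, 5 parts
worksheet = gage_rr_worksheet(n_operators=3, n_replicates=3, n_parts=5)
print(len(worksheet))  # 45
```

Each operator-part combination appears once per replicate, so the 3 × 5 × 3 design yields 45 runs.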
Name the adjoining column Silica Conc and transcribe the random sample measurement data to the relevant cells in the worksheet.
Click on Stat → Quality Tools → Gage Study → Gage R&R Study (Crossed) on the top menu.
Select C2 Parts for Part numbers, C3 Operators for Operators and C4 Silica Conc for Measurement data in the dialogue box. Click the radio button for ANOVA under Method of Analysis. Click Options.
Six (6) standard deviations will account for 99.73% of the Measurement System variation. Enter Lower Spec Limit
and Upper Spec Limit in the dialogue box. Click OK. Click OK.
A new graph is created in the Minitab project file with the Gage R&R analysis results.
Return to the session by clicking on Window → Session on the top menu to view the ANOVA analytical results.
• Let us more closely examine the graphical output of the Gage R&R (ANOVA) Report
for Silica Conc.
• Figure 7.9 shows the components of variation.
• A good measurement system will have the lion’s share of variation coming from
the product, not the measurement system.
• Consequently, we would like the bars for repeatability and reproducibility to be
small relative to part-to-part variation.
Figure 7.9 MSA Components of Variation
• Figure 7.10 captures the range SPC chart by Operators.
• The range chart should be in control.
• If it is not, a repeatability problem is present.
Figure 7.10 MSA Range Chart by Operators
• By contrast, the X-bar SPC chart of Figure 7.11 should be out of control.
• This seems counterintuitive, but it is a healthy indication that the variability present is due to part-to-part differences rather than Operator-to-Operator differences.
Figure 7.11 MSA X-bar Chart by Operators
• Figure 7.12 is an individual value plot of silica concentration by sample number.
• The circles with a cross indicate the mean of the sample data and the solid circles
are individual data points.
• We want a tight grouping around the mean for each sample and we want
significant variation between the means of different samples.
• If we do not have variation between samples the MSA has been poorly designed
and we essentially have five samples of the same thing.
• This will preclude analysis of the measurement system.
Figure 7.12 MSA Silica Concentration by Sample Number
• Figure 7.13 is a boxplot of silica concentration by Operator.
• As in Figure 7.12 the circles with a cross indicate the mean concentration for all
samples by Operator.
• The shaded boxes represent the interquartile range (Q3-Q1) for each Operator.
• The interquartile range (IQR) is the preferred measure of spread for data sets
which are not normally distributed.
• The solid line within the IQR is the median silica concentration of all samples by
Operator.
• If Operators are performing the same, we would expect similar means, medians
and IQRs.
Figure 7.13 MSA Silica Concentration by Operator
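The boxplot statistics are easy to verify by hand. Below is a minimal sketch using Operator 1's fifteen readings from the Case Study III table; note that quantile conventions differ between packages, so Q1 and Q3 may not match Minitab's boxplot hinges exactly.

```python
from statistics import quantiles

# Operator 1's fifteen silica readings from Case Study III (5 bags x 3 trials)
op1 = [18.2, 17.9, 18.2, 14.4, 14.9, 14.8, 14.0, 13.9, 13.8,
       17.2, 17.2, 17.4, 12.9, 12.8, 12.5]

# quantiles() with n=4 returns Q1, median (Q2), Q3 ('exclusive' method)
q1, med, q3 = quantiles(op1, n=4)
iqr = q3 - q1  # interquartile range, the shaded box in the boxplot
print(med, round(iqr, 1))  # 14.8 3.6
```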
• Figure 7.14 is an individual value plot used to check for Operator-Sample
interactions.
• The lines for each Operator should be reasonably parallel to each other.
• Crossing lines indicate the presence of Operator-Sample interactions.
• This can happen when Operators are struggling with samples at or near the MDL
or if the instrument signal to noise ratio varies as a function of concentration.
Figure 7.14 MSA Sample by Operator Interaction
• Let us now focus on the analytical output of the session window as captured in
Figure 7.8.
• Lovers of Gage R&Rs will typically look for four metrics as defined below and
expect these metrics to be within the acceptable or excellent ranges specified by
Gage R&R Metric Rules of Thumb as shown in Figure 7.15.
Eqn 7.4   % Contribution = (σ²_measurement / σ²_total) × 100
Eqn 7.5   % Study Variation = (σ_measurement / σ_total) × 100
Eqn 7.6   Two-Sided Spec % P/T = [6σ_measurement / (USL − LSL)] × 100
Eqn 7.7   One-Sided Spec % P/T = (3σ_measurement / TOL) × 100
Eqn 7.8   Number of Distinct Categories = trunc(1.41 × σ_total / σ_measurement)
where
σ²_total = Total Variance
σ²_measurement = Variance due to Measurement System
σ_total = Total Standard Deviation
σ_measurement = Standard Deviation due to Measurement System
P/T = Precision to Tolerance Ratio
USL = Upper Spec Limit
LSL = Lower Spec Limit
TOL = Process Mean − LSL (for LSL-only spec)
TOL = USL − Process Mean (for USL-only spec)
Gage R&R Metric Unacceptable Acceptable Excellent
% Contribution > 7.7% 2.0 - 7.7% < 2%
% Study Variation > 28% 14 - 28% < 14%
% P/T Ratio > 30% 8 - 30% < 8%
Number of Distinct Categories < 5 5 - 10 > 10
Figure 7.15 Gage R&R Metrics – Rules of Thumb
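The four metrics can be computed directly from the variance components. The sketch below implements Eqns 7.4-7.8 for the two-sided-spec case; the variance components passed in are made-up illustrative values, not the case-study results.

```python
import math

def gage_rr_metrics(var_measurement, var_total, usl, lsl):
    """Gage R&R summary metrics per Eqns 7.4 - 7.8 (two-sided spec)."""
    sd_meas = math.sqrt(var_measurement)
    sd_total = math.sqrt(var_total)
    return {
        "pct_contribution": var_measurement / var_total * 100,   # Eqn 7.4
        "pct_study_variation": sd_meas / sd_total * 100,         # Eqn 7.5
        "pct_pt": 6 * sd_meas / (usl - lsl) * 100,               # Eqn 7.6
        "ndc": math.trunc(1.41 * sd_total / sd_meas),            # Eqn 7.8
    }

# Illustrative variance components (made-up, not the case-study values)
m = gage_rr_metrics(var_measurement=0.01, var_total=1.0, usl=16.0, lsl=14.0)
# pct_contribution = 1.0 (excellent), pct_study_variation = 10.0 (excellent),
# pct_pt = 30.0 (borderline acceptable), ndc = 14 (excellent)
```

Comparing each value against the Figure 7.15 rules of thumb gives an immediate verdict on the measurement system.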
• The highlighted output of the Minitab session window indicates a % Contribution
of the measurement system of 0.55%.
• This is in the excellent region.
• % Study Variation is 7.39% which is also in the excellent region.
• Precision to Tolerance ratio is 25.37%.
• This is in the acceptable region.
• Number of distinct categories is 19, well within the excellent region.
• Overall, this is a good measurement system.
• Now, let us proceed to check for linearity and bias by adding the reference
concentrations as measured by the Hong Kong R&D Center for each of the samples
to the worksheet.
• Figure 7.16 captures the screen shots necessary for this process.
Figure 7.16 Gage Linearity and Bias Study Steps – Variable Data
Return to the active worksheet by clicking on Window → Worksheet 1 *** on the top menu. Name the adjoining column Reference Conc and enter the reference sample concentration values corresponding to each sample (Part) number.
Click on Stat → Quality Tools → Gage Study → Gage Linearity and Bias Study on the top menu.
Select C2 Parts for Part numbers, C5 Reference Conc for Reference values and C4 Silica Conc for Measurement data in the dialogue box.
Click OK.
A new graph is created in the Minitab project file with the Gage Linearity and Bias Study results.
• We can see there is a bias between the Hong Kong measurement system and
Minnesota Polymer’s measurement system.
• The bias is relatively constant over the silica concentration range of interest as
indicated by the regression line.
• The Minnesota Polymer measurement system is reading approximately 0.67 wt %
Silica higher than Hong Kong.
• This is not saying that the Hong Kong instrument is right and the Minnesota
Polymer instrument is wrong.
• It is merely saying that there is a difference between the two instruments which
must be investigated.
• This difference could have process capability implications if it is validated.
• Minnesota Polymer may be operating in the top half of the allowable spec range.
• The logical next step is for the Hong Kong R&D center to conduct an MSA of similar
design, ideally with the same sample set utilized by Minnesota Polymer.
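The roughly 0.67 wt % bias can be verified by averaging each bag's nine measurements against its Hong Kong reference value. A minimal sketch using the Case Study III data (simple averaging, not Minitab's regression-based bias study):

```python
from statistics import mean

# (Hong Kong reference, nine site measurements) per bag, Case Study III
bags = [
    (17.3, [18.2, 17.9, 18.2, 18.1, 18.0, 18.0, 17.8, 17.8, 18.2]),
    (14.0, [14.4, 14.9, 14.8, 14.8, 14.6, 14.8, 14.4, 14.4, 14.5]),
    (13.3, [14.0, 13.9, 13.8, 13.9, 14.2, 14.0, 13.8, 13.7, 13.8]),
    (16.7, [17.2, 17.2, 17.4, 17.4, 17.3, 17.5, 17.4, 17.5, 17.5]),
    (12.0, [12.9, 12.8, 12.5, 12.5, 12.9, 12.8, 12.9, 12.5, 12.6]),
]

# Bias per bag = mean(site measurements) - reference value
biases = [mean(trials) - ref for ref, trials in bags]
overall_bias = mean(biases)
print(round(overall_bias, 2))  # 0.67 wt % high relative to Hong Kong
```

Every bag reads high by a similar amount, which is what the flat regression line in the linearity plot conveys.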
Measurement System Analysis – Attribute Data
• In our next case we will analyze the measurement system used to rate customer satisfaction as described in Case Study IV below.
Case Study IV: Virtual Cable Co.
David Raffles Lee has just joined Virtual Cable Co., the leading telecommunications company in the southwest, as Chief Executive Officer. David comes to Virtual Cable with over thirty years of operations experience in the telecommunications industry in Singapore. During a tour of one of the Customer Service Centers, David noticed that the customer service agents were all encased in bulletproof glass. David queried the Customer Service Manager, Bob Londale, about this and Bob responded, "It is for the protection of our associates. Sometimes our customers become angry and they produce weapons." David was rather shocked by this and wanted to learn more about customer satisfaction at Virtual Cable. He formed a team to analyze the measurement of customer satisfaction. This team prepared ten scripts of typical customer complaints, each with an intended outcome of pass (customer was satisfied with the customer service agent's response) or fail (customer was dissatisfied with the response). Twenty "customers" were coached on the scripts, two customers per script. These customers committed the scripts to memory and presented their service issue to three different Customer Service Agents at three different Customer Service Centers. Each customer was issued an account number and profile to allow the Customer Service Agent to rate the customer's satisfaction level in the customer feedback database as required by Virtual Cable's policy. The results are summarized in the attached table and analyzed by the MSA attribute data steps of Figure 7.17.
Script #  Reference¹  Operator 1 (R1, R2)  Operator 2 (R1, R2)  Operator 3 (R1, R2)
1         F           F, F                 F, F                 F, F
2         P           P, P                 P, P                 P, P
3         P           P, P                 P, P                 P, P
4         P           P, P                 P, P                 P, P
5         F           F, F                 F, P                 F, F
6         P           P, P                 P, P                 P, P
7         F           F, F                 F, F                 F, F
8         F           F, F                 F, F                 F, F
9         P           P, F                 P, P                 F, P
10        F           F, F                 F, F                 F, F
¹ Intended outcome of script from Customer Satisfaction Team
Figure 7.17 Measurement System Analysis Steps – Attribute Data
Open a new worksheet. Click on Stat → Quality Tools → Create Attribute Agreement Analysis Worksheet on the top menu.
Enter the Number of samples, the Number of appraisers and the Number of replicates in the dialogue box. Click OK.
The worksheet is modified to include a randomized run order of the scripts (samples).
Name the adjoining columns Response and Reference. Transcribe the satisfaction level rating and the reference value of the script to
the appropriate cells.
Click on Stat → Quality Tools → Attribute Agreement Analysis on the top menu.
Select C4 Response for Attribute column, C2 Samples for Samples and C3 Appraisers for Appraisers in the dialogue box. Select C5
Reference for Known standard/attribute. Click OK.
A new graph is created in the Minitab project file with the Attribute Assessment Agreement results.
[Assessment Agreement graphs – Within Appraisers and Appraiser vs Standard panels: percent agreement by Appraiser with 95.0% CI]
Display the analytical MSA Attribute Agreement Results by clicking on Window → Session on the top menu.
• The attribute MSA results allow us to determine the percentage overall agreement,
the percentage agreement within appraisers (repeatability), the percentage
agreement between appraisers (reproducibility), the percentage agreement with
reference values (accuracy) and the Kappa Value (index used to determine how
much better the measurement system is than random chance).
• From the graphical results we can see that the Customer Service Agents were in
agreement with each other 90% of the time and were in agreement with the
expected (standard) result 90% of the time.
• From the analytical results we can see that the agreement between appraisers was
80% and the overall agreement vs the standard values was 80%.
• The Kappa Value for all appraisers vs the standard values was 0.90, indicative of
excellent agreement between the appraised values and reference values.
• Figure 7.18 provides benchmark interpretations for Kappa Values.
Figure 7.18 Rules of Thumb for Interpreting Kappa Values
Kappa Value   Interpretation
-1.0 to 0.6   Agreement no better than expected by chance
0.6 to 0.7    Marginal agreement – significant effort required to improve the measurement system
0.7 to 0.8    Good agreement – some improvement to the measurement system is warranted
0.9 to 1.0    Excellent agreement
• Another way of looking at this case is that out of the sixty rated outcomes there were only three miscalls on rating customer satisfaction by the Customer Service Agents included in this study.
• Mr. Lee can have confidence in the feedback of the Virtual Cable customer satisfaction measurement system and proceed to identify and remedy the underlying root causes of customer dissatisfaction.
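The 80% agreement versus the standard and the Kappa of 0.90 can be reproduced from the script table. The sketch below pools all sixty individual ratings for Kappa, which is one convention among several (Minitab's session output also reports Kappa per appraiser):

```python
# Reference outcome and the six agent ratings (3 agents x 2 reps) per script
scripts = {
    1:  ("F", ["F", "F", "F", "F", "F", "F"]),
    2:  ("P", ["P", "P", "P", "P", "P", "P"]),
    3:  ("P", ["P", "P", "P", "P", "P", "P"]),
    4:  ("P", ["P", "P", "P", "P", "P", "P"]),
    5:  ("F", ["F", "F", "F", "P", "F", "F"]),
    6:  ("P", ["P", "P", "P", "P", "P", "P"]),
    7:  ("F", ["F", "F", "F", "F", "F", "F"]),
    8:  ("F", ["F", "F", "F", "F", "F", "F"]),
    9:  ("P", ["P", "F", "P", "P", "F", "P"]),
    10: ("F", ["F", "F", "F", "F", "F", "F"]),
}

pairs = [(ref, r) for ref, six in scripts.values() for r in six]

# Observed agreement over all 60 individual ratings
p_o = sum(ref == r for ref, r in pairs) / len(pairs)

# Chance agreement from the marginal P/F frequencies (Cohen's kappa)
p_ref = sum(ref == "P" for ref, _ in pairs) / len(pairs)
p_rat = sum(r == "P" for _, r in pairs) / len(pairs)
p_e = p_ref * p_rat + (1 - p_ref) * (1 - p_rat)
kappa = (p_o - p_e) / (1 - p_e)

# Script-level agreement vs standard: all six ratings match the reference
pct_vs_standard = sum(all(r == ref for r in six)
                      for ref, six in scripts.values()) / len(scripts)

print(round(kappa, 2), pct_vs_standard)  # 0.9 0.8
```

Only scripts 5 and 9 contain miscalls, which is why the script-level agreement versus the standard lands at 8 out of 10.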
Improving the Measurement System
• Improvements to the measurement system should be focused on the root cause(s)
of high measurement system variation.
• If repeatability is poor, consider a more detailed repeatability study using one part
and one operator over an extended period of time.
• Ask the operator to measure this one sample twice per day for one month.
• Is the afternoon measurement always greater or always lesser than the morning
measurement?
• Perhaps the instrument is not adequately cooled.
• Are the measurements trending up or down during the month?
• This is an indication of instrument drift.
• Is there a gold standard for the instrument?
• This is one part that is representative of production parts, kept in a climate-
controlled room, handled only with gloves and carried around on a red velvet
pillow.
• Any instrument must have a gold standard.
• Even the kilogram has a gold standard.
• It is a platinum-iridium cylinder held under glass at the Bureau International des
Poids et Mesures in Sèvres, France.
• If the gold standard measures differently during the month the measurement error
is not due to the gold standard, it is due to the measurement system.
• Consider if the instrument and/or samples are affected by temperature, humidity,
vibration, dust, etc.
• Set up experiments to validate these effects with data to support your conclusions.
• If you are lobbying for the instrument to be relocated to a climate-controlled clean
room you better have the data to justify this move.
• If reproducibility is poor, read the Standard Operating Procedure (SOP) in detail.
• Is the procedure crystal clear without ambiguity which would lead operators to
conduct the procedure differently?
• Does the procedure specify instrument calibration before each use?
• Does the procedure indicate what to do if the instrument fails the calibration
routine?
• Observe the operators conducting the procedure.
• Are they adhering to the procedure?
• Consider utilizing the operator with the lowest variation as a mentor/coach for the
other operators.
• Ensure that the SOP is comprehensive and visual.
• Functional procedures should be dominated by pictures, diagrams, sketches, flow
charts, etc which clearly demonstrate the order of operations and call out the
critical points of the procedure.
• Avoid lengthy text SOPs devoid of graphics.
• They do not facilitate memory triangulation – the use of multiple senses to recall learning.
• Refresher training should be conducted annually on SOPs, with supervisor audit of the Operator performing the measurement SOP.
Long Term Stability
• Now that you have performed analyses to establish the Instrument Detection
Limit, Method Detection Limit, Accuracy, Linearity, and Gage R&R metrics of your
measurement system and proven that you have a healthy measurement system;
you will need to monitor the measurement system to ensure that it remains
healthy.
• Stability is typically monitored through daily measurement of a standard on the
instrument in question.
• If a standard is not available, one of the samples from the Gage R&R can be utilized
as a “Golden Sample”.
• Each day, after the instrument is calibrated, the standard is measured on the
instrument.
• An Individuals Moving Range (IMR) SPC chart is generated as we have covered in
Chapter 6.
• If the standard is in control, then the measurement system is deemed to be in control, and this provides the justification to utilize the instrument to perform commercial analyses on process samples throughout the day.
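The I-MR control limits can be sketched from the standard chart constants (2.66 = 3/d2 and 3.267 = D4 for moving ranges of span two), applied here to the Figure 7.19 statistics for Golden Sample 2:

```python
def imr_limits(xbar, mrbar):
    """Control limits for an Individuals - Moving Range (I-MR) chart.
    2.66 = 3/d2 and 3.267 = D4, the standard constants for
    moving ranges of span two."""
    return {
        "x_ucl": xbar + 2.66 * mrbar,
        "x_lcl": xbar - 2.66 * mrbar,
        "mr_ucl": 3.267 * mrbar,
        "mr_lcl": 0.0,
    }

# Golden Sample 2 statistics from Figure 7.19
limits = imr_limits(xbar=14.67, mrbar=0.1621)
# x_ucl ~ 15.101, x_lcl ~ 14.239, mr_ucl ~ 0.530
```

Any daily standard reading outside these limits flags the instrument as nonconforming before it is used on process samples.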
• If the standard is not in control the instrument is deemed to be nonconforming
and a Root Cause Analysis must be initiated to identify the source(s) of the
discrepancy.
• Once the discrepancy has been identified and corrected, the standard is re-run on
the instrument and the IMR chart refreshed to prove that the instrument is in
control.
• Figure 7.19 shows daily stability measurements from Case Study III, silica
concentration measurement of Golden Sample disk number two.
Figure 7.19 Measurement System Long Term Stability
[I-MR Chart of Golden Sample 2 Silica Conc – Individuals: X̄ = 14.67, UCL = 15.101, LCL = 14.239; Moving Range: M̄R = 0.1621, UCL = 0.5295, LCL = 0]
Metrology Correlation and Matching
• Metrology correlation is utilized when comparing two measurement systems.
• This includes the sample preparation steps required before the actual
measurement is conducted as this is part of the measurement system.
• Metrology correlation and matching assessment is performed when replacing an
existing metrology tool with a new metrology tool, expanding measurement
capacity by adding a second tool, comparing customer metrology to supplier
metrology or comparing a metrology tool at one site to a metrology tool at
another site.
• Metrology correlation analysis is conducted when the two metrology tools are not
required to deliver the exact same output.
• This occurs when the equipment, fixtures, procedures and environment of the two
measurement tools are not exactly the same.
• This is a common situation when comparing customer metrology to supplier
metrology.
• Metrology matching analysis is conducted when the two metrology tools are
required to deliver exactly the same output.
• This is a typical condition where a specification exists for a critical quality
characteristic.
• Before conducting metrology correlation and matching there are some
prerequisites.
• You must ensure metrologies are accurate, capable, and stable.
• This means that the two measurement systems under consideration must have
passed the success criterion for instrument detection limit, method detection limit,
accuracy, linearity, Gage R&R and long term stability.
• Correlation and matching is most likely to be successful if the measurement
procedures are standardized.
• Select a minimum of sixteen samples to be measured on both metrology tools.
• Samples should be selected such that they span the measurement range of
interest (for example – the spec range).
• Avoid clustered samples around a certain measurement value.
• If necessary, manufacture samples to cover the spec range.
• It is acceptable to include out of spec high and low samples.
• In order for two measurement systems to be correlated, R-squared of the least
squares regression line of the current instrument vs the proposed instrument must
be 75% or higher.
• If matching is desired, there are two additional requirements: the 95% confidence interval of the slope of the orthogonal regression line must include 1.0, and a paired t-test must pass (i.e., the 95% confidence interval of the mean difference must include zero).
• This ensures that the bias between the two instruments is not significant.
• Let us revisit Penelope Banks at Minnesota Polymer to better understand
metrology correlation and matching protocol.
• Penny has requisitioned a redundant XRF-EDS to serve as a critical back-up to the existing XRF-EDS instrument and to provide analysis capacity expansion for the future.
• She has been submitting samples for analysis to both instruments for the last
sixteen weeks and has collected the following results.
• Please refer to Figure 7.20 for correlation and matching analysis steps.
Sample No. XRF-EDS1 XRF-EDS2
160403-2359D 14.2 14.4
160410-1600A 15.3 15.1
160414-0200B 13.7 13.5
160421-1400C 16.8 17.0
160427-0830C 13.5 13.3
160504-0300D 15.1 15.1
160510-1030A 13.3 13.2
160518-0100B 16.4 16.2
160525-1615C 16.6 16.5
160601-2330D 14.3 14.5
160608-0500D 15.7 15.9
160616-1330A 13.8 13.6
160625-1515C 15.7 15.8
160630-0420D 16.2 16.0
160707-2230B 13.5 13.7
160715-1920B 16.8 17.0
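The three acceptance checks (R-squared of at least 75%, orthogonal slope close to 1.0, and a passing paired t-test) can be sketched on Penny's paired data. Two assumptions in this sketch: the Deming/orthogonal slope formula below assumes equal error variances in the two instruments, and 2.131 is the tabulated two-sided 95% t value for 15 degrees of freedom; a full confidence interval for the orthogonal slope is omitted.

```python
import math
from statistics import mean, stdev

# Paired silica results (wt %) from the two XRF-EDS instruments
eds1 = [14.2, 15.3, 13.7, 16.8, 13.5, 15.1, 13.3, 16.4,
        16.6, 14.3, 15.7, 13.8, 15.7, 16.2, 13.5, 16.8]
eds2 = [14.4, 15.1, 13.5, 17.0, 13.3, 15.1, 13.2, 16.2,
        16.5, 14.5, 15.9, 13.6, 15.8, 16.0, 13.7, 17.0]

n = len(eds1)
mx, my = mean(eds1), mean(eds2)
sxx = sum((x - mx) ** 2 for x in eds1)
syy = sum((y - my) ** 2 for y in eds2)
sxy = sum((x - mx) * (y - my) for x, y in zip(eds1, eds2))

# Correlation check: R-squared of the least squares fit must be >= 0.75
r_squared = sxy ** 2 / (sxx * syy)

# Matching check 1: orthogonal (Deming) regression slope, assuming
# equal measurement error variances; should be close to 1.0
slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)

# Matching check 2: 95% CI of the paired mean difference must include zero
diffs = [y - x for x, y in zip(eds1, eds2)]
half_width = 2.131 * stdev(diffs) / math.sqrt(n)  # t(0.975, df=15) = 2.131
ci = (mean(diffs) - half_width, mean(diffs) + half_width)

print(r_squared > 0.75, ci[0] <= 0.0 <= ci[1])  # True True
```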
Figure 7.20 Metrology Correlation and Matching Steps
Open a new worksheet. Copy and paste the measurement data from the two instruments into the worksheet.
Click on Graph → Scatterplot on the top menu.
Select With Regression in the dialogue box. Click OK.
Select the reference instrument XRF-EDS1 for the X variables and XRF-EDS2 for the Y variables. Click OK.
A scatter plot is produced with least squares regression line.
[Scatterplot of XRF-EDS2 vs XRF-EDS1 with least squares regression line]
Hover your cursor over the least squares regression line. The R-sq = 98.1%. Correlation is good.
Return to the worksheet. Click on Stat → Regression → Orthogonal Regression on the top menu.
Select the new instrument XRF-EDS2 for the Response (Y) and the reference instrument XRF-EDS1 for the Predictor (X) variables. Click Options.
Select 95 for the Confidence level. Click OK → then click OK one more time.
A scatter plot is produced with orthogonal regression line.
[Plot of XRF-EDS2 vs XRF-EDS1 with fitted orthogonal regression line]
Click on Window → Session on the top menu. The session window indicates that the 95% Confidence Interval of the slope includes 1.0, so there is no significant proportional (slope) difference between the two instruments.
Return to the worksheet. Click on Stat → Basic Statistics → Paired t on the top menu.
Select XRF-EDS1 for Sample 1 and XRF-EDS2 for Sample 2 in the dialogue box. Click Options.
Select 95.0 for Confidence level. Select 0.0 for Hypothesized difference. Select Difference ≠ hypothesized difference for Alternative
hypothesis in the dialogue box. Click OK. Then click OK one more time.
The session window indicates that the 95% confidence interval for the mean difference includes zero. The P-Value for the paired t-Test is above the significance level of 0.05; therefore we cannot reject the null hypothesis. There is no significant bias between the two instruments.
• Penelope has proven that XRF-EDS2 is correlated and matched to XRF-EDS1.
• She may now use XRF-EDS2 for commercial shipment releases including
Certificates of Analysis to her customers.
References
Warner, Kent; Martinich, Dave; Wenz, Paul, Measurement Capability and Correlation, Revision 4.0.2, Intel, Santa Clara, CA, 2010.
AIAG, Measurement Systems Analysis, Fourth Edition, Automotive Industry Action Group, Southfield, MI, 2010.
Breyfogle, Forrest W., III, Implementing Six Sigma, Second Edition, John Wiley & Sons, Hoboken, NJ, 2003.
George, M.; Maxey, P.; Price, M.; Rowlands, D., The Lean Six Sigma Pocket Toolbook, McGraw-Hill, New York, NY, 2005.
Wedgwood, Ian D., Lean Sigma – A Practitioner's Guide, Prentice Hall, Boston, MA, 2007.
40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B, Environmental Protection Agency, Washington, DC, 2012.
Internet Resources
• Automotive Industry Action Group
• Method Detection Limit (MDL) Calculators | CHEMIASOFT
• 40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B (Subchapter D)
• Percentiles of the t-Distribution
http://sites.stat.psu.edu/~mga/401/tables/t.pdf