Operational Excellence
Measurement System Analysis
Ronald Morgan Shewchuk, 1/28/2017

Introduction
• Measurement System Analysis (MSA) is the first step of the measure phase
along the DMAIC pathway to improvement.
• You will be basing the success of your improvement project on key performance
indicators that are tied to your measurement system.
• Consequently, before you begin tracking metrics you will need to complete a MSA
to validate the measurement system.
• A comprehensive MSA typically consists of six parts: Instrument Detection Limit, Method Detection Limit, Accuracy, Linearity, Gage R&R and Long-Term Stability.
• If you want to expand measurement capacity or qualify another instrument, you must extend the MSA to include Metrology Correlation and Matching.
• A poor measurement system can make data meaningless and process
improvement impossible.
• Large measurement error will prevent assessment of process stability and
capability, confound Root Cause Analysis and hamper continuous improvement
activities in manufacturing operations.
• Measurement error has a direct impact on assessing the stability and capability of
a process.
• Poor metrology can make a stable process appear unstable and make a capable
process appear incapable.
• Measurement System Analysis quantifies the effect of measurement error on the
total variation of a unit operation.
• The sources of this variation may be visualized as in Figure 7.1 and the elements of
a measurement system as in Figure 7.2.
Figure 7.1 Sources of Variation
[Tree diagram: Observed Process Variation splits into Actual Process Variation and Measurement Variation; sub-sources include long-term process variation, short-term process variation, variation within sample, variation due to the gage (repeatability, calibration, stability, linearity) and variation due to operators.]
Figure 7.2 Measurement System Elements
[Diagram of measurement system elements grouped under Equipment, Procedures, Environment and Performance: hardware, software, setup, calibration frequency, calibration technique, sample preparation, operator procedure, data entry, calculations, cleanliness, humidity and temperature, vibration, lighting, power source.]
• Operators are often skeptical of measurement systems, especially those that
provide them with false feedback causing them to “over-steer” their process.
• This skepticism is well founded since many measurement systems are not capable
of accurately or precisely measuring the process.
• Accuracy refers to the average of individual measurements compared with the
known, true value.
• Precision refers to the grouping of the individual measurements - the tighter the
grouping, the higher the precision.
• The bull’s eye targets of Figure 7.3 best illustrate the difference between accuracy
and precision.
Figure 7.3 Accuracy vs Precision – The Center of the Target is the Objective
[Four bull's-eye panels: Good Accuracy / Bad Precision, Bad Accuracy / Good Precision, Good Accuracy / Good Precision, Bad Accuracy / Bad Precision]
• Accuracy is influenced by resolution, bias, linearity and stability whereas precision
is influenced by repeatability and reproducibility of the measurement system.
• Repeatability is the variation which occurs when the same operator repeatedly
measures the same sample on the same instrument under the same conditions.
• Reproducibility is the variation which occurs between two or more instruments or
operators measuring the same sample with the same measurement method in a
stable environment.
• The total variance in a quality characteristic of a process is described by Eqn 7.1
and Eqn 7.2.
• The percent contribution of the measurement system to the total variance may be
calculated from Eqn 7.3.
• We want to be able to measure true variations in product quality, not variations in the measurement system, so we want to minimize σ²_measurement.
• We will review the steps in a typical measurement system analysis by way of
example, first for the case of variables data and then for the case of attribute data.
2
total = 2
product + 2
measurement
where 2
total = total variance
2
product = variance due to product
2
measurement = variance due to measurement system
Eqn 7.1
2
measurement = 2
repeatability + 2
reproducibility
where 2
repeatability = variance within operator/device combination
2
reproducibility = variance between operators
Eqn 7.2
2
repeatability + 2
reproducibility
Eqn 7.3
2
total
% Contribution = X 100
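• The arithmetic of Eqns 7.1–7.3 is easy to sanity-check in code. The short Python sketch below (not part of the original deck; the three variance values are illustrative assumptions) combines assumed repeatability, reproducibility and product variances into the measurement system's % Contribution.

# Minimal sketch of Eqns 7.1-7.3; the variance values are illustrative assumptions.
sigma2_repeatability = 0.010    # assumed within operator/device variance
sigma2_reproducibility = 0.004  # assumed between-operator variance
sigma2_product = 0.250          # assumed true product variance

sigma2_measurement = sigma2_repeatability + sigma2_reproducibility   # Eqn 7.2
sigma2_total = sigma2_product + sigma2_measurement                   # Eqn 7.1
pct_contribution = 100.0 * sigma2_measurement / sigma2_total         # Eqn 7.3

print(f"% Contribution of measurement system = {pct_contribution:.2f}%")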
Instrument Detection Limit (IDL)
• Today’s measurement devices are an order of magnitude more complex than the
“gages” for which the Automotive Industry Action Group (AIAG) first developed
Gage Repeatability and Reproducibility (Gage R&R) studies.
• Typically they are electromechanical devices with internal microprocessors having
inherent signal to noise ratios.
• The Instrument Detection Limit (IDL) should be calculated from the baseline noise
of the instrument.
• Let us examine the case where a gas chromatograph (GC) is being used to measure
the concentration of some analyte of interest. Refer to Figure 7.4.
Figure 7.4 Gas Chromatogram
• The chromatogram has a baseline with peaks at different column retention times
for hydrogen, argon, oxygen, nitrogen, methane and carbon monoxide.
• Let’s say we wanted to calculate the IDL for nitrogen at retention time 5.2 min.
• We would purge and evacuate the column to make sure it is clean then
successively inject seven blanks of the carrier gas (helium).
• The baseline noise peak at retention time 5.2 min is integrated for each of the
blank injections and converted to concentration units of Nitrogen.
• The standard deviation of these concentrations is multiplied by the Student's t statistic for n−1 degrees of freedom at the 99% confidence level (t = 3.143 for df = 6) to calculate the IDL.
• This is the EPA protocol as defined in 40 CFR Part 136: Guidelines Establishing Test
Procedures for the Analysis of Pollutants, Appendix B.
• Refer to Figure 7.5 below for the calculation summary.
Figure 7.5 Instrument Detection Limit (IDL) Calculation

Injection No.   N2 (ppm)
1               0.01449
2               0.01453
3               0.01456
4               0.01459
5               0.01442
6               0.01440
7               0.01447
StDev           0.00007044
Mean            0.01449
RSD             0.49%
IDL             0.0002214

Percentiles of the t-Distribution
df     90.0%   95.0%   97.5%   99.0%   99.5%   99.9%
1      3.078   6.314   12.706  31.821  63.657  318.309
2      1.886   2.920   4.303   6.965   9.925   22.327
3      1.638   2.353   3.183   4.541   5.841   10.215
4      1.533   2.132   2.777   3.747   4.604   7.173
5      1.476   2.015   2.571   3.365   4.032   5.893
6      1.440   1.943   2.447   3.143   3.708   5.208
7      1.415   1.895   2.365   2.998   3.500   4.785
8      1.397   1.860   2.306   2.897   3.355   4.501
9      1.383   1.833   2.262   2.822   3.250   4.297
10     1.372   1.812   2.228   2.764   3.169   4.144

df = n − 1 = 6
IDL = t(df, 1−α = 0.99) × StDev = 3.143 × 0.00007044 = 0.0002214 ppm N2
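• The Figure 7.5 arithmetic can be reproduced with a few lines of Python; the sketch below uses the seven blank-injection values from the slide and scipy's Student's t percentile in place of the lookup table.

import numpy as np
from scipy import stats

# Sketch of the Figure 7.5 IDL calculation: seven blank injections integrated
# at the nitrogen retention time and converted to ppm N2 (values from the slide).
blanks_ppm = np.array([0.01449, 0.01453, 0.01456, 0.01459, 0.01442, 0.01440, 0.01447])

s = blanks_ppm.std(ddof=1)                       # sample standard deviation
t99 = stats.t.ppf(0.99, df=len(blanks_ppm) - 1)  # Student's t at 99%, df = n - 1
idl = t99 * s

print(f"StDev = {s:.8f} ppm, t(0.99, df=6) = {t99:.3f}, IDL = {idl:.7f} ppm N2")
# Expected: t = 3.143 and IDL ≈ 0.00022 ppm N2, matching the slide.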
Method Detection Limit (MDL)
• Method detection limit (MDL) is defined as the minimum concentration of a
substance that can be measured and reported with 99% confidence that the
analyte concentration is greater than zero as determined from analysis of a sample
in a given matrix containing the analyte.
• MDL is calculated in a similar way to IDL with the exception that the same sample
is measured on the instrument with n=7 trials and the sample is disconnected and
reconnected to the measurement apparatus between trials.
• This is called dynamic repeatability analysis.
• An estimate is made of the MDL and a sample prepared at or near this MDL
concentration.
• The seven trials are then measured on the instrument and the MDL calculated as
in Figure 7.6.
• The MDL divided by the mean of the seven trials should fall between 10% and 100%.
• If this is not the case, repeat the MDL analysis with a starting sample concentration
closer to the calculated MDL.
Figure 7.6 Method Detection Limit (MDL) Calculation

Injection No.   N2 (ppm)
1               0.3596
2               0.3010
3               0.3227
4               0.3239
5               0.3335
6               0.3196
7               0.3365
StDev           0.01801
Mean            0.3281
RSD             5.49%
MDL             0.05660
MDL/X-bar       17.2%

df = n − 1 = 6
MDL = t(df, 1−α = 0.99) × StDev = 3.143 × 0.01801 = 0.05660 ppm N2
(Student's t percentile taken from the same t-distribution table as in Figure 7.5)
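• The same calculation applies to the MDL; the sketch below (illustrative, using the seven trial values from Figure 7.6) also checks that the MDL divided by the mean falls in the 10–100% window.

import numpy as np
from scipy import stats

# Sketch of the Figure 7.6 MDL calculation (trial values from the slide).
trials_ppm = np.array([0.3596, 0.3010, 0.3227, 0.3239, 0.3335, 0.3196, 0.3365])

s = trials_ppm.std(ddof=1)
mdl = stats.t.ppf(0.99, df=len(trials_ppm) - 1) * s   # MDL = t(df, 0.99) * StDev
ratio = mdl / trials_ppm.mean()                       # should fall between 10% and 100%

print(f"MDL = {mdl:.5f} ppm N2, MDL / mean = {ratio:.1%}")
# Expected: MDL ≈ 0.0566 ppm and MDL/mean ≈ 17%, consistent with the slide.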
Measurement System Analysis – Variables Data
• A properly conducted measurement system analysis (MSA) can yield a treasure
trove of information about your measurement system.
• Repeatability, reproducibility, resolution, bias, and precision to tolerance ratio are
all deliverables of the MSA and can be used to identify areas for improvement in
your measurement system.
• It is important to conduct the MSA in the current state since this is your present
feedback mechanism for your process.
• Resist the temptation to dust off the Standard Operating Procedure and brief the
operators on the correct way to measure the parts.
• Resist the temptation to replace the NIST¹-traceable standard, even though it looks like it has been kicked around the metrology laboratory a few times.
¹ National Institute of Standards and Technology
• To prepare for an MSA you must collect samples from the process that span the
specification range of the measurement in question.
• Include out-of-spec high samples and out-of-spec low samples.
• Avoid creating samples artificially in the laboratory.
• There may be complicating factors in the commercial process which influence your
measurement system.
• Include all Operators in the MSA who routinely measure the product.
• The number of samples times the number of Operators should be greater than or
equal to fifteen, with three trials for each sample.
• If this is not practical, increase the number of trials as per Figure 7.7.
• Code the samples such that the coding gives no indication to the expected
measurement value – this is called blind sample coding.
• Have each sample measured by an outside laboratory.
• These measurements will serve as your reference values.
• Ask each Operator to measure each sample three times in random sequence.
• Ensure that the Operators do not “compare notes”.
• We will utilize Minitab to analyze the measurement system described in Case Study III.
Samples x Operators Trials
S x O ≥ 15 3
8 ≤ S x O < 15 4
5 ≤ S x O < 8 5
S x O < 5 6
Figure 7.7 Measurement System Analysis Design
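• The Figure 7.7 design rule is simple enough to encode directly; the helper below is a hypothetical illustration, not part of the original deck.

def recommended_trials(samples: int, operators: int) -> int:
    """Return the number of trials suggested by Figure 7.7 for a given
    samples x operators product (illustrative helper)."""
    s_x_o = samples * operators
    if s_x_o >= 15:
        return 3
    if s_x_o >= 8:
        return 4
    if s_x_o >= 5:
        return 5
    return 6

print(recommended_trials(samples=5, operators=3))   # 15 -> 3 trials, as in Case Study III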
Case Study III: Minnesota Polymer Co.
Minnesota Polymer Co. supplies a special grade of resin to ABC Molding Co. which includes a silica modifier to improve
dimensional stability. The product code is POMBLK-15 and the silica concentration specification by weight is 15 ± 2%. Silica
concentration is determined by taking a sample of the powdered resin and pressing it into a 4 cm disk using a 25-ton hydraulic
press. The sample disk is then analyzed by x-ray fluorescence energy dispersive spectroscopy (XRF-EDS) to measure the silica
content. Manufacturing POMBLK-15 is difficult. The silica is light and fluffy and sometimes gets stuck in the auger used to feed
the mixing tank. A new process engineer, Penelope Banks, has been hired by Minnesota Polymer. One of her first assignments
is to improve POMBLK-15 process control. SPC analysis of historical batch silica concentration results has indicated out-of-
control symptoms and poor Cpk. Before Penny makes any changes to the process she prudently decides to conduct a
measurement system analysis to find out the contribution of the measurement system to the process variation.
Minnesota Polymer is a firm believer in process ownership. The same operator charges the raw materials, runs the
manufacturing process, collects the quality control sample, presses the sample disk and then runs the silica analysis on the XRF-
EDS instrument. The operator uses the silica concentration analysis results to adjust the silica charge on the succeeding batch.
POMBLK-15 is typically run over a five-day period in the three-shift, 24/7 operation.
Penny has collected five powder samples from POMBLK-15 process retains which span the silica specification range, including
two out-of-specification samples pulled from quarantine lots. She has asked each of the three shift operators to analyze
three samples from each powder bag for silica content, in random order, according to her sampling plan. Penny has sent a portion of each
sample powder to the Company’s R&D Headquarters in Hong Kong for silica analysis. These results will serve as reference
values for each sample. The following table summarizes the silica concentration measurements and Figure 7.8 captures the
screen shots of the MSA steps for Case Study III.
Silica Concentration (wt %)
Bag #  Reference¹  Operator 1 (Trial 1/2/3)  Operator 2 (Trial 1/2/3)  Operator 3 (Trial 1/2/3)
1      17.3        18.2  17.9  18.2          18.1  18.0  18.0          17.8  17.8  18.2
2      14.0        14.4  14.9  14.8          14.8  14.6  14.8          14.4  14.4  14.5
3      13.3        14.0  13.9  13.8          13.9  14.2  14.0          13.8  13.7  13.8
4      16.7        17.2  17.2  17.4          17.4  17.3  17.5          17.4  17.5  17.5
5      12.0        12.9  12.8  12.5          12.5  12.9  12.8          12.9  12.5  12.6
¹ As reported by Hong Kong R&D Center
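• As a cross-check on the Minitab workflow that follows, the Python sketch below estimates the crossed Gage R&R variance components from the Case Study III table using the ANOVA (expected mean squares) method with the part-by-operator interaction retained; results may differ slightly from Minitab if the interaction term is pooled.

import numpy as np

# data[part, operator, trial] -- silica wt% from the Case Study III table
data = np.array([
    [[18.2, 17.9, 18.2], [18.1, 18.0, 18.0], [17.8, 17.8, 18.2]],
    [[14.4, 14.9, 14.8], [14.8, 14.6, 14.8], [14.4, 14.4, 14.5]],
    [[14.0, 13.9, 13.8], [13.9, 14.2, 14.0], [13.8, 13.7, 13.8]],
    [[17.2, 17.2, 17.4], [17.4, 17.3, 17.5], [17.4, 17.5, 17.5]],
    [[12.9, 12.8, 12.5], [12.5, 12.9, 12.8], [12.9, 12.5, 12.6]],
])
p, o, r = data.shape
grand = data.mean()

# Two-way crossed ANOVA sums of squares
part_means = data.mean(axis=(1, 2))
oper_means = data.mean(axis=(0, 2))
cell_means = data.mean(axis=2)
ss_part = o * r * ((part_means - grand) ** 2).sum()
ss_oper = p * r * ((oper_means - grand) ** 2).sum()
ss_cell = r * ((cell_means - grand) ** 2).sum()
ss_inter = ss_cell - ss_part - ss_oper
ss_error = ((data - cell_means[:, :, None]) ** 2).sum()

ms_part = ss_part / (p - 1)
ms_oper = ss_oper / (o - 1)
ms_inter = ss_inter / ((p - 1) * (o - 1))
ms_error = ss_error / (p * o * (r - 1))

# Variance components from the expected mean squares (negative estimates floored at zero)
var_repeat = ms_error
var_oper = max((ms_oper - ms_inter) / (p * r), 0.0)
var_inter = max((ms_inter - ms_error) / r, 0.0)
var_reprod = var_oper + var_inter
var_part = max((ms_part - ms_inter) / (o * r), 0.0)

var_gage = var_repeat + var_reprod
var_total = var_gage + var_part
print(f"% Contribution of Gage R&R = {100 * var_gage / var_total:.2f}%")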
Figure 7.8 Measurement System Analysis Steps – Variable Data
Open a new worksheet. Click on Stat → Quality Tools → Gage Study → Create Gage R&R Study Worksheet on the top menu.
Enter the Number of Operators, the Number of Replicates and the Number of Parts in the dialogue box. Click OK.
The worksheet is modified to include a randomized run order of the samples.
Name the adjoining column Silica Conc and transcribe the random sample measurement data to the relevant cells in the worksheet.
Click on Stat → Quality Tools → Gage Study → Gage R&R Study (Crossed) on the top menu.
Select C2 Parts for Part numbers, C3 Operators for Operators and C4 Silica Conc for Measurement data in the
dialogue box. Click the radio toggle button for ANOVA under Method of Analysis. Click Options.
Six (6) standard deviations will account for 99.73% of the Measurement System variation. Enter Lower Spec Limit
and Upper Spec Limit in the dialogue box. Click OK. Click OK.
A new graph is created in the Minitab project file with the Gage R&R analysis results.
Return to the session by clicking on Window → Session on the top menu to view the ANOVA analytical results.
• Let us more closely examine the graphical output of the Gage R&R (ANOVA) Report
for Silica Conc.
• Figure 7.9 shows the components of variation.
• A good measurement system will have the lion’s share of variation coming from
the product, not the measurement system.
• Consequently, we would like the bars for repeatability and reproducibility to be
small relative to part-to-part variation.
Figure 7.9 MSA Components of Variation
• Figure 7.10 captures the range SPC chart by Operators.
• The range chart should be in control.
• If it is not, a repeatability problem is present.
Figure 7.10 MSA Range Chart by Operators
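• The range-chart limits behind Figure 7.10 can be computed by hand; this sketch uses the standard D4 factor for subgroups of three and the operator/sample ranges taken from the Case Study III table.

import numpy as np

# Sketch: range chart limits for subgroups of size 3 (D4 = 2.574, D3 = 0).
# Each value is the max - min of one operator/sample triplet from the MSA table.
ranges = np.array([0.3, 0.5, 0.2, 0.2, 0.4,    # Operator 1, samples 1-5
                   0.1, 0.2, 0.3, 0.2, 0.4,    # Operator 2, samples 1-5
                   0.4, 0.1, 0.1, 0.1, 0.4])   # Operator 3, samples 1-5
r_bar = ranges.mean()
ucl_r = 2.574 * r_bar   # D4 for subgroup size n = 3
print(f"R-bar = {r_bar:.3f}, UCL = {ucl_r:.3f}; any range above the UCL flags a repeatability problem")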
• By contrast, the X-bar SPC chart of Figure 7.11 should be out of control.
• This seems counterintuitive, but it is a healthy indication that the variability present is due to part-to-part differences rather than Operator-to-Operator differences.
Figure 7.11 MSA X-bar Chart by Operators
• Figure 7.12 is an individual value plot of silica concentration by sample number.
• The circles with a cross indicate the mean of the sample data and the solid circles
are individual data points.
• We want a tight grouping around the mean for each sample and we want
significant variation between the means of different samples.
• If we do not have variation between samples the MSA has been poorly designed
and we essentially have five samples of the same thing.
• This will preclude analysis of the measurement system.
Figure 7.12 MSA Silica Concentration by Sample Number
• Figure 7.13 is a boxplot of silica concentration by Operator.
• As in Figure 7.12 the circles with a cross indicate the mean concentration for all
samples by Operator.
• The shaded boxes represent the interquartile range (Q3-Q1) for each Operator.
• The interquartile range (IQR) is the preferred measure of spread for data sets
which are not normally distributed.
• The solid line within the IQR is the median silica concentration of all samples by
Operator.
• If Operators are performing the same, we would expect similar means, medians
and IQRs.
Figure 7.13 MSA Silica Concentration by Operator
• Figure 7.14 is an individual value plot used to check for Operator-Sample
interactions.
• The lines for each Operator should be reasonably parallel to each other.
• Crossing lines indicate the presence of Operator-Sample interactions.
• This can happen when Operators are struggling with samples at or near the MDL
or if the instrument signal to noise ratio varies as a function of concentration.
Figure 7.14 MSA Sample by Operator Interaction
• Let us now focus on the analytical output of the session window as captured in
Figure 7.8.
• Lovers of Gage R&Rs will typically look for four metrics as defined below and
expect these metrics to be within the acceptable or excellent ranges specified by
Gage R&R Metric Rules of Thumb as shown in Figure 7.15.
2
measurement
Eqn 7.4
2
total
% Contribution = X 100
measurement
Eqn 7.5
total
% Study Variation = X 100
6measurement
Eqn 7.6
USL- LSL
Two-Sided Spec % P/T = X 100
3measurement
Eqn 7.7
TOL
One-Sided Spec % P/T = X 100
Eqn 7.8    Number of Distinct Categories = trunc(1.41 × σ_total / σ_measurement)

where σ²_total = total variance
σ²_measurement = variance due to measurement system
σ_total = total standard deviation
σ_measurement = standard deviation due to measurement system
P/T = Precision to Tolerance Ratio
USL = Upper Spec Limit
LSL = Lower Spec Limit
TOL = Process Mean − LSL for an LSL-only spec
TOL = USL − Process Mean for a USL-only spec
Gage R&R Metric Unacceptable Acceptable Excellent
% Contribution > 7.7% 2.0 - 7.7% < 2%
% Study Variation > 28% 14 - 28% < 14%
% P/T Ratio > 30% 8 - 30% < 8%
Number of Distinct Categories < 5 5 - 10 > 10
Figure 7.15 Gage R&R Metrics – Rules of Thumb
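• The metrics of Eqns 7.4–7.8 are straightforward to compute once the standard deviations are known; the sketch below uses assumed, illustrative values for the measurement and total standard deviations (chosen to roughly echo the Case Study III results) together with the POMBLK-15 spec limits. It is not the Minitab output.

import math

# Sketch of Eqns 7.4-7.8; the sigma values are illustrative assumptions, not Minitab output.
sigma_measurement = 0.17   # assumed gage standard deviation, wt% silica
sigma_total = 2.3          # assumed total standard deviation, wt% silica
usl, lsl = 17.0, 13.0      # POMBLK-15 silica spec: 15 +/- 2 wt%

pct_contribution = 100 * sigma_measurement**2 / sigma_total**2          # Eqn 7.4
pct_study_var = 100 * sigma_measurement / sigma_total                   # Eqn 7.5
pct_pt_two_sided = 100 * 6 * sigma_measurement / (usl - lsl)            # Eqn 7.6
ndc = math.trunc(1.41 * sigma_total / sigma_measurement)                # Eqn 7.8
# (note: the AIAG convention computes ndc from the product standard deviation)

print(f"%Contribution={pct_contribution:.2f}%, %StudyVar={pct_study_var:.2f}%, "
      f"%P/T={pct_pt_two_sided:.2f}%, ndc={ndc}")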
• The highlighted output of the Minitab session window indicates a % Contribution
of the measurement system of 0.55%.
• This is in the excellent region.
• % Study Variation is 7.39% which is also in the excellent region.
• Precision to Tolerance ratio is 25.37%.
• This is in the acceptable region.
• Number of distinct categories is 19, well within the excellent region.
• Overall, this is a good measurement system.
• Now, let us proceed to check for linearity and bias by adding the reference
concentrations as measured by the Hong Kong R&D Center for each of the samples
to the worksheet.
• Figure 7.16 captures the screen shots necessary for this process.
Figure 7.16 Gage Linearity and Bias Study Steps – Variable Data
Return to the active worksheet by clicking on Window → Worksheet 1 *** on the top menu. Name the adjoining column Reference Conc
and enter the reference sample concentration values corresponding to each sample (Part) number.
Click on Stat → Quality Tools → Gage Study → Gage Linearity and Bias Study on the top menu.
Select C2 Parts for Part numbers, C5 Reference Conc for Reference values and C4 Silica Conc for Measurement data in the dialogue box.
Click OK.
A new graph is created in the Minitab project file with the Gage Linearity and Bias Study results.
• We can see there is a bias between the Hong Kong measurement system and
Minnesota Polymer’s measurement system.
• The bias is relatively constant over the silica concentration range of interest as
indicated by the regression line.
• The Minnesota Polymer measurement system is reading approximately 0.67 wt %
Silica higher than Hong Kong.
• This is not saying that the Hong Kong instrument is right and the Minnesota
Polymer instrument is wrong.
• It is merely saying that there is a difference between the two instruments which
must be investigated.
• This difference could have process capability implications if it is validated.
• Minnesota Polymer may be operating in the top half of the allowable spec range.
• The logical next step is for the Hong Kong R&D center to conduct an MSA of similar
design, ideally with the same sample set utilized by Minnesota Polymer.
Measurement System Analysis – Attribute Data
• In our next case we will analyze the measurement system used to rate customer satisfaction, as described in Case Study IV below.
Case Study IV: Virtual Cable Co.
David Raffles Lee has just joined Virtual Cable Co., the leading telecommunications company in the southwest as Chief Executive
Officer. David comes to Virtual Cable with over thirty years of operations experience in the telecommunications industry in
Singapore. During a tour of one of the Customer Service Centers, David noticed that the customer service agents were all
encased in bulletproof glass. David queried the Customer Service Manager, Bob Londale about this and Bob responded, “It is for
the protection of our associates. Sometimes our customers become angry and they produce weapons.” David was rather
shocked about this and wanted to learn more about customer satisfaction at Virtual Cable. He formed a team to analyze the
measurement of customer satisfaction. This team prepared ten scripts of typical customer complaints, each with an intended
outcome of pass (the customer was satisfied with the customer service agent’s response) or fail (the customer was dissatisfied with
the response). Twenty “customers” were coached on the scripts, two customers per script. These customers committed the
scripts to memory and presented their service issue to three different Customer Service Agents at three different Customer
Service Centers. Each customer was issued an account number and profile to allow the Customer Service Agent to rate the
customer’s satisfaction level in the customer feedback database as required by Virtual Cable’s policy. The results are
summarized in the attached table and analyzed by the MSA attribute data steps of Figure 7.17.
Script #  Reference¹  Operator 1 (Rep 1/Rep 2)  Operator 2 (Rep 1/Rep 2)  Operator 3 (Rep 1/Rep 2)
1         F           F  F                      F  F                      F  F
2         P           P  P                      P  P                      P  P
3         P           P  P                      P  P                      P  P
4         P           P  P                      P  P                      P  P
5         F           F  F                      F  P                      F  F
6         P           P  P                      P  P                      P  P
7         F           F  F                      F  F                      F  F
8         F           F  F                      F  F                      F  F
9         P           P  F                      P  P                      F  P
10        F           F  F                      F  F                      F  F
¹ Intended outcome of script from Customer Satisfaction Team
Figure 7.17 Measurement System Analysis Steps – Attribute Data
Open a new worksheet. Click on Stat → Quality Tools → Create Attribute Agreement Analysis Worksheet on the top menu.
Enter the Number of samples, the Number of appraisers and the Number of replicates in the dialogue box. Click OK.
The worksheet is modified to include a randomized run order of the scripts (samples).
Name the adjoining columns Response and Reference. Transcribe the satisfaction level rating and the reference value of the script to
the appropriate cells.
Click on Stat → Quality Tools → Attribute Agreement Analysis on the top menu.
Select C4 Response for Attribute column, C2 Samples for Samples and C3 Appraisers for Appraisers in the dialogue box. Select C5
Reference for Known standard/attribute. Click OK.
A new graph is created in the Minitab project file with the Attribute Assessment Agreement results.
[Assessment Agreement charts: percent agreement with 95% CI for each appraiser, panels Within Appraisers and Appraiser vs Standard.]
Display the analytical MSA Attribute Agreement Results by clicking on Window → Session on the top menu.
• The attribute MSA results allow us to determine the percentage overall agreement,
the percentage agreement within appraisers (repeatability), the percentage
agreement between appraisers (reproducibility), the percentage agreement with
reference values (accuracy) and the Kappa Value (index used to determine how
much better the measurement system is than random chance).
• From the graphical results we can see that the Customer Service Agents were in
agreement with each other 90% of the time and were in agreement with the
expected (standard) result 90% of the time.
• From the analytical results we can see that the agreement between appraisers was
80% and the overall agreement vs the standard values was 80%.
• The Kappa Value for all appraisers vs the standard values was 0.90, indicative of
excellent agreement between the appraised values and reference values.
• Figure 7.18 provides benchmark interpretations for Kappa Values.
Figure 7.18 Rules of Thumb for Interpreting Kappa Values (Attribute MSA – Kappa Value)
Kappa Value   Interpretation
-1.0 to 0.6   Agreement no better than expected by chance
0.6 to 0.7    Marginal agreement – significant effort required to improve the measurement system
0.7 to 0.8    Good agreement – some improvement to the measurement system is warranted
0.9 to 1.0    Excellent agreement
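• For a pass/fail study like Case Study IV, the appraiser-vs-standard kappa can be computed directly from the agreement proportions; the sketch below is an illustrative helper (not the Minitab calculation) applied to Operator 1's twenty calls from the table above.

import numpy as np

def kappa_vs_standard(ratings, standard):
    """Cohen-style kappa of one appraiser's calls against the known standard."""
    ratings, standard = np.asarray(ratings), np.asarray(standard)
    p_observed = (ratings == standard).mean()
    # chance agreement from the marginal pass/fail proportions
    p_chance = sum(
        (ratings == c).mean() * (standard == c).mean() for c in np.unique(standard)
    )
    return (p_observed - p_chance) / (1 - p_chance)

standard = ["F", "P", "P", "P", "F", "P", "F", "F", "P", "F"] * 2   # scripts 1-10, two reps
op1      = ["F", "P", "P", "P", "F", "P", "F", "F", "P", "F",       # Operator 1, rep 1
            "F", "P", "P", "P", "F", "P", "F", "F", "F", "F"]       # Operator 1, rep 2 (script 9 miscalled)
print(f"Operator 1 kappa vs standard = {kappa_vs_standard(op1, standard):.2f}")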
• Another way of looking at this case is that out of sixty expected outcomes there
were only three miscalls on rating customer satisfaction by the Customer Service
Agents included in this study.
• Mr. Lee can have confidence in the feedback of the Virtual Cable customer
satisfaction measurement system and proceed to identify and remedy the
underlying root causes of customer dissatisfaction.
Improving the Measurement System
• Improvements to the measurement system should be focused on the root cause(s)
of high measurement system variation.
• If repeatability is poor, consider a more detailed repeatability study using one part
and one operator over an extended period of time.
• Ask the operator to measure this one sample twice per day for one month.
• Is the afternoon measurement always greater or always lesser than the morning
measurement?
• Perhaps the instrument is not adequately cooled.
• Are the measurements trending up or down during the month?
• This is an indication of instrument drift.
• Is there a gold standard for the instrument?
• This is one part that is representative of production parts, kept in a climate-
controlled room, handled only with gloves and carried around on a red velvet
pillow.
• Any instrument must have a gold standard.
• Even the kilogram has a gold standard.
• It is a platinum-iridium cylinder held under glass at the Bureau International des
Poids et Mesures in Sèvres, France.
• If the gold standard measures differently during the month, the measurement error is not due to the gold standard; it is due to the measurement system.
• Consider if the instrument and/or samples are affected by temperature, humidity,
vibration, dust, etc.
• Set up experiments to validate these effects with data to support your conclusions.
• If you are lobbying for the instrument to be relocated to a climate-controlled clean room, you had better have the data to justify the move.
• If reproducibility is poor, read the Standard Operating Procedure (SOP) in detail.
• Is the procedure crystal clear, free of ambiguity that would lead operators to conduct the procedure differently?
• Does the procedure specify instrument calibration before each use?
• Does the procedure indicate what to do if the instrument fails the calibration
routine?
• Observe the operators conducting the procedure.
• Are they adhering to the procedure?
• Consider utilizing the operator with the lowest variation as a mentor/coach for the
other operators.
• Ensure that the SOP is comprehensive and visual.
• Functional procedures should be dominated by pictures, diagrams, sketches, flow
charts, etc which clearly demonstrate the order of operations and call out the
critical points of the procedure.
• Avoid lengthy text SOPs devoid of graphics.
• They do not facilitate memory triangulation – the use of multiple senses to recall learning.
• Refresher training should be conducted annually on SOPs, with supervisor audit of the Operator performing the measurement SOP.
Long Term Stability
• Now that you have performed analyses to establish the Instrument Detection
Limit, Method Detection Limit, Accuracy, Linearity, and Gage R&R metrics of your
measurement system and proven that you have a healthy measurement system;
you will need to monitor the measurement system to ensure that it remains
healthy.
• Stability is typically monitored through daily measurement of a standard on the
instrument in question.
• If a standard is not available, one of the samples from the Gage R&R can be utilized
as a “Golden Sample”.
• Each day, after the instrument is calibrated, the standard is measured on the
instrument.
• An Individuals Moving Range (IMR) SPC chart is generated as we have covered in
Chapter 6.
• If the standard is in control, then the measurement system is deemed to be in control, and this provides the justification to use the instrument to perform commercial analyses on process samples throughout the day.
• If the standard is not in control the instrument is deemed to be nonconforming
and a Root Cause Analysis must be initiated to identify the source(s) of the
discrepancy.
• Once the discrepancy has been identified and corrected, the standard is re-run on
the instrument and the IMR chart refreshed to prove that the instrument is in
control.
• Figure 7.19 shows daily stability measurements from Case Study III, silica
concentration measurement of Golden Sample disk number two.
Figure 7.19 Measurement System Long Term Stability
[I-MR Chart of Golden Sample 2 Silica Conc by day: Individuals chart with X-bar = 14.67, UCL = 15.101, LCL = 14.239; Moving Range chart with MR-bar = 0.1621, UCL = 0.5295, LCL = 0.]
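• The control limits in Figure 7.19 follow the standard I-MR formulas (X-bar ± 2.66 × MR-bar for the individuals chart, 3.267 × MR-bar for the moving range UCL); the sketch below shows the calculation with assumed daily golden-sample readings.

import numpy as np

def imr_limits(x):
    """Individuals and moving-range control limits (constants 2.66 and 3.267
    are the standard d2-based factors for a moving range of 2)."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))
    x_bar, mr_bar = x.mean(), mr.mean()
    return {
        "X-bar": x_bar, "I UCL": x_bar + 2.66 * mr_bar, "I LCL": x_bar - 2.66 * mr_bar,
        "MR-bar": mr_bar, "MR UCL": 3.267 * mr_bar, "MR LCL": 0.0,
    }

daily_silica = [14.7, 14.6, 14.8, 14.7, 14.5, 14.9, 14.6, 14.7]   # assumed daily readings, wt% silica
print(imr_limits(daily_silica))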
Metrology Correlation and Matching
• Metrology correlation is utilized when comparing two measurement systems.
• This includes the sample preparation steps required before the actual
measurement is conducted as this is part of the measurement system.
• Metrology correlation and matching assessment is performed when replacing an
existing metrology tool with a new metrology tool, expanding measurement
capacity by adding a second tool, comparing customer metrology to supplier
metrology or comparing a metrology tool at one site to a metrology tool at
another site.
• Metrology correlation analysis is conducted when the two metrology tools are not
required to deliver the exact same output.
• This occurs when the equipment, fixtures, procedures and environment of the two
measurement tools are not exactly the same.
• This is a common situation when comparing customer metrology to supplier
metrology.
• Metrology matching analysis is conducted when the two metrology tools are
required to deliver exactly the same output.
• This is a typical condition where a specification exists for a critical quality
characteristic.
• Before conducting metrology correlation and matching there are some
prerequisites.
• You must ensure metrologies are accurate, capable, and stable.
• This means that the two measurement systems under consideration must have
passed the success criterion for instrument detection limit, method detection limit,
accuracy, linearity, Gage R&R and long term stability.
• Correlation and matching is most likely to be successful if the measurement
procedures are standardized.
• Select a minimum of sixteen samples to be measured on both metrology tools.
• Samples should be selected such that they span the measurement range of
interest (for example – the spec range).
• Avoid clustered samples around a certain measurement value.
• If necessary, manufacture samples to cover the spec range.
• It is acceptable to include out of spec high and low samples.
• In order for two measurement systems to be correlated, R-squared of the least
squares regression line of the current instrument vs the proposed instrument must
be 75% or higher.
• If matching is desired, there are two additional requirements: the 95% confidence interval of the slope of the orthogonal regression line must include 1.0, and a paired t-test must pass (i.e., the 95% confidence interval of the mean difference must include zero).
• This ensures that bias between the two instruments is not significant.
• Let us revisit Penelope Banks at Minnesota Polymer to better understand
metrology correlation and matching protocol.
• Penny has requisitioned a redundant XRF-EDS to serve as a critical back-up to the existing XRF-EDS instrument and to provide analysis capacity expansion for the future.
• She has been submitting samples for analysis to both instruments for the last
sixteen weeks and has collected the following results.
• Please refer to Figure 7.20 for correlation and matching analysis steps.
Sample No. XRF-EDS1 XRF-EDS2
160403-2359D 14.2 14.4
160410-1600A 15.3 15.1
160414-0200B 13.7 13.5
160421-1400C 16.8 17.0
160427-0830C 13.5 13.3
160504-0300D 15.1 15.1
160510-1030A 13.3 13.2
160518-0100B 16.4 16.2
160525-1615C 16.6 16.5
160601-2330D 14.3 14.5
160608-0500D 15.7 15.9
160616-1330A 13.8 13.6
160625-1515C 15.7 15.8
160630-0420D 16.2 16.0
160707-2230B 13.5 13.7
160715-1920B 16.8 17.0
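• The three acceptance checks (R-squared of the least squares fit, orthogonal regression slope confidence interval, paired t-test) can be reproduced outside Minitab; the Python sketch below applies scipy's ODR and paired t-test to the sixteen paired results above, as an illustrative alternative to the Figure 7.20 steps that follow.

import numpy as np
from scipy import stats, odr

xrf1 = np.array([14.2, 15.3, 13.7, 16.8, 13.5, 15.1, 13.3, 16.4,
                 16.6, 14.3, 15.7, 13.8, 15.7, 16.2, 13.5, 16.8])
xrf2 = np.array([14.4, 15.1, 13.5, 17.0, 13.3, 15.1, 13.2, 16.2,
                 16.5, 14.5, 15.9, 13.6, 15.8, 16.0, 13.7, 17.0])

# 1) Correlation: R-squared of the least squares fit must be >= 75%
r_squared = np.corrcoef(xrf1, xrf2)[0, 1] ** 2

# 2) Matching check A: orthogonal regression slope CI should include 1.0
out = odr.ODR(odr.RealData(xrf1, xrf2), odr.unilinear).run()
slope, slope_se = out.beta[0], out.sd_beta[0]
t_crit = stats.t.ppf(0.975, df=len(xrf1) - 2)
slope_ci = (slope - t_crit * slope_se, slope + t_crit * slope_se)

# 3) Matching check B: paired t-test CI of the mean difference should include zero
t_stat, p_value = stats.ttest_rel(xrf1, xrf2)

print(f"R-sq = {r_squared:.1%}, slope 95% CI = ({slope_ci[0]:.3f}, {slope_ci[1]:.3f}), "
      f"paired-t p = {p_value:.3f}")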
Figure 7.20 Metrology Correlation and Matching Steps
Open a new worksheet. Copy and paste the measurement data from the two instruments into the worksheet.
Click on Graph → Scatterplot on the top menu.
Select With Regression in the dialogue box. Click OK.
Select the reference instrument XRF-EDS1 for the X variables and XRF-EDS2 for the Y variables. Click OK.
A scatter plot is produced with least squares regression line.
[Scatterplot of XRF-EDS2 vs XRF-EDS1 with least squares regression line; both axes span approximately 13 to 17 wt % silica.]
Hover your cursor over the least squares regression line. The R-sq = 98.1%. Correlation is good.
Return to the worksheet. Click on Stat → Regression → Orthogonal Regression on the top menu.
Select XRF-EDS2 for the Response (Y) and the reference instrument XRF-EDS1 for the Predictor (X) variables. Click Options.
Select 95 for the Confidence level. Click OK → then click OK one more time.
A scatter plot is produced with orthogonal regression line.
[Plot of XRF-EDS2 vs XRF-EDS1 with fitted orthogonal regression line; both axes span approximately 13 to 17 wt % silica.]
Click on Window → Session on the top menu. The session window indicates that the 95% Confidence Interval of the slope includes 1.0.
The two instruments agree linearly across the measurement range, with no significant proportional bias.
Return to the worksheet. Click on Stat → Basic Statistics → Paired t on the top menu.
Select XRF-EDS1 for Sample 1 and XRF-EDS2 for Sample 2 in the dialogue box. Click Options.
Select 95.0 for Confidence level. Select 0.0 for Hypothesized difference. Select Difference ≠ hypothesized difference for Alternative
hypothesis in the dialogue box. Click OK. Then click OK one more time.
The session window indicates that the 95% confidence interval for the mean difference includes zero. The P-Value for the paired t-Test
is above the significance level of 0.05. Therefore we may not reject the null hypothesis. There is no significant bias between the two
instruments.
• Penelope has proven that XRF-EDS2 is correlated and matched to XRF-EDS1.
• She may now use XRF-EDS2 for commercial shipment releases including
Certificates of Analysis to her customers.
References
Warner, Kent; Martinich, Dave; Wenz, Paul. Measurement Capability and Correlation, Revision 4.0.2, Intel, Santa Clara, CA, 2010.
AIAG. Measurement Systems Analysis, Fourth Edition, Automotive Industry Action Group, Southfield, MI, 2010.
Breyfogle, Forrest W., III. Implementing Six Sigma, Second Edition, John Wiley & Sons, Hoboken, NJ, 2003.
George, M.; Maxey, P.; Price, M.; Rowlands, D. The Lean Six Sigma Pocket Toolbook, McGraw-Hill, New York, NY, 2005.
Wedgwood, Ian D. Lean Sigma – A Practitioner’s Guide, Prentice Hall, Boston, MA, 2007.
40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B, Environmental Protection Agency, Washington, DC, 2012.
Internet Resources
• Automotive Industry Action Group
• Method Detection Limit (MDL) Calculators | CHEMIASOFT
• 40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B (Subchapter D)
• Percentiles of the t-Distribution: http://sites.stat.psu.edu/~mga/401/tables/t.pdf
MSA R&R for training in manufacturing industryMSA R&R for training in manufacturing industry
MSA R&R for training in manufacturing industry
 
6Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)
6Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)6Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)
6Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)
 
Introduction to SPC
Introduction to SPCIntroduction to SPC
Introduction to SPC
 
Measurement Systems Analysis - Variable Gage R&R Study Metrics, Applications ...
Measurement Systems Analysis - Variable Gage R&R Study Metrics, Applications ...Measurement Systems Analysis - Variable Gage R&R Study Metrics, Applications ...
Measurement Systems Analysis - Variable Gage R&R Study Metrics, Applications ...
 
Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)
Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)
Six sigma-in-measurement-systems-evaluating-the-hidden-factory (2)
 
QA QC Program for Waste Water Analysis ppt
QA QC Program for Waste Water Analysis pptQA QC Program for Waste Water Analysis ppt
QA QC Program for Waste Water Analysis ppt
 
Process capability
Process capabilityProcess capability
Process capability
 
Design of Experiments
Design of ExperimentsDesign of Experiments
Design of Experiments
 
DEFINITIONS- CALIBRATION.pptx
DEFINITIONS- CALIBRATION.pptxDEFINITIONS- CALIBRATION.pptx
DEFINITIONS- CALIBRATION.pptx
 
Quality Journey- Measurement System Analysis .pdf
Quality Journey- Measurement System Analysis .pdfQuality Journey- Measurement System Analysis .pdf
Quality Journey- Measurement System Analysis .pdf
 
Measurement system analysis Presentation.ppt
Measurement system analysis Presentation.pptMeasurement system analysis Presentation.ppt
Measurement system analysis Presentation.ppt
 
Rodebaugh sixsigma[1]
Rodebaugh sixsigma[1]Rodebaugh sixsigma[1]
Rodebaugh sixsigma[1]
 
Measuremen Systems Analysis Training Module
Measuremen Systems Analysis Training ModuleMeasuremen Systems Analysis Training Module
Measuremen Systems Analysis Training Module
 
Statistical Quality Control
Statistical Quality ControlStatistical Quality Control
Statistical Quality Control
 
Statistical_Quality_Control.ppt
Statistical_Quality_Control.pptStatistical_Quality_Control.ppt
Statistical_Quality_Control.ppt
 
Statistical Process Control Part 2
Statistical Process Control Part 2Statistical Process Control Part 2
Statistical Process Control Part 2
 
Critical Checks for Pharmaceuticals and Healthcare: Validating Your Data Inte...
Critical Checks for Pharmaceuticals and Healthcare: Validating Your Data Inte...Critical Checks for Pharmaceuticals and Healthcare: Validating Your Data Inte...
Critical Checks for Pharmaceuticals and Healthcare: Validating Your Data Inte...
 

More from Ronald Shewchuk

Project Management
Project ManagementProject Management
Project Management
Ronald Shewchuk
 
Operational Excellence – Getting Started
Operational Excellence – Getting StartedOperational Excellence – Getting Started
Operational Excellence – Getting Started
Ronald Shewchuk
 
Bad Actor Analysis
Bad Actor AnalysisBad Actor Analysis
Bad Actor Analysis
Ronald Shewchuk
 
Evolutionary Operation
Evolutionary OperationEvolutionary Operation
Evolutionary Operation
Ronald Shewchuk
 
Reliability Centered Maintenance
Reliability Centered MaintenanceReliability Centered Maintenance
Reliability Centered Maintenance
Ronald Shewchuk
 
Process Control
Process ControlProcess Control
Process Control
Ronald Shewchuk
 
5S Plus Safety
5S Plus Safety5S Plus Safety
5S Plus Safety
Ronald Shewchuk
 
Value stream mapping
Value stream mappingValue stream mapping
Value stream mapping
Ronald Shewchuk
 
Response Surface Regression
Response Surface RegressionResponse Surface Regression
Response Surface Regression
Ronald Shewchuk
 
Solving Manufacturing Problems
Solving Manufacturing ProblemsSolving Manufacturing Problems
Solving Manufacturing Problems
Ronald Shewchuk
 

More from Ronald Shewchuk (10)

Project Management
Project ManagementProject Management
Project Management
 
Operational Excellence – Getting Started
Operational Excellence – Getting StartedOperational Excellence – Getting Started
Operational Excellence – Getting Started
 
Bad Actor Analysis
Bad Actor AnalysisBad Actor Analysis
Bad Actor Analysis
 
Evolutionary Operation
Evolutionary OperationEvolutionary Operation
Evolutionary Operation
 
Reliability Centered Maintenance
Reliability Centered MaintenanceReliability Centered Maintenance
Reliability Centered Maintenance
 
Process Control
Process ControlProcess Control
Process Control
 
5S Plus Safety
5S Plus Safety5S Plus Safety
5S Plus Safety
 
Value stream mapping
Value stream mappingValue stream mapping
Value stream mapping
 
Response Surface Regression
Response Surface RegressionResponse Surface Regression
Response Surface Regression
 
Solving Manufacturing Problems
Solving Manufacturing ProblemsSolving Manufacturing Problems
Solving Manufacturing Problems
 

Recently uploaded

CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
karthi keyan
 
Student information management system project report ii.pdf
Student information management system project report ii.pdfStudent information management system project report ii.pdf
Student information management system project report ii.pdf
Kamal Acharya
 
Planning Of Procurement o different goods and services
Planning Of Procurement o different goods and servicesPlanning Of Procurement o different goods and services
Planning Of Procurement o different goods and services
JoytuBarua2
 
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&BDesign and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Sreedhar Chowdam
 
6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)
ClaraZara1
 
Investor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptxInvestor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptx
AmarGB2
 
weather web application report.pdf
weather web application report.pdfweather web application report.pdf
weather web application report.pdf
Pratik Pawar
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
Kamal Acharya
 
AP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specificAP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specific
BrazilAccount1
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
MdTanvirMahtab2
 
一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理
一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理
一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理
ydteq
 
Hierarchical Digital Twin of a Naval Power System
Hierarchical Digital Twin of a Naval Power SystemHierarchical Digital Twin of a Naval Power System
Hierarchical Digital Twin of a Naval Power System
Kerry Sado
 
ML for identifying fraud using open blockchain data.pptx
ML for identifying fraud using open blockchain data.pptxML for identifying fraud using open blockchain data.pptx
ML for identifying fraud using open blockchain data.pptx
Vijay Dialani, PhD
 
Immunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary AttacksImmunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary Attacks
gerogepatton
 
MCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdfMCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdf
Osamah Alsalih
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
Robbie Edward Sayers
 
English lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdfEnglish lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdf
BrazilAccount1
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
VENKATESHvenky89705
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
Neometrix_Engineering_Pvt_Ltd
 
Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024
Massimo Talia
 

Recently uploaded (20)

CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
 
Student information management system project report ii.pdf
Student information management system project report ii.pdfStudent information management system project report ii.pdf
Student information management system project report ii.pdf
 
Planning Of Procurement o different goods and services
Planning Of Procurement o different goods and servicesPlanning Of Procurement o different goods and services
Planning Of Procurement o different goods and services
 
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&BDesign and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
 
6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)
 
Investor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptxInvestor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptx
 
weather web application report.pdf
weather web application report.pdfweather web application report.pdf
weather web application report.pdf
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
 
AP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specificAP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specific
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
 
一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理
一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理
一比一原版(UofT毕业证)多伦多大学毕业证成绩单如何办理
 
Hierarchical Digital Twin of a Naval Power System
Hierarchical Digital Twin of a Naval Power SystemHierarchical Digital Twin of a Naval Power System
Hierarchical Digital Twin of a Naval Power System
 
ML for identifying fraud using open blockchain data.pptx
ML for identifying fraud using open blockchain data.pptxML for identifying fraud using open blockchain data.pptx
ML for identifying fraud using open blockchain data.pptx
 
Immunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary AttacksImmunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary Attacks
 
MCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdfMCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdf
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
 
English lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdfEnglish lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdf
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
 
Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024
 

Measurement System Analysis

• Accuracy is influenced by resolution, bias, linearity and stability, whereas precision is influenced by the repeatability and reproducibility of the measurement system.
• Repeatability is the variation which occurs when the same operator repeatedly measures the same sample on the same instrument under the same conditions.
• Reproducibility is the variation which occurs between two or more instruments or operators measuring the same sample with the same measurement method in a stable environment.
• The total variance in a quality characteristic of a process is described by Eqn 7.1 and Eqn 7.2.
• The percent contribution of the measurement system to the total variance may be calculated from Eqn 7.3.
• We want to be able to measure true variations in product quality and not variations in the measurement system, so it is desired to minimize σ²measurement.
• We will review the steps in a typical measurement system analysis by way of example, first for the case of variables data and then for the case of attribute data.
σ²total = σ²product + σ²measurement    (Eqn 7.1)
  where σ²total = total variance, σ²product = variance due to product, σ²measurement = variance due to measurement system

σ²measurement = σ²repeatability + σ²reproducibility    (Eqn 7.2)
  where σ²repeatability = variance within operator/device combination, σ²reproducibility = variance between operators

% Contribution = (σ²repeatability + σ²reproducibility) / σ²total × 100    (Eqn 7.3)
Instrument Detection Limit (IDL)
• Today's measurement devices are an order of magnitude more complex than the "gages" for which the Automotive Industry Action Group (AIAG) first developed Gage Repeatability and Reproducibility (Gage R&R) studies.
• Typically they are electromechanical devices with internal microprocessors having inherent signal-to-noise ratios.
• The Instrument Detection Limit (IDL) should be calculated from the baseline noise of the instrument.
• Let us examine the case where a gas chromatograph (GC) is being used to measure the concentration of some analyte of interest. Refer to Figure 7.4.
Figure 7.4 Gas Chromatogram
• The chromatogram has a baseline with peaks at different column retention times for hydrogen, argon, oxygen, nitrogen, methane and carbon monoxide.
• Let's say we wanted to calculate the IDL for nitrogen at retention time 5.2 min.
• We would purge and evacuate the column to make sure it is clean, then successively inject seven blanks of the carrier gas (helium).
• The baseline noise peak at retention time 5.2 min is integrated for each of the blank injections and converted to concentration units of nitrogen.
• The standard deviation of these concentrations is multiplied by the Student's t statistic for n − 1 degrees of freedom at the 99% confidence level (3.143 for six degrees of freedom) to calculate the IDL.
• This is the EPA protocol as defined in 40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B.
• Refer to Figure 7.5 below for the calculation summary.
Figure 7.5 Instrument Detection Limit (IDL) Calculation

Injection No.   N2 (ppm)
1               0.01449
2               0.01453
3               0.01456
4               0.01459
5               0.01442
6               0.01440
7               0.01447
StDev           0.00007044
Mean            0.01449
RSD             0.49%
IDL             0.0002214

Percentiles of the t-Distribution
df     90.0%    95.0%    97.5%    99.0%    99.5%    99.9%
1      3.078    6.314    12.706   31.821   63.657   318.309
2      1.886    2.92     4.303    6.965    9.925    22.327
3      1.638    2.353    3.183    4.541    5.841    10.215
4      1.533    2.132    2.777    3.747    4.604    7.173
5      1.476    2.015    2.571    3.365    4.032    5.893
6      1.44     1.943    2.447    3.143    3.708    5.208
7      1.415    1.895    2.365    2.998    3.5      4.785
8      1.397    1.86     2.306    2.897    3.355    4.501
9      1.383    1.833    2.262    2.822    3.25     4.297
10     1.372    1.812    2.228    2.764    3.169    4.144

df = n − 1
IDL = t(df, 1 − α = 0.99) × StDev
IDL = 3.143 × 0.00007044 = 0.0002214 ppm N2
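The IDL arithmetic in Figure 7.5 is easy to reproduce outside a spreadsheet. Below is a minimal Python sketch (assuming NumPy and SciPy are available; the blank-injection values are those listed above) that multiplies the sample standard deviation of the seven blanks by the one-sided 99% Student's t value for n − 1 degrees of freedom.

```python
import numpy as np
from scipy import stats

# Seven blank (carrier-gas) injections converted to ppm N2 (Figure 7.5)
blanks_ppm = np.array([0.01449, 0.01453, 0.01456, 0.01459, 0.01442, 0.01440, 0.01447])

n = len(blanks_ppm)
s = blanks_ppm.std(ddof=1)            # sample standard deviation
t_99 = stats.t.ppf(0.99, df=n - 1)    # one-sided 99% t value (3.143 for 6 df)

idl = t_99 * s
print(f"StDev = {s:.8f} ppm, t = {t_99:.3f}, IDL = {idl:.7f} ppm N2")
```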
Method Detection Limit (MDL)
• The Method Detection Limit (MDL) is defined as the minimum concentration of a substance that can be measured and reported with 99% confidence that the analyte concentration is greater than zero, as determined from analysis of a sample in a given matrix containing the analyte.
• MDL is calculated in a similar way to IDL, with the exception that the same sample is measured on the instrument with n = 7 trials and the sample is disconnected and reconnected to the measurement apparatus between trials.
• This is called dynamic repeatability analysis.
• An estimate is made of the MDL and a sample prepared at or near this MDL concentration.
• The seven trials are then measured on the instrument and the MDL calculated as in Figure 7.6.
• MDL divided by the mean of the seven trials should be within 10-100%.
• If this is not the case, repeat the MDL analysis with a starting sample concentration closer to the calculated MDL.
Figure 7.6 Method Detection Limit (MDL) Calculation

Injection No.   N2 (ppm)
1               0.3596
2               0.3010
3               0.3227
4               0.3239
5               0.3335
6               0.3196
7               0.3365
StDev           0.01801
Mean            0.3281
RSD             5.49%
MDL             0.05660
MDL/X-bar       17.2%

df = n − 1 (percentiles of the t-distribution as tabulated in Figure 7.5)
MDL = t(df, 1 − α = 0.99) × StDev
MDL = 3.143 × 0.01801 = 0.05660 ppm N2
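The MDL uses the same t-multiplier, plus the 10-100% check on MDL divided by the trial mean. A sketch using the trial data of Figure 7.6 (again assuming SciPy):

```python
import numpy as np
from scipy import stats

# Seven repeated measurements of the low-level N2 sample (Figure 7.6), ppm
trials_ppm = np.array([0.3596, 0.3010, 0.3227, 0.3239, 0.3335, 0.3196, 0.3365])

s = trials_ppm.std(ddof=1)
mean = trials_ppm.mean()
mdl = stats.t.ppf(0.99, df=len(trials_ppm) - 1) * s

ratio = mdl / mean                    # should land between 10% and 100%
print(f"MDL = {mdl:.5f} ppm N2, MDL/mean = {ratio:.1%}")
if not (0.10 <= ratio <= 1.00):
    print("Re-run the MDL study with a spike concentration closer to the calculated MDL.")
```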
Measurement System Analysis – Variables Data
• A properly conducted measurement system analysis (MSA) can yield a treasure trove of information about your measurement system.
• Repeatability, reproducibility, resolution, bias and precision-to-tolerance ratio are all deliverables of the MSA and can be used to identify areas for improvement in your measurement system.
• It is important to conduct the MSA in the current state since this is your present feedback mechanism for your process.
• Resist the temptation to dust off the Standard Operating Procedure and brief the operators on the correct way to measure the parts.
• Resist the temptation to replace the NIST¹-traceable standard, which looks like it has been kicked around the metrology laboratory a few times.
¹ National Institute of Standards and Technology
• To prepare for an MSA you must collect samples from the process that span the specification range of the measurement in question.
• Include out-of-spec high samples and out-of-spec low samples.
• Avoid creating samples artificially in the laboratory.
• There may be complicating factors in the commercial process which influence your measurement system.
• Include all Operators in the MSA who routinely measure the product.
• The number of samples times the number of Operators should be greater than or equal to fifteen, with three trials for each sample.
• If this is not practical, increase the number of trials as per Figure 7.7.
• Code the samples such that the coding gives no indication of the expected measurement value – this is called blind sample coding.
• Have each sample measured by an outside laboratory.
• These measurements will serve as your reference values.
• Ask each Operator to measure each sample three times in random sequence.
• Ensure that the Operators do not "compare notes".
• We will utilize Minitab to analyze the measurement system described in Case Study III.

Figure 7.7 Measurement System Analysis Design
Samples x Operators   Trials
S x O ≥ 15            3
8 ≤ S x O < 15        4
5 ≤ S x O < 8         5
S x O < 5             6
Case Study III: Minnesota Polymer Co.
Minnesota Polymer Co. supplies a special grade of resin to ABC Molding Co. which includes a silica modifier to improve dimensional stability. The product code is POMBLK-15 and the silica concentration specification is 15 ± 2% by weight. Silica concentration is determined by taking a sample of the powdered resin and pressing it into a 4 cm disk using a 25-ton hydraulic press. The sample disk is then analyzed by x-ray fluorescence energy dispersive spectroscopy (XRF-EDS) to measure the silica content.

Manufacturing POMBLK-15 is difficult. The silica is light and fluffy and sometimes gets stuck in the auger used to feed the mixing tank. A new process engineer, Penelope Banks, has been hired by Minnesota Polymer. One of her first assignments is to improve POMBLK-15 process control. SPC analysis of historical batch silica concentration results has indicated out-of-control symptoms and poor Cpk. Before Penny makes any changes to the process she prudently decides to conduct a measurement system analysis to find out the contribution of the measurement system to the process variation.

Minnesota Polymer is a firm believer in process ownership. The same operator charges the raw materials, runs the manufacturing process, collects the quality control sample, presses the sample disk and then runs the silica analysis on the XRF-EDS instrument. The operator uses the silica concentration analysis results to adjust the silica charge on the succeeding batch. POMBLK-15 is typically run over a five-day period in the three-shift, 24/7 operation.

Penny has collected five powder samples from POMBLK-15 process retains which span the silica specification range, including two out-of-specification samples pulled from quarantine lots. She has asked each of the three shift operators to analyze three samples from each powder bag for silica content, in random order, according to her sampling plan. Penny has sent a portion of each sample powder to the Company's R&D Headquarters in Hong Kong for silica analysis. These results will serve as reference values for each sample. The following table summarizes the silica concentration measurements and Figure 7.8 captures the screen shots of the MSA steps for Case Study III.

Bag #  Reference¹  Op1 T1  Op1 T2  Op1 T3  Op2 T1  Op2 T2  Op2 T3  Op3 T1  Op3 T2  Op3 T3
1      17.3        18.2    17.9    18.2    18.1    18.0    18.0    17.8    17.8    18.2
2      14.0        14.4    14.9    14.8    14.8    14.6    14.8    14.4    14.4    14.5
3      13.3        14.0    13.9    13.8    13.9    14.2    14.0    13.8    13.7    13.8
4      16.7        17.2    17.2    17.4    17.4    17.3    17.5    17.4    17.5    17.5
5      12.0        12.9    12.8    12.5    12.5    12.9    12.8    12.9    12.5    12.6
¹ As reported by Hong Kong R&D Center
Figure 7.8 Measurement System Analysis Steps – Variable Data
• Open a new worksheet. Click on Stat → Quality Tools → Gage Study → Create Gage R&R Study Worksheet on the top menu.
• Enter the Number of Operators, the Number of Replicates and the Number of Parts in the dialogue box. Click OK.
• The worksheet is modified to include a randomized run order of the samples.
• Name the adjoining column Silica Conc and transcribe the random sample measurement data to the relevant cells in the worksheet.
• Click on Stat → Quality Tools → Gage Study → Gage R&R Study (Crossed) on the top menu.
• Select C2 Parts for Part numbers, C3 Operators for Operators and C4 Silica Conc for Measurement data in the dialogue box. Click the radio toggle button for ANOVA under Method of Analysis. Click Options.
• Six (6) standard deviations will account for 99.73% of the measurement system variation. Enter the Lower Spec Limit and Upper Spec Limit in the dialogue box. Click OK. Click OK.
• A new graph is created in the Minitab project file with the Gage R&R analysis results.
• Return to the session by clicking on Window → Session on the top menu to view the ANOVA analytical results.
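For readers who want to see what the ANOVA table produces without Minitab, the sketch below estimates the crossed Gage R&R variance components with a two-way ANOVA plus the standard expected-mean-square algebra (assuming pandas and statsmodels are installed). It is an illustration of the method, not a replica of the Minitab report; Minitab may pool a non-significant Part-by-Operator interaction into repeatability, so the numbers can differ slightly.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Case Study III measurements: {(bag, operator): [trial1, trial2, trial3]}
data = {
    (1, 1): [18.2, 17.9, 18.2], (1, 2): [18.1, 18.0, 18.0], (1, 3): [17.8, 17.8, 18.2],
    (2, 1): [14.4, 14.9, 14.8], (2, 2): [14.8, 14.6, 14.8], (2, 3): [14.4, 14.4, 14.5],
    (3, 1): [14.0, 13.9, 13.8], (3, 2): [13.9, 14.2, 14.0], (3, 3): [13.8, 13.7, 13.8],
    (4, 1): [17.2, 17.2, 17.4], (4, 2): [17.4, 17.3, 17.5], (4, 3): [17.4, 17.5, 17.5],
    (5, 1): [12.9, 12.8, 12.5], (5, 2): [12.5, 12.9, 12.8], (5, 3): [12.9, 12.5, 12.6],
}
df = pd.DataFrame([(p, o, y) for (p, o), ys in data.items() for y in ys],
                  columns=['Part', 'Operator', 'Silica'])

# Two-way ANOVA with interaction, then the usual expected-mean-square algebra
model = ols('Silica ~ C(Part) + C(Operator) + C(Part):C(Operator)', data=df).fit()
aov = sm.stats.anova_lm(model, typ=2)
ms = aov['sum_sq'] / aov['df']                       # mean squares per term

n_parts, n_ops, n_reps = 5, 3, 3
var_repeat = ms['Residual']                                                   # repeatability
var_inter = max((ms['C(Part):C(Operator)'] - ms['Residual']) / n_reps, 0.0)   # interaction
var_oper = max((ms['C(Operator)'] - ms['C(Part):C(Operator)']) / (n_parts * n_reps), 0.0)
var_part = max((ms['C(Part)'] - ms['C(Part):C(Operator)']) / (n_ops * n_reps), 0.0)

var_meas = var_repeat + var_oper + var_inter         # Eqn 7.2
var_total = var_meas + var_part                      # Eqn 7.1
print(f"% Contribution of measurement system = {100 * var_meas / var_total:.2f}%")
```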
• Let us more closely examine the graphical output of the Gage R&R (ANOVA) Report for Silica Conc.
• Figure 7.9 shows the components of variation.
• A good measurement system will have the lion's share of variation coming from the product, not the measurement system.
• Consequently, we would like the bars for repeatability and reproducibility to be small relative to part-to-part variation.
Figure 7.9 MSA Components of Variation
• Figure 7.10 captures the range SPC chart by Operators.
• The range chart should be in control.
• If it is not, a repeatability problem is present.
Figure 7.10 MSA Range Chart by Operators
• By contrast, the X-bar SPC chart of Figure 7.11 should be out of control.
• This seems counterintuitive, but it is a healthy indication that the variability present is due to part-to-part differences rather than Operator-to-Operator differences.
Figure 7.11 MSA X-bar Chart by Operators
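The limits Minitab draws on these R and X-bar charts follow the usual subgroup formulas, with each Operator-Part set of three trials treated as one subgroup. A rough sketch (the constants A2 = 1.023, D3 = 0 and D4 = 2.574 are the standard Shewhart values for subgroups of size 3; the helper name is mine):

```python
import numpy as np

def xbar_r_limits(subgroups):
    """subgroups: 2-D array, one row per subgroup of three repeat measurements."""
    A2, D3, D4 = 1.023, 0.0, 2.574            # Shewhart constants for subgroup size n = 3
    xbars = subgroups.mean(axis=1)
    ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
    xbarbar, rbar = xbars.mean(), ranges.mean()
    return {
        'Xbar': (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),   # LCL, CL, UCL
        'R':    (D3 * rbar, rbar, D4 * rbar),
    }

# e.g. Operator 1's five subgroups from the Case Study III table
op1 = np.array([[18.2, 17.9, 18.2], [14.4, 14.9, 14.8], [14.0, 13.9, 13.8],
                [17.2, 17.2, 17.4], [12.9, 12.8, 12.5]])
print(xbar_r_limits(op1))
```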
• Figure 7.12 is an individual value plot of silica concentration by sample number.
• The circles with a cross indicate the mean of the sample data and the solid circles are individual data points.
• We want a tight grouping around the mean for each sample and we want significant variation between the means of different samples.
• If we do not have variation between samples the MSA has been poorly designed and we essentially have five samples of the same thing.
• This will preclude analysis of the measurement system.
Figure 7.12 MSA Silica Concentration by Sample Number
• Figure 7.13 is a boxplot of silica concentration by Operator.
• As in Figure 7.12, the circles with a cross indicate the mean concentration for all samples by Operator.
• The shaded boxes represent the interquartile range (Q3 − Q1) for each Operator.
• The interquartile range (IQR) is the preferred measure of spread for data sets which are not normally distributed.
• The solid line within the IQR is the median silica concentration of all samples by Operator.
• If Operators are performing the same, we would expect similar means, medians and IQRs.
Figure 7.13 MSA Silica Concentration by Operator
• Figure 7.14 is an individual value plot used to check for Operator-Sample interactions.
• The lines for each Operator should be reasonably parallel to each other.
• Crossing lines indicate the presence of Operator-Sample interactions.
• This can happen when Operators are struggling with samples at or near the MDL, or if the instrument signal-to-noise ratio varies as a function of concentration.
Figure 7.14 MSA Sample by Operator Interaction
• Let us now focus on the analytical output of the session window as captured in Figure 7.8.
• Lovers of Gage R&Rs will typically look for four metrics as defined below and expect these metrics to be within the acceptable or excellent ranges specified by the Gage R&R Metric Rules of Thumb shown in Figure 7.15.

% Contribution = σ²measurement / σ²total × 100    (Eqn 7.4)

% Study Variation = σmeasurement / σtotal × 100    (Eqn 7.5)

% P/T (Two-Sided Spec) = 6σmeasurement / (USL − LSL) × 100    (Eqn 7.6)

% P/T (One-Sided Spec) = 3σmeasurement / TOL × 100    (Eqn 7.7)
Number of Distinct Categories = trunc(1.41 × σtotal / σmeasurement)    (Eqn 7.8)

where
σ²total = total variance
σ²measurement = variance due to measurement system
σtotal = total standard deviation
σmeasurement = standard deviation due to measurement system
P/T = precision-to-tolerance ratio
USL = Upper Spec Limit
LSL = Lower Spec Limit
TOL = Process Mean − LSL for LSL only
TOL = USL − Process Mean for USL only

Figure 7.15 Gage R&R Metrics – Rules of Thumb
Gage R&R Metric                 Unacceptable   Acceptable    Excellent
% Contribution                  > 7.7%         2.0 - 7.7%    < 2%
% Study Variation               > 28%          14 - 28%      < 14%
% P/T Ratio                     > 30%          8 - 30%       < 8%
Number of Distinct Categories   < 5            5 - 10        > 10
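Given the variance components (for example from the ANOVA sketch earlier), Eqns 7.4 through 7.8 reduce to a few lines of arithmetic. A sketch for a two-sided specification; the function name and argument names are illustrative only:

```python
import math

def gage_rr_metrics(var_measurement, var_total, usl, lsl):
    """Eqns 7.4-7.8: % Contribution, % Study Variation, % P/T and ndc."""
    sd_meas, sd_total = math.sqrt(var_measurement), math.sqrt(var_total)
    return {
        'pct_contribution': 100 * var_measurement / var_total,    # Eqn 7.4
        'pct_study_variation': 100 * sd_meas / sd_total,          # Eqn 7.5
        'pct_p_to_t': 100 * 6 * sd_meas / (usl - lsl),            # Eqn 7.6 (two-sided spec)
        'ndc': math.trunc(1.41 * sd_total / sd_meas),             # Eqn 7.8
    }

# Silica spec of Case Study III is 15 ± 2 wt %:
# metrics = gage_rr_metrics(var_meas, var_total, usl=17.0, lsl=13.0)
```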
• The highlighted output of the Minitab session window indicates a % Contribution of the measurement system of 0.55%.
• This is in the excellent region.
• % Study Variation is 7.39%, which is also in the excellent region.
• Precision-to-Tolerance ratio is 25.37%.
• This is in the acceptable region.
• Number of distinct categories is 19, well within the excellent region.
• Overall, this is a good measurement system.
• Now, let us proceed to check for linearity and bias by adding the reference concentrations as measured by the Hong Kong R&D Center for each of the samples to the worksheet.
• Figure 7.16 captures the screen shots necessary for this process.
Figure 7.16 Gage Linearity and Bias Study Steps – Variable Data
• Return to the active worksheet by clicking on Window → Worksheet 1 *** on the top menu. Name the adjoining column Reference Conc and enter the reference sample concentration values corresponding to each sample (Part) number.
• Click on Stat → Quality Tools → Gage Study → Gage Linearity and Bias Study on the top menu.
• Select C2 Parts for Part numbers, C5 Reference Conc for Reference values and C4 Silica Conc for Measurement data in the dialogue box. Click OK.
• A new graph is created in the Minitab project file with the Gage Linearity and Bias Study results.
• We can see there is a bias between the Hong Kong measurement system and Minnesota Polymer's measurement system.
• The bias is relatively constant over the silica concentration range of interest, as indicated by the regression line.
• The Minnesota Polymer measurement system is reading approximately 0.67 wt % silica higher than Hong Kong.
• This is not saying that the Hong Kong instrument is right and the Minnesota Polymer instrument is wrong.
• It is merely saying that there is a difference between the two instruments which must be investigated.
• This difference could have process capability implications if it is validated.
• Minnesota Polymer may be operating in the top half of the allowable spec range.
• The logical next step is for the Hong Kong R&D center to conduct an MSA of similar design, ideally with the same sample set utilized by Minnesota Polymer.
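The bias and linearity picture can be cross-checked with an ordinary least squares fit of bias (measured minus reference) against the reference value: a slope near zero with a non-zero intercept corresponds to the roughly constant offset described above. A sketch using the Case Study III data and SciPy:

```python
import numpy as np
from scipy import stats

reference = np.array([17.3, 14.0, 13.3, 16.7, 12.0])     # Hong Kong results per bag
# Mean of the nine plant measurements (3 operators x 3 trials) for each bag
measured = np.array([[18.2, 17.9, 18.2, 18.1, 18.0, 18.0, 17.8, 17.8, 18.2],
                     [14.4, 14.9, 14.8, 14.8, 14.6, 14.8, 14.4, 14.4, 14.5],
                     [14.0, 13.9, 13.8, 13.9, 14.2, 14.0, 13.8, 13.7, 13.8],
                     [17.2, 17.2, 17.4, 17.4, 17.3, 17.5, 17.4, 17.5, 17.5],
                     [12.9, 12.8, 12.5, 12.5, 12.9, 12.8, 12.9, 12.5, 12.6]]).mean(axis=1)

bias = measured - reference
fit = stats.linregress(reference, bias)    # linearity check: slope near 0 means constant bias
print(f"average bias = {bias.mean():+.2f} wt %, "
      f"slope = {fit.slope:+.3f}, intercept = {fit.intercept:+.2f}")
```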
Measurement System Analysis – Attribute Data
• In our next case we will analyze the measurement system used to rate customer satisfaction as described in Case Study IV below.

Case Study IV: Virtual Cable Co.
David Raffles Lee has just joined Virtual Cable Co., the leading telecommunications company in the southwest, as Chief Executive Officer. David comes to Virtual Cable with over thirty years of operations experience in the telecommunications industry in Singapore. During a tour of one of the Customer Service Centers, David noticed that the customer service agents were all encased in bulletproof glass. David queried the Customer Service Manager, Bob Londale, about this and Bob responded, "It is for the protection of our associates. Sometimes our customers become angry and they produce weapons." David was rather shocked about this and wanted to learn more about customer satisfaction at Virtual Cable. He formed a team to analyze the measurement of customer satisfaction.

This team prepared ten scripts of typical customer complaints with an intended outcome of pass (customer was satisfied with the customer service agent's response) or fail (customer was dissatisfied with the response). Twenty "customers" were coached on the scripts, one script for two customers. These customers committed the scripts to memory and presented their service issue to three different Customer Service Agents at three different Customer Service Centers. Each customer was issued an account number and profile to allow the Customer Service Agent to rate the customer's satisfaction level in the customer feedback database as required by Virtual Cable's policy. The results are summarized in the table below and analyzed by the MSA attribute data steps of Figure 7.17.

Script #  Reference¹  Op1 Rep 1  Op1 Rep 2  Op2 Rep 1  Op2 Rep 2  Op3 Rep 1  Op3 Rep 2
1         F           F          F          F          F          F          F
2         P           P          P          P          P          P          P
3         P           P          P          P          P          P          P
4         P           P          P          P          P          P          P
5         F           F          F          F          P          F          F
6         P           P          P          P          P          P          P
7         F           F          F          F          F          F          F
8         F           F          F          F          F          F          F
9         P           P          F          P          P          F          P
10        F           F          F          F          F          F          F
¹ Intended outcome of script from Customer Satisfaction Team
Figure 7.17 Measurement System Analysis Steps – Attribute Data
• Open a new worksheet. Click on Stat → Quality Tools → Create Attribute Agreement Analysis Worksheet on the top menu.
• Enter the Number of samples, the Number of appraisers and the Number of replicates in the dialogue box. Click OK.
• The worksheet is modified to include a randomized run order of the scripts (samples).
• Name the adjoining columns Response and Reference. Transcribe the satisfaction level rating and the reference value of the script to the appropriate cells.
• Click on Stat → Quality Tools → Attribute Agreement Analysis on the top menu.
• Select C4 Response for Attribute column, C2 Samples for Samples and C3 Appraisers for Appraisers in the dialogue box. Select C5 Reference for Known standard/attribute. Click OK.
• A new graph is created in the Minitab project file with the Assessment Agreement results (Within Appraisers and Appraiser vs Standard, each with a 95% CI by appraiser).
• Display the analytical MSA Attribute Agreement results by clicking on Window → Session on the top menu.
• The attribute MSA results allow us to determine the percentage overall agreement, the percentage agreement within appraisers (repeatability), the percentage agreement between appraisers (reproducibility), the percentage agreement with reference values (accuracy) and the Kappa Value (an index used to determine how much better the measurement system is than random chance).
• From the graphical results we can see that each Customer Service Agent agreed with his or her own repeat assessment 90% of the time and matched the expected (standard) result 90% of the time.
• From the analytical results we can see that the agreement between appraisers was 80% and the overall agreement vs the standard values was 80%.
• The Kappa Value for all appraisers vs the standard values was 0.90, indicative of excellent agreement between the appraised values and the reference values.
• Figure 7.18 provides benchmark interpretations for Kappa Values.
Figure 7.18 Rules of Thumb for Interpreting Kappa Values (Attribute MSA)
Kappa Value    Interpretation
-1.0 to 0.6    Agreement expected as by chance
0.6 to 0.7     Marginal agreement – significant effort required to improve measurement system
0.7 to 0.8     Good agreement – some improvement to measurement system is warranted
0.9 to 1.0     Excellent agreement

• Another way of looking at this case is that out of sixty expected outcomes there were only three miscalls on rating customer satisfaction by the Customer Service Agents included in this study.
• Mr. Lee can have confidence in the feedback of the Virtual Cable customer satisfaction measurement system and proceed to identify and remedy the underlying root causes of customer dissatisfaction.
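The agreement percentages and kappa can also be checked in a few lines of code. The sketch below (assuming scikit-learn is available; the P/F calls are the Case Study IV table transcribed per appraiser and replicate) computes within-appraiser agreement, agreement of both replicates with the standard, and Cohen's kappa of the pooled ratings against the standard. Minitab reports Fleiss' kappa for the pooled assessment, so treat this as a cross-check rather than an exact reproduction.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

standard = list("FPPPFPFFPF")                       # intended outcome per script (1-10)
ratings = {                                          # [rep 1, rep 2] calls per script
    'Appraiser 1': [list("FPPPFPFFPF"), list("FPPPFPFFFF")],
    'Appraiser 2': [list("FPPPFPFFPF"), list("FPPPPPFFPF")],
    'Appraiser 3': [list("FPPPFPFFFF"), list("FPPPFPFFPF")],
}

for appraiser, (rep1, rep2) in ratings.items():
    within = np.mean([a == b for a, b in zip(rep1, rep2)])
    vs_std = np.mean([a == b == s for a, b, s in zip(rep1, rep2, standard)])
    print(f"{appraiser}: within = {within:.0%}, both reps match standard = {vs_std:.0%}")

# Pooled kappa of all 60 calls against the (repeated) standard
all_calls = [c for rep1, rep2 in ratings.values() for c in rep1 + rep2]
all_std = standard * 6
print("kappa vs standard:", round(cohen_kappa_score(all_calls, all_std), 2))
```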
Improving the Measurement System
• Improvements to the measurement system should be focused on the root cause(s) of high measurement system variation.
• If repeatability is poor, consider a more detailed repeatability study using one part and one operator over an extended period of time.
• Ask the operator to measure this one sample twice per day for one month.
• Is the afternoon measurement always greater or always lesser than the morning measurement?
• Perhaps the instrument is not adequately cooled.
• Are the measurements trending up or down during the month?
• This is an indication of instrument drift.
• Is there a gold standard for the instrument?
• This is one part that is representative of production parts, kept in a climate-controlled room, handled only with gloves and carried around on a red velvet pillow.
• Any instrument must have a gold standard.
• Even the kilogram has a gold standard.
• It is a platinum-iridium cylinder held under glass at the Bureau International des Poids et Mesures in Sèvres, France.
• If the gold standard measures differently during the month, the measurement error is not due to the gold standard; it is due to the measurement system.
• Consider whether the instrument and/or samples are affected by temperature, humidity, vibration, dust, etc.
• Set up experiments to validate these effects with data to support your conclusions.
• If you are lobbying for the instrument to be relocated to a climate-controlled clean room, you had better have the data to justify this move.
• If reproducibility is poor, read the Standard Operating Procedure (SOP) in detail.
• Is the procedure crystal clear, without ambiguity which would lead operators to conduct the procedure differently?
• Does the procedure specify instrument calibration before each use?
• Does the procedure indicate what to do if the instrument fails the calibration routine?
• Observe the operators conducting the procedure.
• Are they adhering to the procedure?
• Consider utilizing the operator with the lowest variation as a mentor/coach for the other operators.
• Ensure that the SOP is comprehensive and visual.
• Functional procedures should be dominated by pictures, diagrams, sketches, flow charts, etc. which clearly demonstrate the order of operations and call out the critical points of the procedure.
• Avoid lengthy text SOPs devoid of graphics.
• They do not facilitate memory triangulation – the use of multiple senses to recall learning.
• Refresher training should be conducted annually on SOPs, with supervisor audit of the Operator performing the measurement SOP.
Long Term Stability
• Now that you have performed analyses to establish the Instrument Detection Limit, Method Detection Limit, Accuracy, Linearity and Gage R&R metrics of your measurement system and proven that you have a healthy measurement system, you will need to monitor the measurement system to ensure that it remains healthy.
• Stability is typically monitored through daily measurement of a standard on the instrument in question.
• If a standard is not available, one of the samples from the Gage R&R can be utilized as a "Golden Sample".
• Each day, after the instrument is calibrated, the standard is measured on the instrument.
• An Individuals Moving Range (IMR) SPC chart is generated as we have covered in Chapter 6.
• If the standard is in control, then the measurement system is deemed to be in control and this provides the justification to utilize the instrument to perform commercial analyses on process samples throughout the day.
• If the standard is not in control, the instrument is deemed to be nonconforming and a Root Cause Analysis must be initiated to identify the source(s) of the discrepancy.
• Once the discrepancy has been identified and corrected, the standard is re-run on the instrument and the IMR chart refreshed to prove that the instrument is in control.
• Figure 7.19 shows daily stability measurements from Case Study III, silica concentration measurement of Golden Sample disk number two.
Figure 7.19 Measurement System Long Term Stability – I-MR Chart of Golden Sample 2 Silica Conc
(Individuals chart: X-bar = 14.67, UCL = 15.101, LCL = 14.239. Moving Range chart: MR-bar = 0.1621, UCL = 0.5295, LCL = 0.)
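The I-MR limits in Figure 7.19 follow the usual individuals-chart constants (centre ± 2.66 × average moving range for the individuals chart, 3.267 × average moving range as the upper limit for the moving-range chart). A sketch, assuming the thirty daily golden-sample results are available as a list; the helper name is mine:

```python
import numpy as np

def imr_limits(values):
    """Individuals and moving-range control limits for daily golden-sample checks."""
    values = np.asarray(values, dtype=float)
    mr = np.abs(np.diff(values))          # moving ranges of consecutive days
    xbar, mrbar = values.mean(), mr.mean()
    return {
        'I':  (xbar - 2.66 * mrbar, xbar, xbar + 2.66 * mrbar),   # LCL, CL, UCL
        'MR': (0.0, mrbar, 3.267 * mrbar),
    }

# daily_silica = [...]   # the daily Golden Sample 2 silica results plotted in Figure 7.19
# print(imr_limits(daily_silica))
```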
Metrology Correlation and Matching
• Metrology correlation is utilized when comparing two measurement systems.
• This includes the sample preparation steps required before the actual measurement is conducted, as this is part of the measurement system.
• Metrology correlation and matching assessment is performed when replacing an existing metrology tool with a new metrology tool, expanding measurement capacity by adding a second tool, comparing customer metrology to supplier metrology or comparing a metrology tool at one site to a metrology tool at another site.
• Metrology correlation analysis is conducted when the two metrology tools are not required to deliver the exact same output.
• This occurs when the equipment, fixtures, procedures and environment of the two measurement tools are not exactly the same.
• This is a common situation when comparing customer metrology to supplier metrology.
• Metrology matching analysis is conducted when the two metrology tools are required to deliver exactly the same output.
• This is a typical condition where a specification exists for a critical quality characteristic.
• Before conducting metrology correlation and matching there are some prerequisites.
• You must ensure the metrologies are accurate, capable and stable.
• This means that the two measurement systems under consideration must have passed the success criteria for instrument detection limit, method detection limit, accuracy, linearity, Gage R&R and long term stability.
• Correlation and matching is most likely to be successful if the measurement procedures are standardized.
• Select a minimum of sixteen samples to be measured on both metrology tools.
• Samples should be selected such that they span the measurement range of interest (for example, the spec range).
• 62. Operational Excellence Measurement System Analysis Operational Excellence Metrology Correlation and Matching 1/28/2017 Ronald Morgan Shewchuk 62 • Avoid clustering samples around a certain measurement value. • If necessary, manufacture samples to cover the spec range. • It is acceptable to include out-of-spec high and low samples. • In order for two measurement systems to be considered correlated, the R-squared of the least squares regression line of the current instrument vs the proposed instrument must be 75% or higher. • If matching is desired, there are two additional requirements: the 95% confidence interval of the slope of the orthogonal regression line must include a slope of 1.0, and a paired t-Test must pass (i.e., the 95% confidence interval of the mean difference must include zero). • These requirements ensure that the bias between the two instruments is not significant. • Let us revisit Penelope Banks at Minnesota Polymer to better understand the metrology correlation and matching protocol. • Penny has requisitioned a redundant XRF-EDS to serve as a critical back-up to the existing XRF-EDS instrument and to provide analysis capacity expansion for the future.
• 63. Operational Excellence Measurement System Analysis Operational Excellence Metrology Correlation and Matching 1/28/2017 Ronald Morgan Shewchuk 63 • She has been submitting samples for analysis to both instruments for the last sixteen weeks and has collected the following results. • Please refer to Figure 7.20 for the correlation and matching analysis steps.
Sample No.      XRF-EDS1   XRF-EDS2
160403-2359D    14.2       14.4
160410-1600A    15.3       15.1
160414-0200B    13.7       13.5
160421-1400C    16.8       17.0
160427-0830C    13.5       13.3
160504-0300D    15.1       15.1
160510-1030A    13.3       13.2
160518-0100B    16.4       16.2
160525-1615C    16.6       16.5
160601-2330D    14.3       14.5
160608-0500D    15.7       15.9
160616-1330A    13.8       13.6
160625-1515C    15.7       15.8
160630-0420D    16.2       16.0
160707-2230B    13.5       13.7
160715-1920B    16.8       17.0
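For those who prefer to script the checks rather than follow the Minitab menus in Figure 7.20, a minimal Python sketch of the correlation criterion is shown below, assuming the paired results above are loaded into two arrays; the names xrf_eds1 and xrf_eds2 are chosen for this example. scipy.stats.linregress supplies the least squares fit, and its R-squared should come out close to the 98.1% reported later in the figure.

```python
import numpy as np
from scipy import stats

# Paired results from the sixteen samples (XRF-EDS1 is the reference instrument)
xrf_eds1 = np.array([14.2, 15.3, 13.7, 16.8, 13.5, 15.1, 13.3, 16.4,
                     16.6, 14.3, 15.7, 13.8, 15.7, 16.2, 13.5, 16.8])
xrf_eds2 = np.array([14.4, 15.1, 13.5, 17.0, 13.3, 15.1, 13.2, 16.2,
                     16.5, 14.5, 15.9, 13.6, 15.8, 16.0, 13.7, 17.0])

# Least squares regression of XRF-EDS2 on XRF-EDS1
fit = stats.linregress(xrf_eds1, xrf_eds2)
r_squared = fit.rvalue ** 2
print(f"R-squared = {r_squared:.1%}")
print("Correlation criterion (>= 75%):", "PASS" if r_squared >= 0.75 else "FAIL")
```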
  • 64. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 64 Figure 7.20 Metrology Correlation and Matching Steps Open a new worksheet. Copy and paste the measurement data from the two instruments into the worksheet.
  • 65. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 65 Figure 7.20 Metrology Correlation and Matching Steps Click on Graph → Scatterplot on the top menu.
  • 66. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 66 Figure 7.20 Metrology Correlation and Matching Steps Select With Regression in the dialogue box. Click OK.
  • 67. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 67 Figure 7.20 Metrology Correlation and Matching Steps Select the reference instrument XRF-EDS1 for the X variables and XRF-EDS2 for the Y variables. Click OK.
• 68. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 68 Figure 7.20 Metrology Correlation and Matching Steps A scatter plot is produced with the least squares regression line (Scatterplot of XRF-EDS2 vs XRF-EDS1).
• 69. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 69 Figure 7.20 Metrology Correlation and Matching Steps Hover your cursor over the least squares regression line to display the fit statistics. R-sq = 98.1%, which exceeds the 75% criterion, so the correlation between the two instruments is good.
  • 70. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 70 Figure 7.20 Metrology Correlation and Matching Steps Return to the worksheet. Click on Stat → Regression → Orthogonal Regression on the top menu.
• 71. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 71 Figure 7.20 Metrology Correlation and Matching Steps Select XRF-EDS2 for the Response (Y) and the reference instrument XRF-EDS1 for the Predictor (X). Click Options.
  • 72. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 72 Figure 7.20 Metrology Correlation and Matching Steps Select 95 for the Confidence level. Click OK → then click OK one more time.
• 73. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 73 Figure 7.20 Metrology Correlation and Matching Steps A scatter plot is produced with the orthogonal regression line (Plot of XRF-EDS2 vs XRF-EDS1 with Fitted Line).
• 74. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 74 Figure 7.20 Metrology Correlation and Matching Steps Click on Window → Session on the top menu. The session window indicates that the 95% confidence interval of the slope of the orthogonal regression line includes 1.0, so there is no significant proportional bias between the two instruments; the slope criterion for matching is met.
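A scripted approximation of the orthogonal regression step is sketched below using scipy.odr, which with unweighted data minimizes perpendicular distances and therefore corresponds to Minitab's default error variance ratio of 1. The confidence interval here is a t-based approximation built on the ODR standard error, so treat it as an estimate of the session window output rather than an exact reproduction; the array names and the straight_line helper are assumptions of this sketch.

```python
import numpy as np
from scipy import odr, stats

xrf_eds1 = np.array([14.2, 15.3, 13.7, 16.8, 13.5, 15.1, 13.3, 16.4,
                     16.6, 14.3, 15.7, 13.8, 15.7, 16.2, 13.5, 16.8])
xrf_eds2 = np.array([14.4, 15.1, 13.5, 17.0, 13.3, 15.1, 13.2, 16.2,
                     16.5, 14.5, 15.9, 13.6, 15.8, 16.0, 13.7, 17.0])

def straight_line(beta, x):
    return beta[0] * x + beta[1]          # beta[0] = slope, beta[1] = intercept

model = odr.Model(straight_line)
data = odr.RealData(xrf_eds1, xrf_eds2)   # unweighted: equal error variance on both axes
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()

slope, slope_se = fit.beta[0], fit.sd_beta[0]
dof = len(xrf_eds1) - 2                   # n samples minus two fitted parameters
t_crit = stats.t.ppf(0.975, dof)
ci = (slope - t_crit * slope_se, slope + t_crit * slope_se)

print(f"Orthogonal slope = {slope:.4f}, 95% CI = ({ci[0]:.4f}, {ci[1]:.4f})")
print("Slope criterion (CI includes 1.0):", "PASS" if ci[0] <= 1.0 <= ci[1] else "FAIL")
```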
  • 75. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 75 Figure 7.20 Metrology Correlation and Matching Steps Return to the worksheet. Click on Stat → Basic Statistics → Paired t on the top menu.
  • 76. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 76 Figure 7.20 Metrology Correlation and Matching Steps Select XRF-EDS1 for Sample 1 and XRF-EDS2 for Sample 2 in the dialogue box. Click Options.
  • 77. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 77 Figure 7.20 Metrology Correlation and Matching Steps Select 95.0 for Confidence level. Select 0.0 for Hypothesized difference. Select Difference ≠ hypothesized difference for Alternative hypothesis in the dialogue box. Click OK. Then click OK one more time.
• 78. Operational Excellence Measurement System Analysis Operational Excellence 1/28/2017 Ronald Morgan Shewchuk 78 Figure 7.20 Metrology Correlation and Matching Steps The session window indicates that the 95% confidence interval for the mean difference includes zero. The P-Value for the paired t-Test is above the significance level of 0.05; therefore, we fail to reject the null hypothesis. There is no significant bias between the two instruments (see the sketch of this check below). • Penelope has proven that XRF-EDS2 is correlated and matched to XRF-EDS1. • She may now use XRF-EDS2 for commercial shipment releases, including Certificates of Analysis to her customers.
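The paired t-Test and the confidence interval of the mean difference can be checked the same way; the sketch below applies scipy.stats.ttest_rel to the same paired data and computes the 95% confidence interval of the mean difference directly. It is a minimal illustration of the bias criterion, with the same assumed array names as the earlier sketches, not the original Minitab analysis.

```python
import numpy as np
from scipy import stats

xrf_eds1 = np.array([14.2, 15.3, 13.7, 16.8, 13.5, 15.1, 13.3, 16.4,
                     16.6, 14.3, 15.7, 13.8, 15.7, 16.2, 13.5, 16.8])
xrf_eds2 = np.array([14.4, 15.1, 13.5, 17.0, 13.3, 15.1, 13.2, 16.2,
                     16.5, 14.5, 15.9, 13.6, 15.8, 16.0, 13.7, 17.0])

res = stats.ttest_rel(xrf_eds1, xrf_eds2)

# 95% confidence interval of the mean paired difference
diff = xrf_eds1 - xrf_eds2
n = len(diff)
margin = stats.t.ppf(0.975, n - 1) * diff.std(ddof=1) / np.sqrt(n)
ci = (diff.mean() - margin, diff.mean() + margin)

print(f"Paired t: t = {res.statistic:.3f}, p = {res.pvalue:.3f}")
print(f"95% CI of mean difference = ({ci[0]:.4f}, {ci[1]:.4f})")
print("Bias criterion (CI includes 0):", "PASS" if ci[0] <= 0.0 <= ci[1] else "FAIL")
```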
• 79. Operational Excellence Measurement System Analysis Operational Excellence References 1/28/2017 Ronald Morgan Shewchuk 79
Warner, Kent, Martinich, Dave and Wenz, Paul, Measurement Capability and Correlation, Revision 4.0.2, Intel, Santa Clara, CA, 2010
AIAG, Measurement Systems Analysis, Fourth Edition, Automotive Industry Action Group, Southfield, MI, 2010
Breyfogle, Forrest W., III, Implementing Six Sigma, Second Edition, John Wiley & Sons, Hoboken, NJ, 2003
George, M., Maxey, P., Price, M. and Rowlands, D., The Lean Six Sigma Pocket Toolbook, McGraw-Hill, New York, NY, 2005
Wedgwood, Ian D., Lean Sigma – A Practitioner’s Guide, Prentice Hall, Boston, MA, 2007
40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B, Environmental Protection Agency, Washington, DC, 2012
• 80. Operational Excellence Measurement System Analysis Operational Excellence Internet Resources 1/28/2017 Ronald Morgan Shewchuk 80 • Automotive Industry Action Group • Method Detection Limit (MDL) Calculators | CHEMIASOFT • 40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B (Subchapter D) • Percentiles of the t-Distribution – http://sites.stat.psu.edu/~mga/401/tables/t.pdf