OMTEC 2017
FDA Update
FDA's focus as reflected in FDA-483 Observations, with trends over time (historical view)
On the horizon: FDA's Case for Quality Initiative and Medical Device Quality Metrics, with practical application (future view)
- Quality and Regulatory Solutions
- FDA Compliance
- Quality System Management & Readiness
- Supplier Quality Management
- Metrics & Data
Eduard Toerek, President - QUARA Innovations
etoerek@quarainnovations.com
Reinhold Toerek, Vice-President - QUARA Innovations
rtoerek@quarainnovations.com
www.quarainnovations.com © 2017 Quara Innovations LLC. For permission to use, contact info@quarainnovations.com
Key Findings, CY2016
Source: FDA's Field Accomplishment and Compliance Tracking System (FACTS)
o Slight increase in overall number of
QS Surveillance Inspections in CY16
vs. CY15
o Increase in No Action Indicated (NAI)
inspection outcomes (both domestic
and foreign)
o Fewer 483s were issued
to firms in CY16
o All QS subsystems saw a drop in
the number of 483 observations
o Increase in Foreign Inspections and
decrease in Domestic Inspections
(consistent with the increase in
foreign firms actively registered
and listed)
o Number of Warning Letters (WLs) issued dropped from 121 (CY15) to 57 (CY16)
FDA Establishment Inspections
Form FDA-483 Inspectional Observations
• Issued at the conclusion of an inspection when an investigator has observed conditions that may constitute violations of the Food, Drug, and Cosmetic (FD&C) Act and related Acts
• Used to document concerns discovered during FDA Inspections
• Multiple observations may be listed on the FDA-483 Form
• Note: FDA-483s do NOT cover every possible deviation from
law and regulation
FDA Establishment Inspections
Inspection Outcomes / Classification
• Official Action Indicated (OAI)
– An OAI inspection classification occurs when significant objectionable conditions or
practices were found and regulatory action is warranted to address the
establishment's lack of compliance with statute(s) or regulation(s).
• Voluntary Action Indicated (VAI)
– A VAI inspection classification occurs when objectionable conditions or practices were
found that do not meet the threshold of regulatory significance. Inspections classified with
VAI violations are typically more technical violations of the FDCA.
• No Action Indicated (NAI)
– An NAI inspection classification occurs when no objectionable conditions or practices were found during the
inspection or the significance of the documented objectionable conditions found does not justify further actions.
• If no enforcement action is contemplated, or after enforcement action is concluded, FDA provides
inspected establishments with a final inspection report, called an Establishment Inspection Report (EIR)
Source: www.fda.gov
Top 10 Foreign Inspection Locations (CY15 – CY16)

CY 2015                         CY 2016
Country          Inspections    Country          Inspections
China            126            China            179
Germany          90             Germany          71
Japan            44             Japan            60
Canada           42             United Kingdom   50
United Kingdom   35             Taiwan           35
Taiwan           35             France           29
France           30             Switzerland      29
Italy            26             Italy            27
S. Korea         22             Canada           26
Ireland          19             Ireland          25
Source: FDA
QS Medical Device Inspections Outcomes, FY 2016

Domestic   Count    %      Foreign   Count   %
NAI        779      54%    NAI       351     48%
VAI        567      39%    VAI       288     40%
OAI        104      7%     OAI       86      12%
Total      1,450           Total     725
Source: FDA
CAPA and P&PC continue to be the most frequently cited QS subsystems. (Chart)
Source: www.fda.gov
21 CFR 820 QS Regulation Subsystems

P&PC (Production & Process Controls)
• 820.50 Purchasing controls
• 820.60 Identification
• 820.65 Traceability
• 820.70 Production and process controls
• 820.72 Inspection, measuring, and test equipment
• 820.75 Process validation
• 820.80 Receiving, in-process, and finished device acceptance
• 820.86 Acceptance status
• 820.120 Device labeling
• 820.130 Device packaging
• 820.140 Handling
• 820.150 Storage
• 820.160 Distribution
• 820.170 Installation
• 820.200 Servicing
• 820.250 Statistical techniques

CAPA (Corrective and Preventive Action)
• 820.90 Nonconforming product
• 820.100 Corrective and preventive action
• 820.198 Complaint files

MGMT (Management)
• 820.5 Quality system
• 820.20 Management responsibility
• 820.22 Quality audit
• 820.25 Personnel

DES (Design)
• 820.30 Design controls

DOC (Documents and Records)
• 820.40 Document controls
• 820.180 General records requirements
• 820.181 Device Master Record
• 820.184 Device History Record
• 820.186 Quality System Record
(Chart) Source: www.fda.gov
(Chart) Source: www.fda.gov; analysis by QUARA
Most Frequently Cited 483 Observations (FY06 – FY16)
1) Procedures for Corrective and Preventive Action have not been
[adequately] established. Specifically, ***
2) Procedures for receiving, reviewing, and evaluating Complaints by a formally
designated unit have not been [adequately] established. Specifically,***
3) Written MDR procedures have not been [developed]
[maintained] [implemented]. Specifically, ***
4) Corrective and Preventive Action activities and/or results
have not been [adequately] documented. Specifically, ***
5) A Process whose results cannot be fully verified by subsequent inspection and
test has not been [adequately] Validated according to established procedures.
Specifically, ***
Source: FY06-16 Inspectional Observations
Corrective and Preventive Action (CAPA)
• Triggers
• Problem Statement
• Risk Assessment
• Containment
• Corrections
• Investigations
• Root Cause
• Verification
• Corrective Action
• Effectiveness Checks
• Documentation
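These elements map naturally onto a structured CAPA record. A minimal sketch in Python, assuming a simple dataclass representation; the field names are illustrative assumptions, not prescribed by 21 CFR 820 or this deck:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CapaRecord:
    """Illustrative CAPA record mirroring the elements listed above."""
    trigger: str                      # e.g. complaint trend, NCR, audit finding
    problem_statement: str
    risk_assessment: str              # e.g. severity/occurrence rating
    containment: list[str] = field(default_factory=list)
    corrections: list[str] = field(default_factory=list)
    investigation: Optional[str] = None
    root_cause: Optional[str] = None
    root_cause_verified: bool = False
    corrective_actions: list[str] = field(default_factory=list)
    effectiveness_check: Optional[str] = None  # objective evidence it worked
    documentation: list[str] = field(default_factory=list)  # linked records

# Hypothetical usage:
capa = CapaRecord(
    trigger="complaint trend: pouch seal failures",
    problem_statement="Seal failures on Line 2, Lot 1432: 0.8% vs 0.1% baseline",
    risk_assessment="moderate: sterile-barrier risk, no reported patient harm",
)
capa.containment.append("quarantine Lot 1432")
```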
Metrics
Looking ahead: FDA Case for Quality
FDA Case for Quality
• Launched in 2011 and is part of CDRH’s 2016-2017 Strategic Priority to
promote a culture of quality and organizational excellence.
– Core Components
• Focus on Quality
• Enhanced Data Transparency
• Stakeholder Engagement
• MDIC has sponsored four Case for Quality Working Groups since 2014
– Maturity Model
• Enable an organization to assess the capability of its quality system
to reliably develop and manufacture high quality medical devices.
– Metrics
• Create well-defined, stakeholder-verified (FDA and industry)
product quality metrics to predictively assess product quality.
– Advanced analytics
• Offer hospital providers information and analysis techniques to evaluate
medical device quality and subsequent patient value.
– Competencies
• Construct techniques that improve competency across systems and individual functions.
www.fda.gov www.mdic.org
FDA – On the Horizon: Case for Quality
What is this Case for Quality Initiative?
• …shift historical focus from compliance & enforcement action to device quality
• ...launched after device quality data demonstrated lack of improvement in
risk to patient safety & the number of enforcement actions taken by
FDA year after year
• ...goal is to proactively and predictively measure risk to product quality
• ...a Right-First-Time mentality, shifting quality as close to the initial days of development as possible
• ...a manageable system of metrics across the Total Product Lifecycle
• ...ultimate goal is continual improvement, with root causes of failure traced back to the earliest possible stages of development
Source: MDIC Quality Metrics Best Practices
Where is the FDA Case for Quality Going?
Source: www.fda.gov MDIC CfQ Metrics Workshop
What Does This Mean?
FDA Regulatory Paradigm Shift
• What does a focus on quality and organizational
excellence mean for FDA and innovation?
– Increased manufacturing and product confidence
– Faster time to markets, better information to drive
regulatory decisions, improved resource allocation
– A focus on what is most important to patients
• Remove participants from the agency work plan for routine inspections
• Waive pre-approval inspections where appropriate
• Engagement and meetings on issue resolution
• Modified submission requirements and faster FDA response
• Accelerated approval path
• Competitive market around product excellence
Source: www.fda.gov
Maturity Model Workstream - Goal Statement
Develop a program which leverages CMMI (Capability Maturity Model Integration) as the standard maturity model by which medical device organizations may measure their capability to produce high-quality devices. FDA will adjust its engagement activities and submission requirements in recognition of this independent assessment of quality maturity.
Source: MDIC CfQ
Maturity Model (chart)
Total Product Lifecycle (chart)
Pilot Study Metrics
• Pre-Production Revised Metric:
Total # of changes (product & process across projects) ÷ total # of projects,
and/or Total # of changes (product & process) for each project
• Production Revised Metric:
# of units mfg. Right First Time (within or across lots) ÷ # of units started
• Post-Production Metric:
Aggregation of weighted (risk-based) post-market metrics: Service Records, Installation Failures, Complaints, MDRs, Recall Rates, Total Recalls…
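A minimal sketch of how the three pilot metrics might be computed, assuming hypothetical record shapes and illustrative risk weights (none of the field names or weights below come from the MDIC pilot):

```python
from statistics import fmean

# Hypothetical records; structure and names are illustrative assumptions.
projects = {
    "ProjA": {"changes_due_to_inadequate_dev": 4},
    "ProjB": {"changes_due_to_inadequate_dev": 1},
}
lots = [{"units_started": 500, "units_right_first_time": 480},
        {"units_started": 520, "units_right_first_time": 512}]
post_market = {"complaints": 12, "mdrs": 1, "recalls": 0}
weights = {"complaints": 1.0, "mdrs": 5.0, "recalls": 20.0}  # risk-based, illustrative

# Pre-production: average changes per project (only changes caused by
# inadequate product/process development are counted).
pre_production = fmean(p["changes_due_to_inadequate_dev"] for p in projects.values())

# Production: Right-First-Time rate across lots.
rft = sum(l["units_right_first_time"] for l in lots) / sum(l["units_started"] for l in lots)

# Post-production: weighted aggregation of post-market signals.
post_production = sum(weights[k] * v for k, v in post_market.items())

print(f"Pre-production: {pre_production:.2f} changes/project")
print(f"Production RFT: {rft:.1%}")
print(f"Post-production (weighted): {post_production:.1f}")
```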
MDIC "Best Practices"
Purpose: to help organizations understand how best to use the output from the metrics to inform decisions and trigger actions.
• Metric output can be used to understand root causes
• Combine metric output with other metrics to form a more holistic picture and analyze trends
• Goal is to provide a feedback loop that improves systems starting from the earliest point that originally allowed the failure to occur
Total Product Lifecycle
• This process [approach] can be used to identify ways to
measure previously untracked areas of quality and/or risk
• Assess which critical requirements the metric is correlated to,
in order to be sure it has the potential to be effective
• Be sure to assess the usefulness of the metric over time
– Is it a flat-line result over time?
– Are any decisions or actions ever taken as a result of tracking?
– Has the metric demonstrated acceptable improvement and steady-state?
– Is unacceptable quality and/or risk experienced even though
this metric is consistently acceptable?
Source: MDIC Case for Quality, Metrics
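One way to operationalize the flat-line question above: a small heuristic sketch (the coefficient-of-variation threshold is an illustrative assumption) that flags a metric whose values barely move and therefore never triggers a decision:

```python
import statistics

def metric_is_informative(series: list[float], min_cv: float = 0.02) -> bool:
    """Flag a flat-line metric: if values barely move relative to their mean,
    the metric is unlikely to drive decisions. Threshold is illustrative."""
    mean = statistics.fmean(series)
    if mean == 0:
        return statistics.pstdev(series) > 0
    return statistics.stdev(series) / abs(mean) >= min_cv  # coefficient of variation

monthly_rft_rate = [0.971, 0.972, 0.971, 0.972, 0.971, 0.972]
print(metric_is_informative(monthly_rft_rate))  # False: consider revising the metric
```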
Pre-Production: Design Robustness
• Goal of the Pre-Production Metric: to drive a Right-First-Time (RFT) mindset in the research and development phase such that post-design-transfer changes due to inadequate product/process development are not needed.
Only include changes required due to inadequate product or process development (harmonize definition across organization)
Pre-Production / Design Robustness: MDIC Pre-Production Metric
(Diagram: design work flows through Release for Design Transfer (RfDT) and Design Transfer into Manufacturing and the Field; change requests and orders (ECR/ECO) raised in Design, Manufacturing, or the Field feed a summing junction that accumulates post-RfDT product and process changes due to inadequate product/process development.)

Design Phases/Processes: Requirements; Specifications; Verification; Validation; Process Validation; Clinical Trials; Prototyping; Usability; Bio-compatibility; Risk Management; Design Review; Lifecycle testing; Compliance (i.e. to standards); Labeling

Manufacturing Processes: Inspection; Process Validation; Inventory Control; Environmental Control; Process Capability; Control / Reaction Plans; FMEA; Preventive Maintenance; Calibration

Data Sources (Manufacturing): NCRs; SCARs; Deviations; Improvement Projects
Data Sources (Field): Complaints; Adverse Events (MDR, MDV...); Recalls / Field Actions; Installation Reports; Service Events; Product Returns

Cost of: Verification; Validation; Process Validation; Ext. Certification; Registration/Clearance; Cost/FA; Cost/CAPA; Cost/Design Change
Impacted by changes: IEC (Elec. Safety…); Toxicology/Bio-compatibility; Environmental (WEEE, RoHS…); Declaration of Conformity; Supplier Qualification; Cleaning/Sterilization; Risk Management; Labeling; Training; Disposition (Inventory, Scrap, Rework…)

Metric: Total # of changes (product & process across projects) ÷ total # of projects, and/or Total # of changes (product & process) for each project
Note: Each organization must determine what constitutes a project. (A minimal counting sketch follows.)
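Putting the summing-junction idea into code: a hedged sketch that counts only post-RfDT changes attributed to inadequate product/process development, under an org-defined notion of "project" (all field names and reason codes are assumptions):

```python
from dataclasses import dataclass

# Hypothetical change-record shape; reason codes would be harmonized org-wide.
@dataclass
class ChangeRecord:
    project: str
    source: str        # "design", "manufacturing", or "field"
    post_rfdt: bool    # raised after Release for Design Transfer?
    reason: str        # harmonized reason code

changes = [
    ChangeRecord("ProjA", "field", True, "inadequate_development"),
    ChangeRecord("ProjA", "manufacturing", True, "cost_reduction"),
    ChangeRecord("ProjB", "manufacturing", True, "inadequate_development"),
    ChangeRecord("ProjA", "design", False, "inadequate_development"),  # pre-RfDT: excluded
]

# Only post-RfDT changes caused by inadequate product/process development count.
counted = [c for c in changes if c.post_rfdt and c.reason == "inadequate_development"]

per_project: dict[str, int] = {}
for c in counted:
    per_project[c.project] = per_project.get(c.project, 0) + 1

projects = {c.project for c in changes}          # what counts as a "project" is org-defined
across_projects = len(counted) / len(projects)   # metric across projects
print(per_project, f"{across_projects:.2f} changes/project")
```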
Issues within 12 months of Design Transfer (RfDT) (chart)
How well do you really know your systems?
• Can your metrics be interpreted accurately?
• What questions should you be asking your organization about its data?
• Are you, your suppliers, and your customers all singing from the same hymnal?
Data Collection / Metrics / Dashboards
• Decide what data you need and how it will be used (Capability? Defects? Yield? etc.)
• Develop a data collection plan
– What will we measure?
– How will we measure it?
– Where will we measure it?
– How often will we measure it?
• Ensure data integrity (MSA): Accurate, Precise, Repeatable, Reproducible
• Determine how the data will be presented
• Determine the reaction to data: Containment, Correction, Improvement, Design Change
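A data collection plan can be captured as a simple structured record tying the steps above together. A minimal sketch; every field name and value here is an illustrative assumption:

```python
# Illustrative data-collection-plan record; all values are hypothetical.
plan = {
    "measure": "seal strength (N)",            # what
    "method": "pull tester, calibrated",       # how
    "location": "post-sealing station",        # where
    "frequency": "3 units every hour",         # how often (subgroup sampling)
    "integrity": "MSA / Gage R&R on the pull tester before use",
    "presentation": "Xbar-R chart on the line dashboard",
    "reactions": {                             # reaction plan, escalating
        "out_of_control_point": "containment + correction",
        "recurring_signal": "improvement project",
        "capability_shortfall": "design change evaluation",
    },
}
```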
What do we need to know about the process?
Data Collection Concepts / Preparation
Do we know what defective means?
• Have we defined the acceptable
and unacceptable aspects of
the process?
• For example: if several people were asked to count the number of defective M&Ms in a bag, we'd get a wide range of answers. Why?
Data Collection Concepts / Preparation
Do we know what acceptable means?
• Not everyone sees the same things. What is
acceptable to some may be unacceptable to others.
That's not a manufacturing defect; it's supposed to have a hole in the middle.
Data Collection Plan
Develop Data Collection Plan
• A well-prepared Data Collection Plan helps ensure
successful analysis of the problem. Data Collection Plans
should answer the following set of questions:
– What data will be collected, including data type (a short sketch follows the Key Concept note below)
o Attribute data: qualitative (Yes/No, Pass/Fail, Damage/No Damage)
o Variable data: quantitative (Time, Dimensions, Percentage)
– Why the data is needed
– Where data will be collected
– How the data will be collected
– Who will collect the data
Key Concept
Collecting an appropriate amount of the
right data.
Too much data can add complexity to the
data review and analysis. Too little data may
force the team to engage in a secondary data
collection effort. Likewise, correctly
specifying what data is to be collected
(enough to get a complete picture of the
process) will help the team avoid
unnecessarily repeating initial collection
activities.
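To make the attribute/variable distinction concrete, a small sketch with made-up values: attribute data summarize as proportions, variable data as location and spread:

```python
# Attribute vs. variable data call for different summaries; values are illustrative.
pass_fail = ["pass", "pass", "fail", "pass", "fail"]   # attribute (qualitative)
seal_strength = [10.2, 10.4, 9.9, 10.1, 10.3]          # variable (quantitative)

defect_rate = pass_fail.count("fail") / len(pass_fail)  # proportion for attribute data
mean = sum(seal_strength) / len(seal_strength)          # central tendency for variables
spread = (sum((x - mean) ** 2 for x in seal_strength)
          / (len(seal_strength) - 1)) ** 0.5            # sample standard deviation

print(f"defect rate: {defect_rate:.0%}, mean: {mean:.2f}, std dev: {spread:.3f}")
```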
Can you trust your data?
• What does this mean?
• Stability
• Accurate (Bias)
• Linearity
• Repeatability
• Reproducibility
• Are we measuring the “right” thing?
– For example, does our data match with what our
customers are saying?
– Liftgate Example
Measurement System Analysis (MSA)
• The purpose of performing a Measurement
System Analysis is to ensure the information
collected is a true representation of what is
occurring in the process.
• It is important to remember that Total Variation is the sum of Process Variation and Measurement System Variation (σ²Total = σ²Process + σ²Measurement). Minimizing measurement variation therefore helps ensure that the data primarily reflect process variation.
At the conclusion of the Measurement System
Analysis, you should know:
• Whether the measurement system is “capable” of gathering
data that accurately reflect variation in the process
• Whether there is measurement error, how big it
is and a method of accounting for it
• What confidence level can be attached
to the measurements collected
• Whether or not measurement increments
are small enough to show variation
• Sources of measurement error
• Whether the measurement system will be stable over time
Measurement System:
The thing being measured +
the device(s) used to take the
measurement + the person
doing the measuring
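To make the variance decomposition concrete, here is a deliberately simplified variance-components sketch on synthetic data. This is the naive method-of-moments view, not the formal AIAG ANOVA Gage R&R procedure, and all numbers are made up:

```python
import numpy as np
import pandas as pd

# Synthetic gage study: 10 parts x 3 operators x 3 trials each (hypothetical).
rng = np.random.default_rng(0)
rows = []
for part in range(10):
    true_value = 10 + 0.5 * part                         # part-to-part (process) variation
    for op, op_bias in enumerate([0.00, 0.05, -0.05]):   # operator effect (reproducibility)
        for _ in range(3):
            noise = rng.normal(0, 0.03)                  # gage noise (repeatability)
            rows.append((part, op, true_value + op_bias + noise))
df = pd.DataFrame(rows, columns=["part", "operator", "value"])

# Naive variance-components estimates (a sketch only):
repeatability = df.groupby(["part", "operator"])["value"].var().mean()  # within-cell
reproducibility = df.groupby("operator")["value"].mean().var()          # between operators
part_to_part = df.groupby("part")["value"].mean().var()                 # between parts

gage_rr = repeatability + reproducibility
total = gage_rr + part_to_part
print(f"%GRR (variance basis): {100 * gage_rr / total:.1f}%")  # want this small
```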
Components of Measurement System Error
• Resolution/Discrimination
• Accuracy (bias effects)
• Linearity
• Stability (consistency)
• Repeatability-test-retest (Precision)
• Reproducibility (Precision)
Each component of measurement error adds variation and can lead to wrong decisions.
We did the MSA; so what?
The analysis answers the question: is the Measurement System...
• Stable: the capacity of the measurement system to obtain the same result when measuring the same part over a significant period of time. (Stability implies only common-cause variation.)
• Accurate (Bias): the closeness of a measured value to a standard or known value.
• Linear: a measure of bias over the range of the measurement device.
• Repeatable: can the same person measure the same part multiple times with the same measurement device and get the same value?
• Reproducible: can different people measure the same part with the same measurement device and get the same value?
Measurement system corrections resulting from MSA lead to precise and accurate data.
We did the MSA, so what? (continued)
The analysis also answers practical questions:
• What is the measurement error?
• What do I do with this measurement error?
• Is the system capable of measuring the process?
• If I improve my process, will the measurement system still be OK?
– The output of an MSA provides indices that represent the measurement system's ability to measure within the process spread and within the tolerance band. If process improvements are made, or the tolerance limits are tightened beyond the measurement system's capability, changes to the measurement system may be necessary.
• Will the MSA ever need to be repeated?
Let's Start Measuring…oh wait...
What about a sampling plan?
Sampling Approaches (a minimal sketch of all four appears after the considerations below):
• Random Sampling: each unit has the same chance of being selected.
• Systematic Sampling: sample every nth unit (i.e., every 3rd).
• Stratified Random Sampling: randomly sample a proportionate number from each group.
• Subgroup Sampling: sample n units every nth time (i.e., 3 units every hour); then calculate the mean (proportion) for each subgroup. Preserve time order.
(Diagram: population/process sampled over time; e.g., subgroups drawn at 9:00, 9:30, 10:00, 10:30.)
Sampling Considerations
• Where
o Location in the process where process steps directly
affect outputs (strong relationship).
o Maximize opportunity for problem
identification (cause data).
• Frequency
o Dependent on volume of transactions and/or
activity
o Unstable process – more frequent sampling
o Stable process – less frequent sampling
o Dependent on how precise the measurement must
be to make a meaningful business decision
• Considerations
o Is the sample representative of the process or
population?
o Is the process stable?
o Is the sample random?
o Is there an equal probability of selecting any data
point?
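The four sampling approaches above can be sketched in a few lines of Python; the data and parameters are illustrative:

```python
import random

units = list(range(100))  # stand-in for units coming off a process, in time order

# Random sampling: each unit has the same chance of being selected.
random_sample = random.sample(units, k=10)

# Systematic sampling: every nth unit (here every 3rd).
systematic_sample = units[::3]

# Stratified random sampling: a proportionate draw from each group.
groups = {"A": units[:20], "B": units[20:70], "C": units[70:]}
stratified_sample = {name: random.sample(g, k=len(g) // 10) for name, g in groups.items()}

# Subgroup sampling: n consecutive units every interval, then the subgroup mean;
# keeping subgroups in time order preserves trend information.
subgroups = [units[i:i + 3] for i in range(0, len(units), 25)]  # 3 units every 25
subgroup_means = [sum(s) / len(s) for s in subgroups]
```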
What’s the Point? (Take-aways)
Do we / Have we:
• Identified the appropriate data that we need to collect?
• Determined how to collect it?
• Determined where to collect it?
• Determined when to collect it?
• Determined the sampling plan?
• Studied the measurement system?
• Determined how to manage the data?
• Understand the sources of variation in our processes?
• Developed reaction plans?
• Know how to form a proper problem statement?
• Know how to effectively conduct an investigation?
• Understand interim and permanent corrective action?
• Verified root cause?
• Can we eliminate unnecessary measurements?
Thank You!!!
Eduard Toerek, President - QUARA Innovations
etoerek@quarainnovations.com
Reinhold Toerek, Vice-President - QUARA Innovations
rtoerek@quarainnovations.com
www.quarainnovations.com © 2017 Quara Innovations LLC. For permission to use, contact info@quarainnovations.com