Transcript of "Six Sigma Tools for Early Adopters"

  1. Pittsburgh, PA 15213-3890
     Six Sigma Tools for Early Adopters
     Rick Hefner, Northrop Grumman
     Jeannine Siviy, SEI
     SEPG Conference, 6 March 2006
     The final version of our slides is available at http://www.sei.cmu.edu/sema/presentations (cards with this web address are available at the front of the room)
  2. Background
     Six Sigma has proven to be a powerful enabler for process improvement
     • CMMI adoption
     • Process improvement for measurable ROI
     • Statistical analysis
     This tutorial is about gleaning value from the Six Sigma world, to raise the caliber of engineering, regardless of the corporate stance on Six Sigma.
  3. Agenda
     Six Sigma Benefits for Early Adopters – What the Books Don’t Tell You
     The 7 Six Sigma Tools Everyone Can Use
     I Hear Voices
     Dirty Data (and How to Fix It)
     Statistical Process Control – Where Does It Apply to Engineering?
     Convincing Senior Management: The Value Proposition
     Summary
     Cheap Sources of Information and Tools
     MSE Addendum
  4. What is Six Sigma?
     Six Sigma is a management philosophy based on meeting business objectives by reducing variation
     • A disciplined, data-driven methodology for decision making and process improvement
     To increase process performance, you have to decrease variation.
     [Figure: two delivery-time distributions, before and after reducing variation. Before: the spread of variation is too wide compared to specifications, with defects falling both too early and too late. After: the spread is narrow compared to specifications.]
     Reducing variation yields:
     • Greater predictability in the process
     • Less waste and rework, which lowers costs
     • Products and services that perform better and last longer
     • Happier customers
  5. A General Purpose Problem-Solving Methodology: DMAIC
     Define → Measure → Analyze → Improve → Control
     • Start from a problem or goal statement (Y); refine the problem & goal statements; define project scope & boundaries.
     • An improvement journey to achieve goals and resolve problems by discovering and understanding relationships between process inputs and outputs, such as Y = f(defect profile, yield) = f(review rate, method, complexity, …)
  6. DMAIC Roadmap
     Define: Define project scope; Establish formal project
     Measure: Identify needed data; Obtain data set; Evaluate data quality; Summarize & baseline
     Analyze: Explore data; Characterize process & problem; Update improvement project scope & scale
     Improve: Identify possible solutions; Select solution; Implement (pilot as needed); Evaluate
     Control: Define control method; Implement; Document
     Each phase ends with a Phase Exit Review. [Hallowell-Siviy 05]
  7. Toolkit
     Define: Benchmark; Contract/Charter; Kano Model; Voice of the Customer; Voice of the Business; Quality Function Deployment
     Measure: GQIM and Indicator Templates; Data Collection Methods; Measurement System Evaluation
     Analyze: Cause & Effect Diagrams/Matrix; Failure Modes & Effects Analysis; Statistical Inference; Reliability Analysis; Root Cause Analysis, including 5 Whys; Hypothesis Test; Decision & Risk Analysis; PSM “Perform Analysis” Model
     Improve: Design of Experiments; Modeling; ANOVA; Tolerancing; Robust Design; Systems Thinking
     Control: Statistical Controls (Control Charts, Time Series methods); Non-Statistical Controls (Procedural adherence, Performance Mgmt, Preventive measures)
     All phases: 7 Basic Tools (Histogram, Scatter Plot, Run Chart, Flow Chart, Brainstorming, Pareto Chart), Control charts (for diagnostic purposes), Baseline, Process Flow Map, Project Management, “Management by Fact”, Sampling Techniques, Survey Methods, Defect Metrics
  8. Process Improvement – Design for Six Sigma (e.g., DMADV)
     Define: Define project scope; Establish formal project
     Measure: Identify customers; Research VOC; Benchmark; Quantify CTQs
     Analyze: Explore data; Design solution; Predict performance
     Design: Develop detailed design; Refine predicted performance; Develop pilot
     Verify: Evaluate pilot; Scale-up; Document
  9. Organizational Adoption: Roles & Responsibilities
     Champions – Facilitate the leadership, implementation, and deployment
     Sponsors – Provide resources
     Process Owners – Responsible for the processes being improved
     Master Black Belts – Serve as mentors for Black Belts
     Black Belts – Lead Six Sigma projects (requires 4 weeks of training)
     Green Belts – Serve on improvement teams under a Black Belt (requires 2 weeks of training)
 10. Valuable Tools for Engineers
     Six Sigma provides a comprehensive set of tools for:
     • Soliciting and understanding customer needs (requirements, delighters, perceptions of quality)
     • Defining and improving processes (inputs/outputs, customers/suppliers, essential/nonessential activities, capability, stability/predictability)
     • Understanding data (trends, relationships, variation)
     These tools can be used even if your organization is not implementing Six Sigma.
 11. Agenda
     Six Sigma Benefits for Early Adopters – What the Books Don’t Tell You
     The 7 Six Sigma Tools Everyone Can Use
     I Hear Voices
     Dirty Data (and How to Fix It)
     Statistical Process Control – Where Does It Apply to Engineering?
     Convincing Senior Management: The Value Proposition
     Summary
     Cheap Sources of Information and Tools
     MSE Addendum
 12. Toolkit
     Define: Benchmark; Contract/Charter; Kano Model; Voice of the Customer; Voice of the Business; Quality Function Deployment
     Measure: GQIM and Indicator Templates; Data Collection Methods; Measurement System Evaluation
     Analyze: Cause & Effect Diagrams/Matrix; Failure Modes & Effects Analysis; Statistical Inference; Reliability Analysis; Root Cause Analysis, including 5 Whys; Hypothesis Test; Decision & Risk Analysis; PSM “Perform Analysis” Model
     Improve: Design of Experiments; Modeling; ANOVA; Tolerancing; Robust Design; Systems Thinking
     Control: Statistical Controls (Control Charts, Time Series methods); Non-Statistical Controls (Procedural adherence, Performance Mgmt, Preventive measures)
     All phases: 7 Basic Tools (Histogram, Scatter Plot, Run Chart, Flow Chart, Brainstorming, Pareto Chart), Control charts (for diagnostic purposes), Baseline, Process Flow Map, Project Management, “Management by Fact”, Sampling Techniques, Survey Methods, Defect Metrics
 13. Measure Guidance Questions
     • What are the process outputs and performance measures?
     • What are the process inputs?
     • What information is needed to understand relationships between inputs and outputs? Among inputs?
     • What information is needed to monitor the progress of this improvement project? [MPDI]
 14. Identifying Needed Data
     What are the process outputs and performance measures? What are the inputs? What are the relationships among outputs and inputs?
     We need to find out what contributes to performance:
     • What are the process outputs (y’s) that drive performance?
     • What are the key process inputs (x’s) that drive outputs and overall performance?
     Techniques to address these questions:
     • segmentation / stratification
     • input and output analysis
     • Y to x trees
     • cause & effect diagrams
     Using these techniques yields a list of relevant, hypothesized process factors to measure and evaluate. [MPDI]
 15. Controlled and Uncontrolled Factors
     Controlled factors are within the project team’s scope of authority and are accessed during the course of the project. Studying their influence may inform:
     • cause-and-effect work during Analyze
     • solution work during Improve
     • monitor and control work during Control
     Uncontrolled factors are factors we do not or cannot control. We need to acknowledge their presence and, if necessary, characterize their influence on Y.
     A robust process is insensitive to the influence of uncontrollable factors. [MPDI]
 16. Natural Segmentation
     Description
     • A logical reasoning about which data groupings have different performance, often verified by basic descriptive statistics.
     Procedure
     • Consider what factors, groupings, segments, and situations may be driving the mean performance and the variation in Y.
     • Draw a vertical tree diagram, continually reconsidering this question to a degree of detail that makes sense.
     • Calculate basic descriptive statistics, where available and appropriate, to identify areas worthy of real focus. [MPDI]
 17. Segmentation Example: Call Center Support Costs
     Support effort: 99,906 cases, 275,580 hours (80% of case effort).
     By case type:
     • Problems – 63,095 cases, 180,380 hours (65% of NB effort)
     • Complaints – 6,869 cases, 17,740 hours (6% of NB effort)
     • Questions – 15,316 cases, 27,207 hours (11% of NB effort)
     • Changes – 14,624 cases, 50,245 hours (18% of NB effort)
     Problems, by service area:
     • Service Area 1 – 12,025 cases, 29,280 hours (11% of NB effort)
     • Service Area 2 – 8,308 cases, 29,199 hours (10% of NB effort)
     • Service Area 3 – 21,085 cases, 63,287 hours (23% of NB effort)
     • Other – 21,677 cases, 58,614 hours (21% of NB effort)
     Service Area 3, by product:
     • Service ‘Product’ A – 8,156 cases, 20,735 hours (8% of NB effort)
     • Service ‘Product’ B – 12,929 cases, 42,551 hours (15% of NB effort)
     Service ‘Product’ B may be an ideal segment for initial data gathering. NB = non-billable.
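    A minimal sketch of the segmentation arithmetic at the case-type level, using the hour totals from the slide. Note the slide reports shares of non-billable (NB) effort, whose exact denominator is not shown, so the computed shares differ slightly for some rows.

    ```python
    # Segment call-center effort by case type and compute each segment's share
    # of the 275,580-hour total (values taken from the slide above).
    segments = {            # case type -> (cases, hours)
        "Problems":   (63095, 180380),
        "Complaints": (6869,   17740),
        "Questions":  (15316,  27207),
        "Changes":    (14624,  50245),
    }

    total_hours = 275580    # total support effort from the slide

    # Largest segments first: segmentation guides "where to look" (next slide).
    for case_type, (cases, hours) in sorted(
            segments.items(), key=lambda kv: kv[1][1], reverse=True):
        share = hours / total_hours
        print(f"{case_type:10s} {cases:6d} cases  {hours:7d} hours  {share:6.1%} of total")
    ```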
 18. Segmentation vs. Stratification
     Segmentation—
     • grouping the data according to one of the data elements (e.g., day of week, call type, region, etc.)
     • gives discrete categories
     • in general we focus on the largest, most expensive, best/worst – guides “where to look”
     Stratification—
     • grouping the data according to the value range of one of the data elements (e.g., all records for days with “high” volume vs. all records for days with “low” volume)
     • choice of ranges is a matter of judgment
     • enables comparison of attributes associated with “high” and “low” groups—what’s different about these groups?
     • guides diagnosis [MPDI]
 19. Process Mapping
     Process map—a representation of major activities/tasks, subprocesses, process boundaries, key process inputs, and outputs.
     INPUTS (sources of variation): people, material, equipment, policies, procedures, methods, environment, information.
     PROCESS STEP: a blending of inputs to achieve the desired outputs (perform a service, produce a product, complete a task).
     OUTPUTS: measures of performance. [MPDI]
 20. Example: Development Process Map
     [Figure: process map of Design → Code → Compile → Unit Test, annotated with critical inputs, standard procedures, inspection and rework loops, noise, and control knobs.]
     • Inputs include requirements, resources, estimates, code standards, test plan and technique, concept design, LOC counter, operational profiles, and interruptions.
     • Each phase produces work products (detailed design, code, executable code, test cases) plus data: defects, fix time, defect injection phase, phase duration, complexity, and design review defects. [MPDI]
 21. Input / Output Analysis
     Supplier → Inputs → Process (Step 1, Step 2, …) → Outputs → Customer
     Assess the inputs:
     • Controllable: can be changed to see the effect on key outputs (also called “knob” variables)
     • Critical: statistically shown to have impact on key outputs
     • Noise: impact key outputs, but difficult to control [MPDI]
 22. Alternative Process Map—Swim Lanes
     [Figure: swim-lane process map with lanes for System Engineering, Designers, Programmers, Testers, and Quality.] [MPDI]
 23. Measure Guidance Questions
     • What do the data look like upon initial assessment? Is it what we expected?
     • What is the overall performance of the process?
     • Do we have measures for all significant factors, as best we know them?
     • Are there data to be added to the process map?
     • Are any urgently needed improvements revealed?
     • What assumptions have been made about the process and data? [MPDI]
 24. Analyze Guidance Questions
     • What do the data look like?
     • What is driving the variation?
     • What is the new baseline?
     • What are the risks and assumptions associated with the revised data set and baseline?
 25. Summarizing & Baselining the Data
     What is baselining?
     • Establishing a snapshot of performance (distribution of the process behavior) and/or the characteristics of a process.
     Why should we baseline performance?
     • It provides a basis by which to measure improvement.
     How is it done?
     • Describe the organization’s performance using the 7 basic tools and a map of the process of interest, including scope (process boundaries) and timeframe
     • Compare to best-in-class – benchmarking
     • Gather data – sample appropriately [MPDI]
 26. The 7 Basic Tools
     Description
     • Fundamental data plotting and diagramming tools:
       - cause & effect diagram
       - histogram
       - scatter plot
       - run chart
       - flow chart
       - brainstorming
       - Pareto chart
     • The list varies with source. Alternatives include:
       - statistical process control charts
       - descriptive statistics (mean, median, etc.)
       - check sheets [MPDI]
 27. 7 Basic Tools: Cause & Effect
     [Figure: traditional fishbone (cause & effect) diagram with branches for Management, People, Environment, and Methods. Example causes include an unrealistic completion date, minimum application and test experience, inadequate test resources, no test specialists, no risk management, software reliability not required, no formal inspection process, test beds that do not match the user configuration, and no formal defect tracking mechanism. A schematic version shows causes and subcauses feeding the process problem.] [MPDI; original source: Westfall]
 28. 7 Basic Tools: Chart Examples 2
     [Scatter plot: development effort (person months) vs. size (thousands of executable source lines).]
     [Histogram: number of days by product-service staff hours.] [MPDI]
 29. 7 Basic Tools: Chart Examples
     [Pareto chart: quantity of defects removed by defect type code, distinguishing defects removed in test from defects injected in design.]
     [Run chart: mean time to repair (minutes) across products 1–10.] [MPDI]
 30. 7 Basic Tools: Chart Examples
     [Box & whisker plot of assessment data against an SEI Level 3 CMMI benchmark, by process area: Requirements Development, Technical Solution, Product Integration, Verification, Validation, Organizational Process Focus, Organizational Process Definition, Organizational Training, Integrated Project Management, Risk Management, Decision Analysis & Resolution. Boxes show the 10th, 25th, 50th (median), 75th, and 90th percentiles and the mean.] [MPDI]
 31. 7 Basic Tools: Chart Examples
     [SPC chart: individuals, moving range. Individual values plotted by subgroup number, with center line (CL), upper and lower control limits (UCL, LCL), and zones A, B, and C on either side of the center line.] [MPDI]
 32. Example: Cost/Schedule Monthly Performance Baseline
     [Charts: %cost variance and %schedule variance, all org units and all projects, October 2001 to March 2002. Schedule variance range: 60%.]
     • Reminder: this is (current actual – most recent estimate)
     • Averages within spec, and close to 0
     • Need to examine extreme values, especially for cost
     • Even if extreme values are outliers, it looks like we need to investigate variability [MPDI]
 33. Descriptive Statistics
     Measures of central tendency: location, middle, the balance point.
     • Mean (the average) = (sum of n measured values) / n = 10486.3 / 46 = 227.9 (the center of gravity by value)
     • Median = the midpoint by count
     • Mode = the most frequently observed point
     Measures of dispersion: spread, variation, distance from central tendency.
     • Range = maximum − minimum
     • Variance: σ² = Σᵢ(xᵢ − µ)² / n, the average squared distance from the population mean
     • Standard deviation: σ = √Variance, in the units of the original measures; an indicator of the spread of points from the mean [MPDI]
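    A minimal sketch of these statistics using Python’s standard library. The `pvariance` and `pstdev` functions divide by n, matching the population formulas on the slide; the data values are illustrative (the corrected cost-variance list from the outlier exercise later in the deck).

    ```python
    import statistics

    data = [13, 22, 16, 20, 16, 18, 27, 25, 30, 33, 40]

    mean = statistics.mean(data)      # balance point: sum of values / n
    median = statistics.median(data)  # midpoint by count
    mode = statistics.mode(data)      # most frequently observed value (16 here)

    value_range = max(data) - min(data)    # maximum - minimum
    variance = statistics.pvariance(data)  # average squared distance from the mean
    std_dev = statistics.pstdev(data)      # square root of variance, original units

    print(mean, median, mode, value_range, variance, std_dev)
    ```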
 34. Graphical Methods Summary
     Purpose → Graphical Method
     • See relationships in data → Scatter plot
     • See time relationships → Time series run chart
     • See variation of Y with 1 X → Box plot chart
     • See variation of Y with 2+ X’s → Multi-variable chart
     • Prioritize 2+ X’s to focus on → Pareto chart
     • Check normality of data → Normal plot
     • Predict relationships in data → Regression predicted line [MPDI]
 35. Agenda
     Six Sigma Benefits for Early Adopters – What the Books Don’t Tell You
     The 7 Six Sigma Tools Everyone Can Use
     I Hear Voices
     Dirty Data (and How to Fix It)
     Statistical Process Control – Where Does It Apply to Engineering?
     Convincing Senior Management: The Value Proposition
     Summary
     Cheap Sources of Information and Tools
     MSE Addendum
 36. The Voices of Six Sigma
     Six Sigma includes powerful techniques for understanding the problem you are trying to solve:
     • Voice of Customer
     • Voice of Process
     • Voice of Business
     These techniques are useful in non-Six Sigma settings for understanding:
     • Customer requirements and needs
     • Process performance and capability
     • Business priorities and trends
 37. Voice of Customer (VOC)
     A process used to capture the requirements/feedback from the customer (internal or external)
     • Proactive and continuous
     • Stated and unstated needs
     • “Critical to Quality (CTQ)” – what does the customer think are the critical attributes of quality?
     Approaches:
     • Customer specifications
     • Interviews, surveys, focus groups
     • Prototypes
     • Bug reports, complaint logs, etc.
     • House of Quality
 38. Kano Diagram
     [Figure: Kano diagram.]
 39. Quality Function Deployment
     A link between customer attributes and design parameters:
     • What attributes are critical to our customers?
     • What design parameters are important in driving those customer attributes?
     • What should the design parameter targets be for the new design?
 40. Requirements Development
     VOC approaches provide powerful methods for eliciting, analyzing, and validating requirements.
     Can overcome common problems by:
     • Identifying ALL the customers
     • Identifying ALL their requirements
     • Probing beyond the stated requirements for needs
     • Understanding the requirements from the customers’ perspective
     • Recognizing and resolving conflicts between requirements or between requirement providers
 41. Voice of Process
     Characteristics of the process:
     • What it is capable of achieving
     • Whether it is under control
     • What significance to attach to individual measurements – are they part of natural variation, or a signal to deal with?
 42. Control Chart
     A time-ordered plot of process data points, with a centerline based on the average and control limits that bound the expected range of variation.
     Control charts are one of the most useful quantitative tools for understanding variation.
 43. Key Features of a Control Chart
     [Figure: individuals chart of values by observation number.]
     • Upper control limit (UCL = 11.60)
     • Process “average” (Mean = 7.8)
     • Lower control limit (LCL = 4.001)
     • Individual data points, plotted on a time-ordered x-axis
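    The limits on this chart are consistent with an individuals (XmR) chart. A minimal sketch of the usual XmR limit calculation, assuming the common convention of mean ± 2.66 × average moving range (approximately ±3σ); the slide does not show its raw data, so the values here are illustrative.

    ```python
    # Individuals (XmR) chart limits from a series of observations,
    # e.g., defects/KSLOC found across successive peer reviews.
    defects = [8, 7, 9, 6, 8, 10, 7, 8, 9, 7, 6, 8, 9, 8, 7]  # illustrative data

    center = sum(defects) / len(defects)                       # center line (CL)

    # Moving ranges: absolute differences between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(defects, defects[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)

    ucl = center + 2.66 * mr_bar   # upper control limit (~ +3 sigma)
    lcl = center - 2.66 * mr_bar   # lower control limit (~ -3 sigma)

    print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")

    # Points outside [LCL, UCL] signal possible special-cause variation.
    outside = [(i, x) for i, x in enumerate(defects) if not lcl <= x <= ucl]
    print("possible special-cause signals:", outside)
    ```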
 44. A Stable (Predictable) Process
     [U chart of defects detected in requirements definition, in defects per line of code: UCL = 0.09633, Ū = 0.07825, LCL = 0.06017.]
     All data points fall within the control limits; there are no signals of special cause variation.
 45. Variation
     Common cause variation:
     • Routine variation that comes from within the process
     • Caused by the natural variation in the process
     • Predictable (stable) within a range
     Special cause variation:
     • Assignable variation that comes from outside the process
     • Caused by an unexpected variation in the process
     • Unpredictable
 46. What if the Process Isn’t Stable?
     You may be able to explain out-of-limit points by observing that they are due to a variation in the process
     • E.g., a peer review held on a Friday afternoon
     • You can eliminate the points from the data if they are not part of the process you are trying to predict
     You may be able to segment the data by an attribute of the process or an attribute of the corresponding work product
     • E.g., different styles of peer reviews, peer reviews of different types of work products
     [Charts: defects per component before and after segmentation, showing out-of-limit points.]
 47. Voice of Business
     The “voice of the business” is the term used to describe the stated and unstated needs or requirements of the business/shareholders.
 48. Agenda
     Six Sigma Benefits for Early Adopters – What the Books Don’t Tell You
     The 7 Six Sigma Tools Everyone Can Use
     I Hear Voices
     Dirty Data (and How to Fix It)
     Statistical Process Control – Where Does It Apply to Engineering?
     Convincing Senior Management: The Value Proposition
     Summary
     Cheap Sources of Information and Tools
     MSE Addendum
 49. Evaluating Data Quality
     Does the measurement system yield accurate, precise, and reproducible data?
     • A measurement system evaluation (MSE) addresses these questions
     • It includes understanding the data source and the reliability of the process that created it
     Frequently occurring problems include the following:
     • wrong data
     • missing data
     • skewed or biased data
     Sometimes a simple “eyeball” test reveals such problems; more frequently, a methodical approach is warranted. [MPDI]
 50. Discussion: What if I Skip This Step?
     What if…
     • All 0’s in the inspection database are really missing data?
     • “Unhappy” customers are not surveyed?
     • Delphi estimates are done only by experienced engineers?
     • A program adjusts the definition of “line of code” and doesn’t mention it?
     • Inspection data doesn’t include time and defects prior to the inspection meeting?
     • Most effort data are tagged to the first work breakdown structure item on the system dropdown menu?
     • The data logger goes down for system maintenance in the first month of every fiscal year?
     • A “logic error” to one engineer is a “___” to another?
     Which are issues of validity? Bias? Integrity? Accuracy? How might they affect your conclusions and decisions? [MPDI]
 51. Evaluating Data Quality: Simple Checks
     Use common sense, basic tools, and good powers of observation.
     Look at the frequency of each value:
     • Are any values out of bounds?
     • Does the frequency of each value make sense?
     • Are some used more or less frequently than expected?
     Supporting tools and methods include:
     • process mapping
     • indicator templates
     • operational definitions
     • descriptive statistics
     • checklists [MPDI]
 52. Practical Tips
     Map the data collection process.
     • Know the assumptions associated with the data
     Look at indicators as well as raw measures.
     • Ratios of bad data still equal bad data
     Data systems to focus on include the following:
     • Manually collected or transferred data
     • Categorical data
     • Startup of automated systems [MPDI]
 53. Formal MSE Provides Answers…
     • How big is the measurement error?
     • What are the sources of measurement error?
     • Is the measurement system stable over time?
     • Is the measurement system capable?
     • How can the measurement system be improved? [MPDI]
 54. Sources of Variation in a Formal MSE
     Total variability = process variability + measurement system variability:
     σ²Total = σ²Process + σ²MS
     The variation of a process is the sum of variation from all process sources, including measurement error. [MPDI]
 55. Characteristics of a Formal MSE
     • Precision (reproducibility and repeatability – R&R)
     • Accuracy (bias)
     • Stability over time [MPDI]
 56. Accuracy (Bias)
     [Figure: two targets, accurate vs. not accurate.]
     Accuracy—the closeness of (average) readings to the correct value or accepted reference standard. [MPDI]
 57. Precision vs. Accuracy
     [Figure: three distributions contrasting precision (σ) and accuracy (µ): accurate but not precise; precise but not accurate; both accurate and precise.] [MPDI]
 58. Precision
     Spread refers to the standard deviation of a distribution. The standard deviation of the measurement system distribution is called the precision, σMS.
     Precision is made up of two sources of variation, repeatability and reproducibility:
     σ²MS = σ²rpd + σ²rpt [MPDI]
 59. Repeatability
     Repeatability is the inherent variability of the measurement system, measured by σRPT, the standard deviation of the distribution of repeated measurements.
     It is the variation that results when repeated measurements are made under identical conditions:
     • same inspector or analyst
     • same set up and measurement procedure
     • same software, document, or dataset
     • same environmental conditions
     • during a short interval of time
     What are your repeatability issues? [MPDI]
 60. Reproducibility
     Reproducibility is the variation that results when different conditions are used to make the measurement:
     • different software inspectors or analysts
     • different set up procedures, checklists at different sites
     • different software modules or documents
     • different environmental conditions
     Measured during a longer period of time. Measured by σRPD.
     What are your reproducibility issues? [MPDI]
 61. Simple MSE for Continuous Data—1
     • Have 10 objects to measure (projects to forecast, modules of code to inspect, tests to run, etc.; variables data involved!)
     • Have 3 appraisers (different forecasters, inspectors, testers, etc.)
     • Have each person repeat the measurement at least 2 times for each object
     • Measurements should be made independently and in random order
     • Calculate the measurement system variability (see addenda)
     • Calculate the %GRR metric to determine acceptability (see addenda)
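    A simplified sketch of the variance-components calculation behind this kind of study. The deck defers the formal computation to its addenda (not reproduced here); this approximation pools within-cell variance for repeatability and uses the spread of appraiser averages for reproducibility, ignoring appraiser-by-object interaction and small-sample corrections. All data and names are illustrative.

    ```python
    # Simplified variance components for a 10-object x 3-appraiser x 2-repeat
    # study. A formal gauge R&R would use ANOVA with interaction terms.
    import statistics

    # measurements[appraiser][object] = list of repeated readings (synthetic data)
    measurements = {
        "A": {i: [10 + i + e for e in (0.1, -0.2)] for i in range(10)},
        "B": {i: [10 + i + e for e in (0.4, 0.3)] for i in range(10)},
        "C": {i: [10 + i + e for e in (-0.3, 0.0)] for i in range(10)},
    }

    # Repeatability: pooled variance of repeats within each appraiser/object cell.
    cell_vars = [statistics.variance(reps)
                 for per_obj in measurements.values()
                 for reps in per_obj.values()]
    var_rpt = statistics.mean(cell_vars)

    # Reproducibility: variance of each appraiser's average across all readings.
    appraiser_means = [
        statistics.mean([x for reps in per_obj.values() for x in reps])
        for per_obj in measurements.values()
    ]
    var_rpd = statistics.variance(appraiser_means)

    var_ms = var_rpt + var_rpd   # sigma^2_MS = sigma^2_rpt + sigma^2_rpd
    print(f"repeatability={var_rpt:.4f}  reproducibility={var_rpd:.4f}  MS={var_ms:.4f}")
    ```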
 62. MSE Metrics – Precision
     %Gauge Repeatability & Reproducibility (%GR&R): the fraction of total variation consumed by measurement system variation.
     %GRR* = (σMS / σTotal) × 100%
     * Automotive Industry Action Group (AIAG) MSA Reference Manual, 3rd edition
 63. How Much Variation is Tolerable?
     If the %GRR is… then measurement error is…
     • < 10%: acceptable
     • between 10% & 30%: unacceptable for “critical” measurements (you should improve the measurement process)
     • > 30%: unacceptable
     * Reference: Automotive Industry Action Group (AIAG) MSA Reference Manual, 3rd edition
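    Combining the %GR&R formula from the previous slide with these thresholds, a minimal sketch; the variance inputs are illustrative (e.g., `var_ms` from the simple MSE sketch above).

    ```python
    # %GR&R = (sigma_MS / sigma_Total) * 100, classified per the AIAG thresholds.
    import math

    def pct_grr(var_ms: float, var_total: float) -> float:
        """Fraction of total variation consumed by the measurement system."""
        return math.sqrt(var_ms) / math.sqrt(var_total) * 100

    def verdict(grr: float) -> str:
        if grr < 10:
            return "acceptable"
        if grr <= 30:
            return "unacceptable for critical measurements; improve the measurement process"
        return "unacceptable"

    grr = pct_grr(var_ms=0.5, var_total=25.0)   # illustrative variances
    print(f"%GRR = {grr:.1f}% -> {verdict(grr)}")
    ```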
 64. MSE Calculations for Attribute Data 1 (see addenda for formulas, example)
     Conducting measurement system evaluation on attribute data is slightly different from continuous data. Two approaches for attribute data will be discussed:
     • Quick rule of thumb approach
     • Formal statistical approach (see addenda) [MPDI]
 65. MSE Calculations for Attribute Data 2
     Quick Rule of Thumb Approach
     1. Randomly select 20 items to measure.
        • Ensure at least 5-6 items barely meet the criteria for a “pass” rating.
        • Ensure at least 5-6 items just miss the criteria for a “pass” rating.
     2. Select two appraisers to rate each item twice.
        • Avoid one appraiser biasing the other.
     3. If all ratings agree (four per item), then the measurement error is acceptable; otherwise the measurement error is unacceptable. [MPDI]
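    A minimal sketch of the rule-of-thumb check, assuming ratings are recorded as four values per item (two appraisers × two trials); the items and ratings are illustrative.

    ```python
    # Attribute MSE rule of thumb: measurement error is acceptable only if
    # all four ratings per item agree.

    # ratings[item] = [appraiser1_run1, appraiser1_run2, appraiser2_run1, appraiser2_run2]
    ratings = {
        "item01": ["pass", "pass", "pass", "pass"],
        "item02": ["pass", "fail", "pass", "pass"],  # disagreement
        # ... remaining items of the 20 elided
    }

    disagreements = [item for item, r in ratings.items() if len(set(r)) != 1]

    if disagreements:
        print("measurement error unacceptable; disagreements on:", disagreements)
    else:
        print("all ratings agree; measurement error acceptable")
    ```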
 66. Analyze Guidance Questions
     • What do the data look like?
     • What is driving the variation?
     • What is the new baseline?
     • What are the risks and assumptions associated with the revised data set and baseline? [MPDI]
 67. Exploring the Data
     What do the data look like? What is driving the variation?
     Probing questions during data exploration:
     • What should the data look like? And, does it?
       - first principles, heuristics, or relationships
       - mental model of the process (refer to that black box)
       - what we expect, in terms of cause & effect
     • Are there yet-unexplained patterns or variation? If so,
       - conduct more Y to x analysis
       - plot, plot, plot using the basic tools
     • Are there hypothesized x’s that can be removed from the list?
     Objective: to completely identify the Y’s, little y’s, and x’s [MPDI]
 68. Exercise: Outliers
     What is an outlier?
     • a data point which does not appear to follow the characteristic distribution of the rest of the data
     • an observation that lies an abnormal distance from other values in a random sample from a population
     Consider this cost variance data:
     • 13, 22, 16, 20, 16, 18, 27, 25, 30, 333, 40
     • average = 50.9, standard deviation = 93.9
     If “333” is a typo and should have been “33”:
     • corrected average = 23.6, corrected standard deviation = 8.3
     But what if it’s a real value?
     In groups of 3, share your approach for deciding if and when to remove extreme values from data sets. [Frost 03], [stats-online]
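    A quick check of the slide’s arithmetic; `statistics.stdev` is the sample standard deviation, which matches the figures quoted.

    ```python
    # One typo ("333" for "33") dominates both the mean and the standard deviation.
    import statistics

    raw = [13, 22, 16, 20, 16, 18, 27, 25, 30, 333, 40]
    corrected = [13, 22, 16, 20, 16, 18, 27, 25, 30, 33, 40]

    print(statistics.mean(raw), statistics.stdev(raw))              # ~50.9, ~93.9
    print(statistics.mean(corrected), statistics.stdev(corrected))  # ~23.6, ~8.3
    ```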
 69. Removing Outliers
     There is no widely accepted automated approach to removing outliers.
     Approaches:
     • Visual
       - examine distributions, trend charts, SPC charts, scatter plots, box plots
       - couple with knowledge of the data and process
     • Quantitative methods
       - interquartile range
       - Grubbs’ test [Frost 03], [stats-online]
 70. Interquartile Range
     Description
     • A quantitative method for identifying possible outliers in a data set
     Procedure
     1. Determine the 1st and 3rd quartiles of the data set: Q1, Q3
     2. Calculate the difference: the interquartile range, or IQR
     3. Lower outlier boundary = Q1 – 1.5*IQR
     4. Upper outlier boundary = Q3 + 1.5*IQR [MPDI]
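    A minimal sketch of the four-step procedure, applied to the cost-variance data from the exercise. Quartile conventions differ across tools, so the quartiles here are taken from the worked example on the next slide rather than recomputed.

    ```python
    # IQR outlier boundaries for the cost-variance data.
    data = sorted([13, 22, 16, 20, 16, 18, 27, 25, 30, 333, 40])

    q1, q3 = 16, 30            # 1st and 3rd quartiles (per the worked example)
    iqr = q3 - q1              # interquartile range = 14

    lower = q1 - 1.5 * iqr     # 16 - 21 = -5
    upper = q3 + 1.5 * iqr     # 30 + 21 = 51

    outliers = [x for x in data if x < lower or x > upper]
    print(f"bounds=({lower}, {upper})  possible outliers={outliers}")  # [333]
    ```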
 71. Interquartile Range: Example
     Sorted cost variance data: 13, 16, 16, 18, 20, 22, 25, 27, 30, 40, 333
     • Q1 = 16, Q3 = 30
     • Interquartile range: IQR = 30 – 16 = 14
     • Lower outlier boundary: 16 – 1.5*14 = -5
     • Upper outlier boundary: 30 + 1.5*14 = 51
     The value 333 falls above the upper boundary and is flagged as a possible outlier.
     Example adapted from “Metrics, Measurements, & Mathematical Mayhem,” Alison Frost, Raytheon, SEPG 2003
 72. Tips About Outliers
     Outliers can be a clue to process understanding: learn from them.
     If outliers lead you to measurement system problems:
     • repair the erroneous data if possible
     • if it cannot be repaired, delete it
     Charts that are particularly effective at flagging possible outliers include:
     • box plots
     • distributions
     • scatter plots
     • control charts (if you meet the assumptions)
     Rescale charts when an outlier reduces visibility into variation.
     Be wary of the influence of outliers on linear relationships. [MPDI]
 73. When Not to Remove Outliers
     • When you don’t understand the process.
     • Because you “don’t like the data points” or they make your analysis more complicated.
     • Because the IQR or Grubbs method “says so.”
     • When they indicate a “second population.” Instead, identify the distinguishing factor and separate the data.
     • When you have very few data points.
     Innocent until proven guilty. [MPDI]
 74. Summary: Addressing Data Quality Issues
     Identify & remove data with poor quality.
     Identify & remove outliers.
     • Remember: innocent until proven guilty
     If you remove significant amounts of data, repair your measurement system.
     Quantify variation due to the measurement system.
     • Reduce variability as needed
     Determine the risks of moving ahead with process and product analysis:
     • Identify interpretation risks
     • Identify the magnitude of process/product problems relative to data problems
     • Identify undesirable consequences of not proceeding with data-driven process improvement, even in the face of data quality issues
 75. Agenda
     Six Sigma Benefits for Early Adopters – What the Books Don’t Tell You
     The 7 Six Sigma Tools Everyone Can Use
     I Hear Voices
     Dirty Data (and How to Fix It)
     Statistical Process Control – Where Does It Apply to Engineering?
     Convincing Senior Management: The Value Proposition
     Summary
     Cheap Sources of Information and Tools
     MSE Addendum
 76. A Typical Six Sigma Project in Engineering
     The organization notes that systems integration has been problematic on past projects (budget/schedule overruns).
     A Six Sigma team is formed to scope the problem, collect data from past projects, and determine the root cause(s).
     The team’s analysis of the historical data indicates that poorly understood interface requirements account for 90% of the overruns.
     Procedures and criteria for a peer review of the interface requirements are written, using best practices from past projects.
     A pilot project uses the new peer review procedures and criteria, and collects data to verify that they solve the problem.
     The organization’s standard SE process and training is modified to incorporate the procedures and criteria, to prevent similar problems on future projects.
 77. Applicability to Engineering
     System engineering processes are fuzzy:
     • Systems engineering “parts” are produced using processes lacking the predictable mechanizations assumed for manufacturing of physical parts
     • Simple variation in human cognitive processes can prevent rigorous application of the Six Sigma methodology
     • Process variation can never be eliminated, or may not even be reduced below a moderate level
     Results often cannot be measured in clear $ savings returned to the organization:
     • Value is seen in reduced risk, increased customer satisfaction, more competitive bids, …
 78. Additional Challenges
     Difficulty in collecting subjective, reliable data
     • Humans are prone to errors and can bias data
     • E.g., the time spent in privately reviewing a document
     Dynamic nature of an on-going project
     • Changes in schedule, budget, personnel, etc. corrupt data
     Repeatable process data requires the project/organization to define (and follow) a detailed process.
     Analysis requires that complex SE processes be broken down into small, repeatable tasks
     • E.g., peer review
 79. The Engineering Life-Cycle
     Requirements Analysis → Top Level Design → Detailed Design → Integration → Test
     The development process has many sources of variation:
     • Process
     • Measurement system
     • Personnel
     • Product
     • Technology
     • Management actions
     A stable (quantifiable) process must be chosen which is short and has limited sources of variation.
     • Must also have value in being predictable
 80. Typical Choices in Industry
     Most customers care about:
     • Delivered defects
     • Cost and schedule
     So organizations try to predict:
     • Defects found throughout the lifecycle
     • Effectiveness of peer reviews, testing
     • Cost achieved/actual (Cost Performance Index – CPI)
     • Schedule achieved/actual (Schedule Performance Index – SPI)
     [Chart: defect detection profile, defects/KSLOC by phase from requirements through 90 days after delivery, comparing all projects with a new process.]
 81. Peer Reviews
     Can we predict the number of errors found in a peer review?
     Could generate a control chart of errors detected over multiple reviews [individuals chart: Mean = 7.8, UCL = 11.60, LCL = 4.001].
     Must assume:
     • Product errors are normally and uniformly distributed
     • Same quality of reviews (number/ability of reviewers)
     • No other special causes (process is stable)
 82. Quantitative Management
     Suppose your project conducted several peer reviews of similar code, and analyzed the results:
     • Mean = 7.8 defects/KSLOC
     • +3σ = 11.60 defects/KSLOC
     • -3σ = 4.001 defects/KSLOC
     What would you expect the next peer review to produce in terms of defects/KSLOC?
     What would you think if a review resulted in 10 defects/KSLOC? 3 defects/KSLOC?
 83. Understanding the Process
     [Individuals chart: average (Mean = 7.8) with expected variation between LCL = 4.001 and UCL = 11.60.]
     Useful in evaluating future reviews, and in driving corrective and preventative actions:
     • Was the review effective?
     • Was the process different?
     • Is the product different?
 84. Improving the Process
     Reduce the variation:
     • Train people on the process
     • Create procedures/checklists
     • Strengthen process audits
     Increase the effectiveness (increase the mean):
     • Train people
     • Create checklists
     • Reduce waste and re-work
     • Replicate best practices from other projects
     [Chart: individuals chart after improvement, Mean = 7.268, UCL = 11.17, LCL = 3.363.]
 85. CMMI Level 4
     Organizational Process Performance
     • Establishes a quantitative understanding of the performance of the organization’s set of standard processes
     [Diagram: the organizational standard process and tailoring, the organizational measurement repository, and performance data & models feed the project’s defined process, along with customer and project objectives, to yield project performance.]
     Quantitative Project Management
     • Quantitatively manage the project’s defined process to achieve the project’s established quality and process-performance objectives.
 86. How Six Sigma Helps Process Improvement
     PI efforts often have little direct impact on the business goals
     • Confuses ends with means; results are measured in activities implemented, not outcomes
     Six Sigma delivers results that matter to managers (fewer defects, higher efficiency, cost savings, …)
     Six Sigma concentrates on problem solving in small groups, focused on a narrow issue
     • Allows for frequent successes (3-9 months)
     Six Sigma focuses on the customer’s perception of quality
 87. How Six Sigma Helps CMMI Adoption
     For an individual process:
     • CMM/CMMI identifies what activities are expected in the process
     • Six Sigma identifies how they can be improved (efficient, effective)
     Example – Project Planning
     • Could fully meet the CMMI goals and practices, but still write poor plans
     • Six Sigma can be used to improve the planning process and write better plans
     [Sidebar: Project Planning practices. SG 1 Establish Estimates: SP 1.1 Estimate the Scope of the Project; SP 1.2 Establish Estimates of Project Attributes; SP 1.3 Define Project Life Cycle; SP 1.4 Determine Estimates of Effort and Cost. SG 2 Develop a Project Plan: SP 2.1 Establish the Budget and Schedule; SP 2.2 Identify Project Risks; SP 2.3 Plan for Data Management; SP 2.4 Plan for Project Resources; SP 2.5 Plan for Needed Knowledge and Skills; SP 2.6 Plan Stakeholder Involvement; SP 2.7 Establish the Project Plan. SG 3 Obtain Commitment to the Plan: SP 3.1 Review Subordinate Plans; SP 3.2 Reconcile Work and Resource Levels; SP 3.3 Obtain Plan Commitment.]
 88. Approaches to Process Improvement
     Data-Driven (e.g., Six Sigma, Lean):
     • Clarify what your customer wants (Voice of Customer; Critical to Quality – CTQs)
     • Determine what your processes can do (Voice of Process; Statistical Process Control)
     • Identify and prioritize improvement opportunities (causal analysis of data)
     • Determine where your customers/competitors are going (Voice of Business; Design for Six Sigma)
     Model-Driven (e.g., CMM, CMMI):
     • Determine the industry best practice (benchmarking, models)
     • Compare your current practices to the model (appraisal, education)
     • Identify and prioritize improvement opportunities (implementation, institutionalization)
     • Look for ways to optimize the processes
 89. Agenda
     Six Sigma Benefits for Early Adopters – What the Books Don’t Tell You
     The 7 Six Sigma Tools Everyone Can Use
     I Hear Voices
     Dirty Data (and How to Fix It)
     Statistical Process Control – Where Does It Apply to Engineering?
     Convincing Sr. Management: The Value Proposition
     Summary
     Cheap Sources of Information and Tools
     MSE Addendum
 90. What Is Your Current Reality?
     What are your managers saying? Asking?
     How do your views differ?
     What would you like to convince them of?
     • What is your value proposition?
 91. Example: NGC Mission Systems
     The Six Sigma adoption decision started as a CEO mandate, but was embraced by the organization
     • Seen as a way to enable data-driven decision making
     • Integrated with CMMI and other PI initiatives
     • Engaged customers, who saw it as a way to solve their problems
     With experience, people saw that Six Sigma:
     • Was more than statistics
     • Could be applied to engineering
     • Greatly accelerated the understanding and adoption of CMMI Levels 4 and 5
     • Resulted in both hard and soft savings that could be quantified
 92. Example: Motorola
     The CMMI adoption decision: will it benefit existing Six Sigma initiatives?
     Executive sponsorship and engagement
     • Benchmarked with execs from a successful company, to witness the benefits first hand
     • Execs gave the sales pitch – their personal leadership sold it
     • Established upward mentoring: an MBB coach & CMMI expert for each exec
     Deployment – leveraging executive “pull”
     • Execs controlled the adoption schedule, to meet critical business needs
     • Modified the reward and recognition structure
     • “Rising star” program for both technical and management tracks
     • Training began at the top and worked its way down
     Execution – speaking the language of executives and the business
     • Calculated costs & benefits of all proposals; listed the intangibles
     • Risk reduction: start small, pilot, and build on successes [Siviy 05-2]
 93. Change Management
     D x V x F > R
     D = Dissatisfaction with the present
     V = Vision for the future
     F = First (or next) steps
     R = Resistance [Beckhard]
 94. What Drives Process Improvement?
     Performance issues: product, project
     • and, eventually, process issues
     Regulations and mandates
     • Sarbanes-Oxley
     • “Level 3” requirements to win contracts
     Business issues and “burning platforms”
     • lost market share or contracts
     • continuous cost and cycle time improvement
     • capitalizing on new opportunities
     There is compliance-driven improvement, and there is performance-driven improvement.
 95. Value Proposition: Six Sigma as Strategic Enabler
     The SEI conducted a research project to explore the feasibility of Six Sigma as a transition enabler for software and systems engineering best practices.
     Hypothesis: Six Sigma used in combination with other software, systems, and IT improvement practices results in
     • better selections of improvement practices and projects
     • accelerated implementation of selected improvements
     • more effective implementation
     • more valid measurements of results and success from use of the technology
     Achieving process improvement… better, faster, cheaper.
 96. Research Conclusions
     Six Sigma is feasible as an enabler of the adoption of software, systems, and IT improvement models and practices (a.k.a. “improvement technologies”).
     The CMMI community is more advanced in its joint use of CMMI & Six Sigma than originally presumed.
     For the organizations studied, Six Sigma adoption & deployment:
     • was frequently decided upon at the enterprise level, with software, systems, and IT organizations following suit
     • was driven by senior management’s previous experience and/or a burning business platform
     • was consistently comprehensive [IR&D 04]