1
CMMI Model Changes
for High Maturity
Herb Weiner
Pat O’Toole
2008 SEPG Conference
Tampa, Florida
2
Problem Statement
High maturity practices are not consistently
understood, applied, or appraised
 SEI is addressing the training and appraisal portions of
the CMMI Product Suite; e.g.,
 Understanding CMMI High Maturity Practices course
 Several recent presentations by SEI personnel
 High Maturity Lead Appraisers certification
 However, there is insufficient foundation for these
“raise-the-floor” interpretations in CMMI v1.2
 Goals do not establish the requirements
 Practices do not establish the expectations
 Informative material is purported to take on greater importance.
3
Eating Your Own Dog Food
Requirements Management SG1:
 Requirements are managed and
inconsistencies with project plans and work
products are identified
CMMI Product Suite Management SG1:
 CMMI model requirements are managed and
inconsistencies with CMMI training courses
and appraisal methods are identified.
4
Approach
 Draft proposed changes
 CMMI Model & SCAMPI Method Changes for High Maturity
(Herb Weiner, May 2007)
 Solicit feedback from SEI authorized people via ATLAS
 ATLAS = Ask The Lead AppraiserS
 ATLAS has been expanded to include CMMI instructors
 Candidate lead appraisers and instructors also included
 Publish results to SEI authorized individuals
 Submit CRs to SEI for consideration
 Update model to re-align the CMMI Product Suite.
5
ATLAS Feedback
For each proposed change, respondents indicated:
 Strongly support (It’s perfect!)
 Support (It’s better)
 Are ambivalent (It’s OK either way)
 Disagree (It’s worse)
 Strongly disagree (What were you thinking?)
Ratings were determined on a +1 to -1 scale as follows:
 Strongly support = +1.0
 Support = +0.5
 Ambivalent = 0.0
 Disagree = -0.5
 Strongly disagree = -1.0
For each change, the average rating will be displayed for:
 [High Maturity Lead Appraisers, Other SEI authorized individuals]
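As a side note, the displayed pair for each change is simply the mean of the numeric ratings for each respondent group. A minimal sketch of that arithmetic, using the scale above and hypothetical response counts:

```python
# Map each ATLAS response category to its numeric rating (scale from the slide above).
RATING = {
    "strongly support": 1.0,
    "support": 0.5,
    "ambivalent": 0.0,
    "disagree": -0.5,
    "strongly disagree": -1.0,
}

def average_rating(responses):
    """Average rating for one proposed change; responses is a list of category strings."""
    return sum(RATING[r.lower()] for r in responses) / len(responses)

# Hypothetical responses: 4 High Maturity Lead Appraisers, 5 other SEI authorized individuals.
hmla = ["strongly support", "support", "support", "ambivalent"]
others = ["support", "support", "ambivalent", "disagree", "strongly support"]
print(f"[{average_rating(hmla):.2f}, {average_rating(others):.2f}]")  # prints [0.50, 0.30]
```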
6
Proposed OPP Changes
7
OPP Proposed Change #1 of 4
Move SP 1.3 to SP 1.1
Current:
SP 1.1 Select Processes
SP 1.2 Establish Process-Performance Measures
SP 1.3 Establish Quality and Process-Performance
Objectives
Proposed:
SP 1.1 Establish Quality and Process-Performance
Objectives
SP 1.2 Select Processes
SP 1.3 Establish Process-Performance Measures
 MA, OPF, and QPM all establish objectives in their SP 1.1; OPP should follow the same structure.
(.50, .51)
8
OPP Proposed Change #2 of 4
Revise OPP SP 1.4
Current:
Establish and maintain the organization’s process-
performance baselines.
Proposed:
Conduct process-performance analyses on the selected
processes and subprocesses to verify process stability
and to establish and maintain the organization’s
process-performance baselines.
 SP 1.1 & 1.2 indicate process-performance analysis will
be conducted, but that’s the last we hear of it
 Baselines are established for stable processes
 Elevate this from informative to expected.
(.39, .42)
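The stability check implied by the proposed SP 1.4 wording is typically done with control charts. A minimal, hypothetical sketch (an XmR / individuals chart is only one of several acceptable techniques) of verifying stability before a measure is admitted to the organizational baseline:

```python
# Hypothetical sketch: check stability of a subprocess measure with an XmR
# (individuals / moving-range) chart before establishing a performance baseline.
def xmr_stability(values):
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = mean + 2.66 * mr_bar          # standard XmR chart constants
    lcl = mean - 2.66 * mr_bar
    signals = [v for v in values if v > ucl or v < lcl]
    return not signals, (lcl, mean, ucl), signals

# Hypothetical peer-review defect densities (defects per page) from recent projects.
densities = [0.42, 0.35, 0.51, 0.40, 0.38, 0.47, 0.44, 0.36, 0.49, 0.41]
stable, (lcl, mean, ucl), signals = xmr_stability(densities)
if stable:
    print(f"Stable; baseline mean {mean:.3f}, limits ({lcl:.3f}, {ucl:.3f})")
else:
    print(f"Special-cause signals {signals}; investigate before baselining")
```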
9
OPP Proposed Change #3 of 4
Revise OPP SP 1.5
Current:
Establish and maintain the process-performance models for the
organization’s set of standard processes.
Proposed:
Establish and maintain models that predict process performance
related to the quality and process-performance objectives.
 The SEI’s new training courses emphasize use of process-
performance models with respect to quantitative objectives
 Focusing this practice on these objectives achieves better alignment
between the model and training.
(.59, .50)
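For illustration only, one common form of process-performance model is a regression relating an outcome tied to a quality objective to a controllable factor. The data, factor, and objective below are hypothetical assumptions, not taken from the model or the training:

```python
# Hypothetical process-performance model: predict delivered defect density from
# peer-review effort, then compare the prediction to a quality objective.
import numpy as np

review_hours_per_ksloc = np.array([2.0, 3.5, 4.0, 5.5, 6.0, 7.5])       # historical (hypothetical)
delivered_defects_per_ksloc = np.array([1.9, 1.5, 1.4, 1.0, 0.9, 0.6])

slope, intercept = np.polyfit(review_hours_per_ksloc, delivered_defects_per_ksloc, 1)

OBJECTIVE = 0.8                       # hypothetical objective: <= 0.8 delivered defects / KSLOC
planned_review_hours = 7.0
predicted = slope * planned_review_hours + intercept
print(f"Predicted delivered defect density: {predicted:.2f} per KSLOC")
print("Objective predicted to be met" if predicted <= OBJECTIVE else "Objective at risk")
```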
10
OPP Proposed Change #4 of 4
Enhance the informative material
Proposed:
Modify informative material that suggests improving process performance, such as the examples found in OPP SP 1.3 (which imply that common causes of variation are addressed)
Add new informative material indicating that, at ML4/CL4, achieving such improvement might be addressed via OPF and GP3.1, while at ML5/CL5, it is more likely to be achieved through CAR, OID, and GP5.2
 In order to delineate level 4 from level 5, the model should avoid
implying that common causes of variation are addressed at level 4
 ML4/CL4: Process stability / execution consistency / special causes
 ML5/CL5: Improving capability / systemic improvement / common causes.
(.36, .44)
11
Proposed QPM Changes
12
QPM Proposed Change #1 of 4
Revise QPM SP 1.4
Current:
SP 1.4 Manage Project Performance
Monitor the project to determine whether the project’s objectives for quality
and process performance will be satisfied, and identify corrective action as
appropriate.
Proposed:
SP 1.4 Analyze Project Performance
Analyze the collective performance of the project's subprocesses to
predict whether the project's objectives for quality and process
performance will be satisfied and identify the need for corrective action as
appropriate.
 Fixes mismatch between the current title and practice statement
 Recognizes that project management deals with both quantitatively managed and non-quantitatively managed processes.
(.54, .57)
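One way to "analyze the collective performance of the project's subprocesses to predict" whether an objective will be satisfied is to combine subprocess performance distributions by simulation. A minimal sketch with hypothetical distributions and a hypothetical objective:

```python
# Hypothetical sketch: Monte Carlo combination of subprocess effort distributions
# to predict whether the project's total-effort objective will be satisfied.
import random

random.seed(1)

subprocesses = {                      # (mean, std dev) in staff-days -- hypothetical baselines
    "requirements": (20, 3),
    "design":       (35, 5),
    "code_review":  (15, 2),
    "test":         (40, 6),
}
OBJECTIVE = 120                       # hypothetical objective: total effort <= 120 staff-days
TRIALS = 10_000

hits = sum(
    sum(random.gauss(mu, sigma) for mu, sigma in subprocesses.values()) <= OBJECTIVE
    for _ in range(TRIALS)
)
probability = hits / TRIALS
print(f"Predicted probability of satisfying the objective: {probability:.0%}")
if probability < 0.8:                 # hypothetical threshold for corrective action
    print("Identify the need for corrective action")
```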
13
QPM Proposed Change #2 of 4
Add QPM SP 1.5
Current: <None>
Proposed:
SP 1.5 Use Process-Performance Models
Use calibrated process-performance models
throughout the life cycle to identify, analyze, and
execute corrective action when necessary.
 Currently, PPMs aren’t expected to be used in QPM
 But use throughout life cycle appears to be expected by SEI
 PPMs may support process or subprocess activities
 The new practice was added to SG 1, but it could have been added to SG 2.
(.39, .46)
14
QPM Proposed Change #3 of 4
Add QPM SP 2.3
Current: <None>
Proposed:
SP 2.3 Address Special Causes of Variation
Identify, address, and prevent reoccurrence of special causes of
variation in the selected subprocesses.
 “Special causes” are featured in SEI materials
 Currently “special causes” are only in QPM’s informative material
 The Glossary definition of “stable process” includes “…and prevent
reoccurrences of special causes”
 Add informative material to ensure that process performance data
and statistical techniques are used appropriately.
(.64, .48)
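A hypothetical sketch of the statistical side of the proposed practice: flagging special causes of variation in a selected subprocess with two common control-chart tests (the specific tests and data here are illustrative assumptions, not prescribed by the model):

```python
# Hypothetical sketch: detect special causes of variation in a selected subprocess.
def special_cause_signals(values, centre, sigma):
    signals = []
    for i, v in enumerate(values):
        if abs(v - centre) > 3 * sigma:                       # point beyond 3-sigma limits
            signals.append((i, "beyond 3-sigma limits"))
    for i in range(len(values) - 7):                          # 8 consecutive points on one side
        window = values[i:i + 8]
        if all(v > centre for v in window) or all(v < centre for v in window):
            signals.append((i, "run of 8 points on one side of centre"))
    return signals

# Hypothetical test-escape rates; the later observations show a sustained shift.
rates = [2.1, 1.9, 2.3, 2.0, 1.8, 2.2, 2.6, 2.7, 2.8, 2.9, 2.6, 2.7, 2.8, 2.9]
for index, reason in special_cause_signals(rates, centre=2.0, sigma=0.2):
    print(f"Observation {index}: {reason} -- identify, address, and prevent recurrence")
```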
15
QPM Proposed Change #4 of 4
Revise QPM SP 2.3 (now SP 2.4)
Current:
SP 2.3 Monitor Performance of the Selected Subprocesses
Monitor the performance of the selected subprocesses to
determine their capability to satisfy their quality and process-
performance objectives, and identify corrective action as
necessary.
Proposed:
SP 2.4 Analyze Performance of the Selected Subprocesses
Analyze the performance of the selected subprocesses to predict
their capability to satisfy their quality and process-performance
objectives, and identify and take corrective action as necessary.
 “Analyze” is a much stronger word than “monitor”
 “Predict” is a much stronger word than “determine”
 Emphasize “taking corrective action,” not just identifying it.
(.59, .46)
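As an illustration of "analyzing to predict capability," a one-sided process capability index against a hypothetical upper objective; the data and the 1.33 rule of thumb are assumptions for the sketch, not model requirements:

```python
# Hypothetical sketch: is a stable subprocess capable of satisfying its objective?
import statistics

def capability_upper(values, upper_spec):
    """Cpu = (USL - mean) / (3 * sigma); >= 1.33 is a common rule of thumb for 'capable'."""
    return (upper_spec - statistics.mean(values)) / (3 * statistics.stdev(values))

# Hypothetical inspection preparation rates (pages/hour); objective: stay at or below 12.
prep_rates = [9.1, 10.2, 8.7, 9.8, 10.5, 9.4, 10.0, 9.6, 10.8, 9.3]
cpu = capability_upper(prep_rates, upper_spec=12.0)
print(f"Cpu = {cpu:.2f}")
if cpu < 1.33:
    print("Not predicted to be capable -- identify and take corrective action")
```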
16
Proposed CAR Changes
17
CAR Proposed Change #1 of 7
Thematic Change
 Currently, there is little to suggest that CAR should target
statistically managed subprocesses to identify and analyze
common causes of variation to address:
 Stable processes with unacceptably high standard deviations;
 Stable processes not capable of achieving quality or process
performance objectives; and
 Stable and capable processes that might be improved to enhance
competitive advantage
 Change the focus of CAR’s specific goals and practices from
“defects and other problems” to “problems”
 By collapsing this phrase, model users will not limit their application of
CAR to the subset of problem candidates called “defects”
 Also include a discussion of “opportunities” in the informative material.
(.50, .46)
18
CAR Proposed Change #2 of 7
Revise CAR SG 1
Current:
SG 1 Determine Causes of Defects
Root causes of defects and other problems are
systematically determined.
Proposed:
SG 1 Determine and Analyze Causes
Common causes of variation and root causes of
problems are systematically analyzed.
 Reflects the Thematic Change
 “Analyzed” is a stronger word than “determined”.
(.56, .63)
19
CAR Proposed Change #3 of 7
Revise CAR SP 1.1
Current:
SP 1.1 Select Defect Data for Analysis
Select the defects and other problems for analysis.
Proposed:
SP 1.1 Select Data for Analysis
Select for analysis, using established criteria, quantitatively managed
processes that are candidates for improvement as well as problems
that have a significant effect on quality and process performance.
 Reflects the Thematic Change
 “Significant effect” emphasizes quantitatively managed processes.
(.64, .53)
20
CAR Proposed Change #4 of 7
Revise CAR SP 1.2 and add SP1.3-SP 1.4
Current:
SP 1.2 Analyze Causes
Perform causal analysis of selected defects and other problems and propose
actions to address them.
Proposed:
SP 1.2 Analyze Common Causes
Analyze common causes of variation to understand the inherent quality and
process performance constraints.
SP 1.3 Analyze Root Causes
Perform causal analysis on selected problems to determine their root
causes.
SP 1.4 Propose Actions to Address Causes
Propose actions to address selected common causes of variation and to
prevent recurrence of selected problems.
 Reflects the Thematic Change.
 Establishes expectations for BOTH common causes and root causes.
(.44, .57)
21
CAR Proposed Change #5 of 7
Add CAR SP 1.5
Current: <None>
Proposed:
SP 1.5 Predict Effects of Proposed Actions
Use process performance models and statistical
techniques to predict, in quantitative terms, the effects
of the proposed actions, as appropriate.
 Reflects the SEI’s expected use of PPMs and statistical
methods in high maturity organizations
 Supports proper cost/benefit analysis.
(.52, .58)
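A minimal sketch of predicting an action's effect "in quantitative terms": run a hypothetical, already-calibrated process-performance model with and without the proposed change and use the delta in the cost/benefit analysis. The model form and numbers below are assumptions for illustration:

```python
# Hypothetical sketch: predict the effect of a proposed action with a calibrated PPM.
def defect_density_model(review_hours_per_ksloc):
    """Hypothetical calibrated process-performance model (e.g. fitted from baselines)."""
    return max(0.0, 2.35 - 0.24 * review_hours_per_ksloc)

current_practice = 4.0    # peer-review hours per KSLOC today (hypothetical)
proposed_action = 6.0     # action proposal: increase peer-review effort (hypothetical)

before = defect_density_model(current_practice)
after = defect_density_model(proposed_action)
print(f"Predicted defect density: {before:.2f} -> {after:.2f} per KSLOC "
      f"({before - after:.2f} fewer defects per KSLOC)")
# The predicted delta feeds the cost/benefit analysis behind selecting action proposals.
```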
22
CAR Proposed Change #6 of 7
Revise CAR SG 2, SP 2.1 – SP 2.2
Current:
SG 2 Analyze Causes
Root causes of defects and other problems are systematically addressed to
prevent their future occurrence.
SP 2.1 Implement the Action Proposals
Implement the selected action proposals that were developed in causal analysis.
SP 2.2 Evaluate the Effect of Changes
Evaluate the effect of changes on process performance.
Proposed:
SG 2 Address Causes
Common causes of variation and root causes of problems are systematically
addressed to quantitatively improve quality and process performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a measurable
improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process performance.
23
CAR Proposed Change #6 of 7
Proposed: (Copied from previous slide)
SG 2 Address Causes
Common causes of variation and root causes of problems are
systematically addressed to quantitatively improve quality and process
performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a
measurable improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process
performance.
 Reflects the Thematic Change
 Wording enhanced to focus on measurable improvement of “quality and
process performance” – a phrase reserved for high maturity practices
 SP 2.2 modified to include quality as well as process performance
 A perceived oversight in the current practice.
(.46, .64)
24
CAR Proposed Change #7 of 7
Revise CAR SP 2.3
Current:
SP 2.3 Record Data
Record causal analysis and resolution data for use across the
project and organization.
Proposed:
SP 2.3 Submit Improvement Proposals
Submit process- and technology-improvement proposals based
on implemented actions, as appropriate.
 Proposed practice relies on OID to determine “use across the
project and organization”
 Recognizes that CAR may have been applied locally but the resulting
improvements may be more broadly applicable.
(.48, .41)
25
CAR Proposed Change #8 of 7
CAR is the only high maturity process area with
no lower-level foundation
 OPP – OPD & MA
 QPM – PP, PMC & IPM
 OID – OPF & OPD
Several alternatives were explored via ATLAS; ratings are shown as [High Maturity Lead Appraisers, Others]:
0. Leave CAR exactly as it is (-.08, -.19)
1. Add “Causal Analysis” PA at ML2 (-.45, -.55)
2. Add “Causal Analysis” PA at ML3 (-.45, -.26)
3. Add “Causal Analysis” practice to PMC SG2 (+.09, +.16)
4. Add “Issue & Causal Analysis” PA at ML2 (-.55, -.22)
5. Add “Causal Analysis” goal to OPF (-.45, -.22)
26
Proposed OID Changes
27
OID Proposed Change #1 of 7
Revise OID SG 1
Current:
SG 1 Select Improvements
Process and technology improvements, which contribute to
meeting quality and process-performance objectives, are selected.
Proposed:
SG 1 Select Improvements
Process and technology improvements are identified proactively,
evaluated quantitatively, and selected for deployment based on
their contribution to quality and process performance.
 Somewhat passive vs. very proactive
 Focus on quantitative evaluation and ongoing improvement.
(.66, .63)
28
OID Proposed Change #2 of 7
Revise OID SP 1.1
Current:
SP 1.1 Collect and Analyze Improvement Proposals
Collect and analyze process- and technology-improvement
proposals.
Proposed:
SP 1.1 Solicit Improvement Proposals
Solicit proposals for incremental process and technology
improvements.
 “Solicit” is more proactive than “collect”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Explicitly targets incremental improvements.
(.66, .43)
29
OID Proposed Change #3 of 7
Revise OID SP 1.2
Current:
SP 1.2 Identify and Analyze Innovations
Identify and analyze innovative improvements that could increase
the organization’s quality and process performance.
Proposed:
SP 1.2 Seek Innovations
Seek and investigate innovative processes and technologies that
have potential for significantly improving the organization’s quality
and process performance.
 “Seek and investigate” is more proactive than “identify”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Focuses on “significant” performance enhancement.
(.65, .50)
30
OID Proposed Change #4 of 7
Add OID SP 1.3
Current: <None>
Proposed:
SP 1.3 Model Improvements
Use process performance models, as appropriate, to
predict the effect of incremental and innovative
improvements in quantitative terms.
 Adds modeling as an additional “filter”
 Supports quantitative cost/benefit analysis.
(.68, .44)
31
OID Proposed Change #5 of 7
Revise OID SP 1.3 (now SP 1.4)
Current:
SP 1.3 Pilot Improvements
Pilot process and technology improvements to select
which ones to implement.
Proposed:
SP 1.4 Pilot Improvements
Pilot proposed improvements, as appropriate, to
evaluate the actual effect on quality and process
performance in quantitative terms.
 Piloting performed “as appropriate”
 Provides rationale for implementation.
(.70, .61)
32
OID Proposed Change #6 of 7
Revise OID SP 1.4 (now SP 1.5)
Current:
SP 1.4 Select Improvements for Deployment
Select process and technology improvements for
deployment across the organization.
Proposed:
SP 1.5 Select Improvements for Deployment
Select process and technology improvements for
deployment across the organization based on an
evaluation of costs, benefits, and other factors.
 Provides costs and benefits as the basis for selection
 “Other factors” provides flexibility.
(.67, .51)
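A hedged sketch of what an evaluation-based selection might look like; the candidates, figures, and the benefit/cost threshold are purely hypothetical, and in practice "other factors" would be weighed alongside:

```python
# Hypothetical sketch: rank candidate improvements for deployment by predicted
# benefit relative to cost, then select those above a threshold.
candidates = [
    # (name, predicted annual benefit, deployment cost) -- all values hypothetical
    ("automated regression suite", 90_000, 40_000),
    ("new inspection checklist",   25_000,  5_000),
    ("static-analysis tooling",    60_000, 35_000),
]

ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)
for name, benefit, cost in ranked:
    print(f"{name}: benefit/cost = {benefit / cost:.1f}")

selected = [name for name, benefit, cost in ranked if benefit / cost >= 2.0]
print("Selected for deployment:", selected)
```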
33
OID Proposed Change #7 of 7
Replace OID SP 2.3
Current:
SP 2.3 Measure Improvement Effects
Measure the effects of the deployed process and
technology improvements.
Proposed:
SP 2.3 Measure Improvement Effects
Evaluate the effects of deployed improvements on
quality and process performance in quantitative terms.
 Specifies evaluation criteria
 Indicates “quantitative” evaluation
 New informative material – update baselines/models.
(.70, .63)
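A minimal sketch of evaluating a deployed improvement "in quantitative terms": compare post-deployment data with the pre-deployment baseline and with the effect predicted before deployment. All numbers are hypothetical:

```python
# Hypothetical sketch: evaluate the effect of a deployed improvement on quality
# and process performance against the baseline and the predicted benefit.
import statistics

baseline = [1.45, 1.38, 1.52, 1.41, 1.47, 1.39]   # defect density before deployment (hypothetical)
post = [1.02, 0.95, 1.10, 0.98, 1.05]             # defect density after deployment (hypothetical)
predicted_improvement = 0.40                       # improvement predicted by the PPM (hypothetical)

actual_improvement = statistics.mean(baseline) - statistics.mean(post)
print(f"Actual improvement: {actual_improvement:.2f} (predicted {predicted_improvement:.2f})")
if actual_improvement < 0.8 * predicted_improvement:
    print("Benefit shortfall -- consider corrective action")
# Either way, update the affected process-performance baselines and models.
```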
34
What’s Next?
35
Change Requests
1. Since the feedback related to the
proposed changes was primarily
supportive, all will be submitted as Change
Requests to the SEI for consideration.
2. A change request has also been submitted for the
Understanding CMMI High Maturity Practices (UCHMP)
course to add an exercise re-writing high maturity
practices using the ATLAS results as the base.
36
Now It’s YOUR Turn!
Handout contains ATLAS #12Z proposing:
 Consolidating ML5 PAs into ML4
 Changing ML5 to “Sustaining Excellence”
 Achieve ML4
 ML4 = OPP, QPM, CAR, & OID
 No additional process areas at ML5
 Perform at high maturity for 2 contiguous years
 Demonstrate sustained business benefit as well
 Submit your input to PACT.otoole@att.net
 Results will be published to all submitters.
37
Questions?
???
38
Download & Contact Information
Refer to the following websites to:
 Contact the authors
 Download the final SEPG 2008 presentation
 Download the supporting ATLAS 12A – 12D
results
 Download the CMMI Model and SCAMPI Method
Changes presentation from the May 2007 San
Francisco Beyond CMMI v1.2 Workshop
Herb Weiner
Herb.Weiner@welchallyn.com
www.highmaturity.com
Pat O’Toole
PACT.otoole@att.net
www.pactcmmi.com

Editor's Notes

  • #8 The full wording of Item #1 and its rationale is:
    Move SP 1.3 to SP 1.1, shifting SP 1.1 and SP 1.2 to SP 1.2 and SP 1.3 respectively.
    Current: SP 1.1 Select Processes; SP 1.2 Establish Process-Performance Measures; SP 1.3 Establish Quality and Process-Performance Objectives
    Proposed: SP 1.1 Establish Quality and Process-Performance Objectives; SP 1.2 Select Processes; SP 1.3 Establish Process-Performance Measures
    Rationale: There are three other process areas in which “objectives” are established – MA, OPF, and QPM. In each of these other process areas, objectives are established in SP 1.1, and the other practices focus on accomplishing them. It is suggested here that OPP be structured in the same manner. Granted, the current ordering may have more intuitive appeal to an “emerging” ML4 organization, but the proposed ordering reflects more of a steady state (i.e., institutionalized) condition.
  • #9 The full wording of Item #2 and its rationale is: OPP SP 1.4
    Current: OPP SP 1.4: Establish and maintain the organization’s process-performance baselines.
    Proposed: OPP SP 1.4: Conduct process-performance analyses on the selected processes and subprocesses to verify process stability and to establish and maintain the organization’s process-performance baselines.
    Rationale: The current OPP SP 1.1 and SP 1.2 both imply that process-performance analysis will be conducted and yet that’s the last we hear of it – so it is proposed that such analyses are explicitly performed here. In addition, as currently emphasized in the informative material, the proposed practice wording suggests establishing baselines for stable processes, a necessary prerequisite for quantitative management.
  • #10 The full wording of Item #3 and its rationale is: OPP SP 1.5
    Current: OPP SP 1.5: Establish and maintain the process-performance models for the organization’s set of standard processes.
    Proposed: OPP SP 1.5: Establish and maintain models that predict process performance related to the quality and process-performance objectives.
    Rationale: The new training courses emphasize the use of process-performance models to compose the defined process and to predict future performance throughout the life cycle with respect to the quantitative objectives. Focusing the expected model component on these objectives achieves better alignment between the model and training.
  • #11 The full wording of Item #4 and its rationale is:
    Proposed: Add informative material that addresses improving process performance such as that found in the example box associated with OPP SP 1.3, subpractice 1, “Decrease the cost of maintenance of the products by a specified %.” (And OPP SP 1.3, subpractice 2, “Shorten time to delivery to a specified % of the process-performance baseline.”) The new informative material should indicate that, at ML4, achieving such improvement might be addressed (or at least attempted) via OPF and GP3.1, while at ML5, such improvement is more likely to be achieved through CAR, OID, and GP5.2. It may be best to add this new informative material as one or more subpractices for SP 1.3 (proposed to become SP 1.1), Establish Quality and Process-Performance Objectives, and/or as a paragraph supplementing OPP SP 1.4, subpractice 5, “Compare the organization’s process-performance baselines to the associated objectives.” In addition, the possibility of revising the objectives to align more closely with the known level of process performance should also be discussed in this new informative material.
    Rationale: In order to delineate CL4/ML4 from CL5/ML5 more clearly, the model should avoid implying that common causes of variation are systematically addressed at CL4/ML4. The SP 1.3 examples indicated above would require changing the currently stable range of process performance, a concept more closely aligned with the ML5 process areas. Similar additions to the informative material will be proposed in ATLAS #12B for QPM as well.
  • #24 The full wording of Item #4’s rationale is: Similar to the proposed change for QPM SP 1.4, the word “monitor” is much too weak a word; “analyze” is a word that better reflects the activity expected to be performed by high maturity organizations. Similarly, it is suggested that the performance of selected subprocesses be used to “predict” rather than merely “determine” their capability to satisfy their quality and process-performance objectives. “Predict” implies a higher degree of sophistication than does “determine,” and is more closely aligned with the expected behavior of high maturity organizations. The current practice expects project personnel to “identify corrective action”; the proposed practice expects them to “identify and take corrective action.” The informative material of this practice should be expanded to refer to the proposed practice SP 1.5 and its use of process-performance models to “identify, analyze, and execute corrective action when necessary.” It should also refer to PMC SG 2 for corrective action that does not warrant the use of process-performance models. The informative material should also be enhanced to discuss managing the inherent variation of the measurement system to heighten the probability that the measurement system is providing more “signal” than “noise.”
  • #28 Full explanation of Rationale: The current wording of SG 1 is somewhat passive as it focuses on meeting current quality and process-performance objectives. It implies that once the objectives are being met, the urgency for ongoing improvement is diminished. The ML5 concept of “optimizing” demands that organizations continuously and proactively seek ways to exceed, not merely meet, these expectations (i.e., once they’ve achieved “world class” they then strive for “universe class!”) In addition to proactively soliciting improvement proposals, ML5 organizations should experiment with both existing and emerging technologies in an effort to push their quality and process performance to the next level. The proposed changes to the specific practices supporting SG 1 also reflect this more proactive posture. Note: new informative material should be added to every OID specific practice that uses the word “quantitatively.” Although not explicitly stated in the practices themselves, the informative material should strongly encourage the use of process performance baselines and statistical methods where appropriate.
  • #29 The full wording of the Rationale is: The proposed wording of SP 1.1 introduces the following changes:
    a. “Collect” is replaced by the more proactive verb, “Solicit.”
    b. The “analyze” portion of this practice is deferred to OID SP 1.3 and SP 1.4.
    c. The practice explicitly targets “incremental” improvements thereby differentiating SP 1.1 more clearly from the “innovative” improvements covered by SP 1.2.
  • #30 The full wording of the rationale is: The proposed wording of SP 1.2 introduces the following changes:
    a. “Identify” is replaced by the more proactive verbs, “seek and investigate.”
    b. The “analyze” portion of this practice is deferred to OID SP 1.3 and SP 1.4.
    c. The practice explicitly targets improvements that significantly enhance performance. Innovative change is disruptive and may not be warranted to achieve marginal benefits.
  • #32 The full wording of the rationale is: The proposed wording of SP 1.4 introduces the following changes:
    a. Use of the term “as appropriate” was added to indicate that not all incremental or innovative improvements need to be piloted.
    b. “…to select which ones to implement” is somewhat mealy and does not reflect the expected behavior of an ML5 organization. The reworded practice is much more explicit in this regard.
  • #34 The full wording of the rationale is: The wording of the existing practice provides little direction as to what should be measured and how the measures are to be used. The subpractices extend the measures beyond “the effects of the … improvements” suggesting that actual cost, effort, and schedule for deploying each improvement be captured as well. The proposed wording of SP 2.3 indicates that the organization should evaluate the improvement using the same metrics that were predicted via process modeling and/or initially achieved during the pilots. Furthermore, new informative material should indicate that this evaluation may result in the need to adjust the implementation of recently deployed improvements, to enhance the associated tailoring guidelines, and/or to initiate other forms of corrective action. It’s not enough to simply “measure the effects;” rather, the organization should strive to achieve the benefits. Finally, new informative material should remind the organization to update the corresponding process performance baselines and models based on the quantitative results achieved by the deployed improvements. If quality and process performance has, indeed, been improved, then the organization should expect to continue deriving these benefits in the future.