Page 1
High Maturity Implementation:
Pitfalls and Misconceptions
At CSI-SPIN (Mumbai), Sept 27, 2010
Rajesh Naik
QAI India Ltd
Page 2
Agenda
• Process Performance Models
• Sub-Process Control
• Managing Process Improvements
• Typical misconceptions and pitfalls
Page 3
Source: How Does High Maturity Benefit the Customer? – Rick Hefner, Northrop Grumman
CMMI® Levels
Page 4
Source: SEI Webinar A Mini Tutorial for Building CMMI Process Performance Models – Stoddard, Schaaff, Young & Zubrow
Page 5
PPMs are complex
- because reality is complex
• I want to go from my residence to my friend’s place
• I have many options (have you heard the claim – “we don’t have options”?)
• With a little thought we can come up with options – all
seem valid
[Diagram: route options from My House to Friend’s House – directly by Taxi, or by Bus/Auto to the bus terminus and then Bus/Auto onward]
Page 6
• There is a combination of resources that I would like to optimize
– Energy level (physical, emotional) [Quality]
– Money [Cost/ effort]
– Elapsed time [Schedule]
(some may be more important than others, some may
start pinching when they cross a threshold)
• I may also have constraints on some of the
resources (e.g., I can spend a max of 3 hours
elapsed time; or I don’t want to spend more than
Rs 500 on the journey)
PPMs are complex
- because reality is complex (contd.)
Page 7
PPMs are complex
- because reality is complex (contd.)
• Each step of the journey (each process) would
consume (or sometimes add back) some of the
resources
From       To             Mode   Energy     Money    Time
My Res     Friend's Res   Taxi   0.5 unit   Rs 400   1 hour
My Res     Terminus       Bus    1.0 unit   Rs 50    1 hour
My Res     Terminus       Auto   1.0 unit   Rs 120   45 mins
Terminus   Friend's Res   Bus    1.0 unit   Rs 50    1 hour
Terminus   Friend's Res   Auto   1.0 unit   Rs 120   45 mins
What is the simplification in the above table?
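To make the composition concrete, here is a minimal Python sketch (not from the original presentation) that adds up the leg-level figures from the table above into end-to-end predictions and checks them against the constraints mentioned earlier; the names and structure are illustrative assumptions only.

```python
# Minimal sketch: compose per-leg "sub-process performance" into
# end-to-end predictions, then filter by the stated constraints.
# Figures come from the table above; structure is illustrative.

LEGS = {
    ("My Res", "Friend's Res", "Taxi"):   {"energy": 0.5, "money": 400, "mins": 60},
    ("My Res", "Terminus", "Bus"):        {"energy": 1.0, "money": 50,  "mins": 60},
    ("My Res", "Terminus", "Auto"):       {"energy": 1.0, "money": 120, "mins": 45},
    ("Terminus", "Friend's Res", "Bus"):  {"energy": 1.0, "money": 50,  "mins": 60},
    ("Terminus", "Friend's Res", "Auto"): {"energy": 1.0, "money": 120, "mins": 45},
}

# The route options from the earlier diagram: direct taxi, or any
# combination of Bus/Auto to the terminus and Bus/Auto onward.
OPTIONS = [
    [("My Res", "Friend's Res", "Taxi")],
    [("My Res", "Terminus", "Bus"),  ("Terminus", "Friend's Res", "Bus")],
    [("My Res", "Terminus", "Bus"),  ("Terminus", "Friend's Res", "Auto")],
    [("My Res", "Terminus", "Auto"), ("Terminus", "Friend's Res", "Bus")],
    [("My Res", "Terminus", "Auto"), ("Terminus", "Friend's Res", "Auto")],
]

MAX_MINS, MAX_MONEY = 180, 500   # constraints: 3 hours, Rs 500

for option in OPTIONS:
    total = {"energy": 0.0, "money": 0, "mins": 0}
    for leg in option:
        for key in total:
            total[key] += LEGS[leg][key]
    feasible = total["mins"] <= MAX_MINS and total["money"] <= MAX_MONEY
    route = " -> ".join(mode for _, _, mode in option)
    print(f"{route:12s} {total}  feasible={feasible}")
```

Even this toy version shows why single-parameter models mislead: the cheapest option (Bus + Bus) is not the fastest (Taxi), and the least tiring option is also the most expensive.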
Page 8
PPMs are complex
- because reality is complex (contd.)
• Many simplifications, significant enough to
make a difference in the choices made
1. Not taking into account wait times to get the
transport
2. Assuming that all values are invariant, fixed and
deterministic
• Look at the table in the previous slide and
examine whether the above two factors could
have a significant impact on your choice
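One way to remove both simplifications is to treat wait time and travel time as distributions and simulate many journeys – the Monte Carlo style often used in PPMs. A hedged sketch follows; the distribution choices and parameters are invented for illustration, not calibrated baselines.

```python
# Sketch: Monte Carlo over the Bus + Bus option, with wait time and
# travel time treated as random rather than fixed.
# Distribution choices and parameters are assumptions for illustration.
import random

def bus_leg_minutes(rng: random.Random) -> float:
    wait = rng.uniform(0, 20)                 # wait at the stop: 0-20 min
    travel = rng.lognormvariate(4.1, 0.15)    # right-skewed, median ~60 min
    return wait + travel

rng = random.Random(42)
N = 10_000
totals = sorted(bus_leg_minutes(rng) + bus_leg_minutes(rng) for _ in range(N))

print("median journey     :", round(totals[N // 2], 1), "min")
print("90th percentile    :", round(totals[int(N * 0.9)], 1), "min")
print("P(exceeds 3 hours) :", sum(t > 180 for t in totals) / N)
```

With variation included, the question changes from “which option is fastest?” to “which option is most likely to meet the 3-hour constraint?” – which is exactly the kind of question a PPM is meant to answer.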
Page 9
Outcome of Complex Process is
difficult to predict intuitively
Source: SEI Webinar A Mini Tutorial for Building CMMI Process Performance Models – Stoddard, Schaaff, Young & Zubrow
Page 10
Source: SEI Webinar A Mini Tutorial for Building CMMI Process Performance Models – Stoddard, Schaaff, Young & Zubrow
Outcome of Complex Process is
difficult to predict intuitively
Page 11
Source: SEI Webinar A Mini Tutorial for Building CMMI Process Performance Models – Stoddard, Schaaff, Young & Zubrow
Outcome of Complex Process is
difficult to predict intuitively
Page 12
Source: SEI Webinar A Mini Tutorial for Building CMMI Process Performance Models – Stoddard, Schaaff, Young & Zubrow
Page 13
Issues seen in PPM Implementation
• PPMs used only as forecasting tools
• “We do not have ANY choices”
• PPMs used for a single parameter – assumption
is that we have unlimited other resources
• PPMs used in a stand-alone manner – one for
defect prediction, one for effort, one for schedule
– in reality every choice potentially impacts all three
simultaneously (everything is interdependent)
Page 14
Issues seen in PPM Implementation
(contd.)
• Separate, unrelated PPMs used in each phase
– ignoring the fact that phases depend on each other
(defect density found may be dependent on the defect
density present)
• Variation of processes and sub-processes not
taken into account
• Skill of people/ team not considered in the PPM
as a factor that impacts cost, schedule, defects
• Ignoring the process tailoring done while
evaluating PPMs
• Not re-evaluating the process composition after
some progress in the project
Page 15
Issues seen in PPM Implementation
(contd.)
• Assuming a “normal” (symmetric) distribution – no real phenomenon involving human beings has a “normal” distribution – only gambling situations and computer games do
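A quick way to see why the symmetric assumption hurts is to compute a naive “mean ± 3 sigma” band on right-skewed data, which is what effort, duration, and defect data usually look like. A small sketch with invented values:

```python
# Sketch: a symmetric "mean +/- 3 sigma" band on right-skewed data.
# The sample (review effort in hours) is invented for illustration.
from statistics import mean, stdev

review_effort_hours = [2, 2, 3, 3, 3, 4, 4, 5, 6, 8, 9, 12, 18, 30]

m, s = mean(review_effort_hours), stdev(review_effort_hours)
print(f"mean = {m:.1f}, stdev = {s:.1f}")
print(f"symmetric 3-sigma band: {m - 3 * s:.1f} to {m + 3 * s:.1f}")
# The lower limit comes out strongly negative (impossible for effort),
# a sign that the symmetric model does not describe this data.
```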
Page 16
Issues seen in PPM Implementation
(contd.)
• Assuming that changing the values of some
process parameters will change process
behavior (without actually changing the process).
Here is a classic one – if we increase the review
effort, we will find more defects.
– if you don’t change the review process, why will it take
more effort?
• Underlying data in PPMs not based on true
process/ sub-process performance baselines
• PPMs trying to optimize “Schedule Variance”
and “Effort Variance”
– (Thankfully, we don’t try to optimize “defect variance”)
Page 17
Sub-Process Control
• Choosing sub-processes and parameters to control
– High contribution to the overall project for one or more
parameters (effort, schedule, quality)
– High contribution to the variation in the overall project
for one or more parameters (effort, schedule, quality)
– The sub-process and parameters are appropriate for
statistical process control
• You have control over the parameter - you can change something in the process
• Statistical tool – SPC charts
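As one concrete illustration of the SPC-chart bullet, here is a minimal XmR (individuals and moving range) limit calculation; the data and the choice of chart are assumptions for illustration, not a recommendation from the presentation.

```python
# Sketch: XmR (individuals and moving range) control limits.
# Sample values (e.g., review speed per work product) are invented.
values = [12.0, 10.5, 11.2, 13.4, 9.8, 12.7, 11.9, 10.1, 12.3, 11.4]

moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
x_bar = sum(values) / len(values)
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard XmR constants: 2.66 for the individuals chart, 3.267 for the MR chart.
ucl_x, lcl_x = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar

print(f"X-bar = {x_bar:.2f}, limits = ({lcl_x:.2f}, {ucl_x:.2f}), MR UCL = {ucl_mr:.2f}")
print("out-of-control points:", [v for v in values if not lcl_x <= v <= ucl_x])
```

Limits like these are only meaningful if the choices in the bullets above hold – one homogeneous sub-process, a parameter you can actually act on, and a large enough contribution to the project outcome to be worth controlling.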
Page 18
Issues seen in Sub-process Control
Implementation
• Confusing “sub-process” with “parameter”
– We are controlling “schedule variance” sub-process
• Sub-process at a very high level (not really a
sub-process, but an aggregate)
• Trying to control output, instead of the
controllable input/ process
– You only monitor the output
– But you can control the inputs and the process
– E.g.,
• You cannot control your weight (output)
• But you can control your diet and exercise
Page 19
Issues seen in Sub-process Control
Implementation (contd.)
• Data that is used is not actually from the
same sub-process. E.g.,
– Running speed is plotted – but from races of different distances (100 meters to marathon)
– Coding productivity from programs of different
sizes and complexity
– Coding productivity - taken from the
performance of people with different skill
levels
Page 20
Issues seen in Sub-process Control
Implementation (contd.)
• Accepting huge variation (wide range of process
control limits) – because all data points follow
the rules of process stability (missing the forest
for the trees)
• Using an arbitrary sequence in the control chart
(e.g., should we sequence by start date, or end
date?)
• Ignoring the fact that points with a large base have smaller variation by their very nature
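The last point is why attribute charts use limits that vary with the base of each observation: the same defect density computed over a large module naturally varies less than over a tiny one. A sketch of u-chart style limits, with invented sizes and counts:

```python
# Sketch: u-chart style limits -- the control band narrows as the base
# (size) of each observation grows. Module sizes and counts are invented.
import math

sizes_kloc = [0.5, 2.0, 8.0, 1.0, 15.0, 4.0]    # base for each point
defects    = [3,   9,   30,  5,   52,   17]

u_bar = sum(defects) / sum(sizes_kloc)           # overall defects per KLOC

for n, d in zip(sizes_kloc, defects):
    density = d / n
    sigma = math.sqrt(u_bar / n)                 # standard error shrinks with the base
    ucl, lcl = u_bar + 3 * sigma, max(0.0, u_bar - 3 * sigma)
    print(f"size={n:5.1f} KLOC  density={density:5.2f}  limits=({lcl:.2f}, {ucl:.2f})")
```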
Page 21
Issues seen in Sub-process Control
Implementation (contd.)
• Discarding “outliers”, till all remaining data
points show stability of the sub-process
• Using baseline control limits, without
qualitatively determining that the sub-
process continues to be the same
• Ignoring the phenomenon that measurement and focus themselves have an impact on stability
Page 22
Managing Process Improvements
OID & CAR
• Involves
– Specifying improvement objectives
– Identifying processes/ sub-processes to be
improved
– Piloting proposed process improvements
– Checking the impact; refining the
improvement
– Deploying the change
– Measuring the impact (after large scale
deployment)
Page 23
Issues seen in Process Improvement
Implementation
• Drawing a cause-effect relationship from correlation (higher review effort -> more defects found)
• Measuring the improvement in just one
parameter (defects found) while ignoring the
impact on other parameters (effort, schedule)
• Not ensuring that the conditions for “before” and “after” are the same (except for the change being tried)
– Is the skill level the same?
– Is the input the same?
Page 24
Issues seen in Process Improvement
Implementation (contd.)
• Taking an isolated view of the
improvement (not looking downstream)
• Ignoring the impact of measurement and
attention that is being focused on the
improvement
– Not checking over long durations
• Not setting the right hypotheses for
testing; and not using the right tool for
testing the hypotheses
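For the last bullet, one reasonable pairing of hypothesis and tool is sketched below: comparing pilot (“after”) data against baseline (“before”) data with a Mann-Whitney U test, which does not assume normality. The data values and the scipy dependency are assumptions for illustration.

```python
# Sketch: did the piloted change raise review yield?
# Uses a distribution-free test instead of assuming normality.
# Data values are invented; requires scipy.
from scipy.stats import mannwhitneyu

defects_per_kloc_before = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2]
defects_per_kloc_after  = [5.2, 4.9, 5.8, 5.1, 4.6, 5.5]

# H0: no shift in review yield; H1: yield is higher after the change.
stat, p_value = mannwhitneyu(defects_per_kloc_after,
                             defects_per_kloc_before,
                             alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
print("significant at 5%:", p_value < 0.05)
```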
Page 25
Issues seen in Process Improvement
Implementation (contd.)
• Assuming that changing a quantitative parameter will bring about the improvement (without changing the input or the process), e.g.,
– If we increase the test effort then more
defects will be found (but if we use the same
test process, how can we fruitfully utilize the
increased test effort?)
Page 26
What we should see in future
High Maturity Implementations
• More comprehensive / holistic analysis
• Models should factor in important “soft” influencers
– Skills/ Cross-skills (IPPD?)
– Team work/ gelled teams (IPPD?)
– Impact of empowerment (IPPD?)
– Impact of measurement
– Impact of management focus
Page 27
About this Presentation
More resources on the subject are available from the creator of this presentation at:
http://www.rajeshnaik.com
© Rajesh Naik, 2010
This work is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. This means you can use it for non-commercial purposes so long as you include the copyright line “© Rajesh Naik, 2010”. If you create derivative works using this work, they should also be made available under a similar license. For further information go to http://creativecommons.org/licenses/by-nc-sa/3.0/. For uses outside the scope of the license, contact Rajesh Naik at naik.rajeshnaik@gmail.com
Author: Rajesh Naik
Founding Partner
QAI India Limited
naik.rajeshnaik@gmail.com
+91 9845488767
PPT Template Copyright © 2010 PowerPoint Styles from http://www.powerpointstyles.com
Page 28
Thank You
Rajesh Naik
Consulting Partner
QAI India Limited
Email
rajesh.naik@qaiglobal.com
OR
naik.rajeshnaik@gmail.com
Mobile
+91 9845488767
Rajesh Naik
Founding Partner
QAI India Limited
Email
naik.rajeshnaik@gmail.com
Mobile
+91 9845488767
Website
www.rajeshnaik.com
Also, have a look at the latest “business novel”:
Aligning Ferret: How an Organization Meets
Extraordinary Challenges
By Swapna Kishore & Rajesh Naik
Available at Amazon:
http://www.amazon.com/dp/B00CZA94XC
