Is CMMI a guarantee of performance improvement? - Isabel Margarido (Critical Software)

Transcript

  • 1. Is CMMI® a Guarantee of Performance Improvement? Isabel Lopes Margarido, isabel.margarido@gmail.com; João Pascoal Faria, Raul Moreira Vidal – FEUP; Marco Vieira – FCTUC. CMMI Portugal: 3rd of October, Braga.
  • 2. Agenda: introduction, approach overview, metamodel, procedures, conclusion.
  • 3. CMMI benefits – Performance Improvements over Time by Category [1]:
    Performance Category      Median Improvement
    Cost                      34%
    Schedule                  50%
    Productivity              61%
    Quality                   48%
    Customer Satisfaction     14%
    Return on Investment      4.0 : 1
  • 4. Motivation [4][6][7][8]: CMMI performance depends on the implementation method. SCAMPI results depend on the organisation's honesty and on the quality of the appraisal team, and cover only a small percentage of the organisation with a limited number of affirmations [3-6].
  • 5. Implementation problems:
    – understanding the statistical nature of level 4 [9][10][14]
    – lack of institutionalisation [6][14]
    – uncorrelated and meaningless metrics [10][14]
    – dissemination problems [14]
    – metrics definition (collecting and analysing data) [12][13][14]
    – metrics categorisation [14]
    – tools setup [14]
    – not all projects are measurable [14]
    – overhead [14]
  • 6. Problem statement: the CMMI model shows high variability of performance, depending on the methods used and on the quality of implementation. Organisations face difficulties in the selection of implementation methods, and there is a need for a performance evaluation method: CMMI V1.3 is more focused on performance, but SCAMPI does not measure performance.
  • 7. Purpose. Objectives of the research: develop an evaluation framework; select implementation methods; select performance indicators to allow early evaluation of the quality of implementation of the model, of organisational performance, and of the impact of process improvement initiatives on organisational performance. Beneficiaries: organisations implementing CMMI, and the SEI, which could easily verify performance improvement from one SCAMPI to the next.
  • 8. Concept. [Diagram: theory – quality principles (CMMI-DEV, CMMI-ACQ, CMMI-SVC, ISO, PMI) and improvement techniques (Lean, Six Sigma, TSP, Agile) – combines with organisation constraints in the implementation of operational practices (PP, PMC, …) through tools, techniques and procedures, determining the quality of implementation and organisational performance.]
  • 9. Concept. [Diagram: maturity levels ML2-ML5 contain n process areas (PA); each PA has specific goals (SG1, SG2, …), each with n specific practices (SP1.1, SP2.1, …); each practice is measured by 0 or 1 performance indicators (PI) and implemented by n implementation methods.]
  • 10. Approach overview – evaluation framework application. [Diagram: the CMMI model (goals/practices) and a repository mapping methods to CMMI practices feed the evaluation framework; base measures are used to calculate process/product and leading/lagging performance indicators (PIs); evaluations are recorded as semaphores (colour, optional numerical value, time, source, target, thresholds) aggregated per goal/practice, method and PI and per project, department and organisation. Tailoring: supporting performance indicators, potential influences, profiles. Execution: quality of implementation (ML, methods), organisation performance, performance improvement, with procedures including aggregation and rating.]
  • 11. Approach overview – quality of implementation: the degree of implementation of a practice with a given method; some performance indicators measure it. Performance indicators measure the organisation's performance and the quality of implementation; their aggregation indicates the degree of institutionalisation, necessary for generic goals and high maturity evaluation.
  • 12. Definitions. [Diagram: base measures 1 and 2 of attributes x and y feed a derived measure, which feeds a performance indicator.]
    – base measure: a measure of a single property/characteristic (attribute) of a product, process, project or resource [15]
    – performance indicator: a measure that provides an estimate/evaluation of an attribute, derived from base measures or other derived measures [15]
    – leading indicator: anticipates quality; allows forecasting and diagnosis [16-17]
    – lagging indicator: follows an event or tendency; allows appraising [17]
    – methods: good practices, procedures, techniques, tools, etc., that are part of the processes of the organisation
    (A code sketch of these definitions follows below.)
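A minimal sketch in Python of how these definitions compose, using the Schedule Estimation Error PI that appears in the rating example later in the deck; the class and function names, and the concrete formula, are illustrative assumptions rather than part of the presentation:

```python
# Sketch: two base measures feed a derived measure that serves as a
# performance indicator. Names and formula are assumptions.
from dataclasses import dataclass

@dataclass
class BaseMeasure:
    """Measure of a single attribute of a product, process, project or resource."""
    attribute: str
    value: float

def schedule_estimation_error(estimated: BaseMeasure, actual: BaseMeasure) -> float:
    """Derived measure usable as a (lagging) performance indicator:
    deviation of the actual from the estimated duration, in percent."""
    return 100.0 * (actual.value - estimated.value) / estimated.value

estimated = BaseMeasure("estimated duration (days)", 40.0)
actual = BaseMeasure("actual duration (days)", 50.0)
print(schedule_estimation_error(estimated, actual))  # 25.0
```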
  • 13. [Metamodel diagram – CMMI model and evaluation framework: a Method evaluates a Goal/Practice and supports Performance Indicators (PIs); base measures are used to calculate PIs; PIs specialise into process/product and leading/lagging PIs and influence goals/practices. The support rating takes the values Directly Addressed, Partially Addressed, Supports, Not Addressed or Unrated [7].]
  • 14. Evaluation. [Metamodel diagram – repository: a Semaphore has a colour (red, yellow, green), an optional numerical value, a time, a source (org, dep or proj), a target (G/P, method or PI) and thresholds; semaphores attach to goals/practices, methods and PIs (0..1 each) and are aggregated per project, department and organisation.] (A data-structure sketch follows below.)
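As a hedged illustration, the Semaphore element could be represented as the following Python data structure; the field names follow the slide, while the types and the Colour enumeration are assumptions made for the sketch:

```python
# Sketch of the Semaphore metamodel element; types are assumed.
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional, Tuple

class Colour(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"

@dataclass
class Semaphore:
    colour: Colour
    time: date
    source: str                                       # org, dep or proj
    target: str                                       # G/P, method or PI
    value: Optional[float] = None                     # numerical value (optional)
    thresholds: Optional[Tuple[float, float]] = None  # (threshold1, threshold2)
```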
  • 15. Aggregation. [Diagram: one-to-many (1 to 1..*) aggregation hierarchy.]
  • 16. Rating and aggregation example. PI: Schedule Estimation Error (%); time: 2010. [Chart: PI values of projects P1-P5 (in departments D1-D3) and of the aggregates Org and D1-D3 plotted over time, against horizontal bands at 0%, ±threshold1 and ±threshold2.] Legend: Org – organisation; D1 – department 1; P1 – project 1. (A rating sketch follows below.)
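A minimal sketch of the rating step shown in the chart, assuming symmetric bands around 0% and invented threshold values (the slide does not give concrete numbers):

```python
# Rate a schedule estimation error (%) against two thresholds.
# The symmetric bands and the example values are assumptions.
def rate(value: float, threshold1: float, threshold2: float) -> str:
    if abs(value) <= threshold1:
        return "green"   # inside the inner band around 0%
    if abs(value) <= threshold2:
        return "yellow"  # between threshold1 and threshold2
    return "red"         # outside the outer band

for project, error in {"P1": 3.0, "P2": 12.0, "P3": 28.0}.items():
    print(project, rate(error, threshold1=5.0, threshold2=15.0))
# P1 green, P2 yellow, P3 red
```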
  • 17. Aggregation example. [Table: ratings of projects P1 and P4 (department D1) and P2 and P3 (department D2), aggregated to Org; PIs not applicable to a project are marked N/A.] PIs PI1-PI5 are aggregated into methods, and methods into specific practices: M1 (alternative to M2) = PI1 ^ PI2; M2 (alternative to M1) = PI1 ^ PI5; M3 (optional) = PI3; M4 (mandatory) = PI4; SP1 = M4; SP2 = M1 v M2; SP3 = (M1 v M2) ^ M3; SP4 = M1 ^ M4. Legend: Org – organisation; D1 – department 1; P1 – project 1; alt – alternative; opt – optional; mandat – mandatory; ^ – AND; v – OR. (A sketch of this aggregation follows below.)
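The slide's AND/OR formulas can be read as boolean aggregation; the sketch below encodes each PI's semaphore as satisfied (True) or not, which is an assumed simplification of the three-colour rating, while the formulas themselves come from the slide:

```python
# Boolean aggregation of PIs into methods (M) and specific practices (SP),
# following the formulas on the slide; the True/False encoding is assumed.
pi = {"PI1": True, "PI2": True, "PI3": False, "PI4": True, "PI5": False}

m1 = pi["PI1"] and pi["PI2"]   # M1 (alternative to M2) = PI1 ^ PI2
m2 = pi["PI1"] and pi["PI5"]   # M2 (alternative to M1) = PI1 ^ PI5
m3 = pi["PI3"]                 # M3 (optional) = PI3
m4 = pi["PI4"]                 # M4 (mandatory) = PI4

sp = {
    "SP1": m4,                 # SP1 = M4
    "SP2": m1 or m2,           # SP2 = M1 v M2
    "SP3": (m1 or m2) and m3,  # SP3 = (M1 v M2) ^ M3
    "SP4": m1 and m4,          # SP4 = M1 ^ M4
}
print(sp)  # {'SP1': True, 'SP2': True, 'SP3': False, 'SP4': True}
```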
  • 18. Research status: identified some of the problems in the CMMI; gathered performance indicators and implementation methods for a subset of practices; designed the metamodel and a preliminary version of the evaluation framework.
  • 19. Future work: analyse organisations' data (PIs, methods, relations between PIs); define the rationale for tailoring the PIs; validate the evaluation framework in organisations.
  • 20. Summary: data aggregation and its analysis are particularly important in the implementation of the generic goals (GG) and high maturity levels; performance indicators are useful to evaluate the quality of implementation; we also map the CMMI practices to methods that organisations can choose and adapt.
  • 21. Questions
  • 22. References
    [1] CMMI Product Team, "CMMI® for Development, Version 1.3," CMU/SEI-2010-TR-033, ESC-TR-2010-033, November 2010.
    [2] D. L. Gibson, D. R. Goldenson, and K. Kost, "Performance Results of CMMI®-Based Process Improvement," CMU/SEI, 2006.
    [3] N. Davis and J. Mullaney, "The Team Software Process℠ (TSP℠) in Practice: A Summary of Recent Results," CMU/SEI-2003-TR-014, ESC-TR-2003-014, 2003.
    [4] N. Davis and J. McHale, "Relating the Team Software Process℠ (TSP℠) to the Capability Maturity Model® for Software (SW-CMM®)," CMU/SEI-2002-TR-008, ESC-TR-2002-008, March 2003.
    [5] J. McHale and D. S. Wall, "Mapping TSP to CMMI," CMU/SEI-2004-TR-014, ESC-TR-2004-014, 2005.
    [6] R. Radice, "Statistical Process Control in Level 4 and Level 5 Software Organizations Worldwide," presented at the Software Technology Conference, 2000.
    [7] R. Charette et al., "Understanding the Roots of Process Performance Failure," CrossTalk: The Journal of Defense Software Engineering, pp. 18-22, 2004.
    [8] M. Schaeffer, "DoD Systems Engineering and CMMI," presented at the CMMI Technology Conference and User Group, 2004.
  • 23. References (continued)
    [9] A. Takara et al., "Problems and Pitfalls in a CMMI Level 3 to Level 4 Migration Process," presented at the Sixth International Conference on the Quality of Information and Communications Technology, 2007.
    [10] C. Hollenbach and D. Smith, "A portrait of a CMMI℠ level 4 effort," Systems Engineering, pp. 52-61, 2002.
    [11] B. Kitchenham et al., "Lessons Learnt from the Analysis of Large-scale Corporate Databases," presented at the International Conference on Software Engineering, Shanghai, 2006.
    [12] D. Breuker et al., "Reliability of software metrics tools," presented at the International Conference on Software Process and Product Measurement, Amsterdam, 2009.
    [13] M. C. P. A. Goulão, "Component-Based Software Engineering: a Quantitative Approach," doctoral thesis, Departamento de Informática, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, Lisboa, 2008.
    [14] I. Lopes Margarido et al., "What is wrong with the CMMI® High Maturity Levels?," in SEPG Europe, Porto, 2010.
    [15] M. Philips, "CMMI V1.3 Planned Improvements," presented at SEPG Europe 2010, Porto, Portugal, 2010.
  • 24. Images
    http://mitografias.files.wordpress.com/2009/08/superman-flying.jpg – adapted, 24-06-2010
    http://api.ning.com/files/hpf*xOTebDs-F23o6FETZ3j*3sNiONWjfXjTJCzprPjU5bS1WJoGgWBjMPIOiQkm3SbZ41ijncrJ4K2aT-6dM9QURwHK3led/Dissemination2.jpg – 26-06-2010
    http://blog.pmtech.com.br/wp-content/uploads/Square-Paradox.jpg – 29-04-2011
    http://www.signsexpressshop.co.uk/prodpics/1103.gif – 29-04-2011
    Benjamin Haas/Shutterstock, http://cynthiayildirim.posterous.com/how-can-we-measure-the-size-of-the-universe – 29-04-2011
    http://ryanstephensmarketing.com/blog/wp-content/uploads/2009/10/one_size_fits_all.JPG
    http://evolvingwe.com/wp-content/uploads/2010/11/image3.png – 29-04-2011
    http://occlink.com/wp-content/uploads/process-picture.jpg
    http://www.pastinyala.com/images/customised_software_product.jpg – adapted, 29-04-2011
    http://www.smartkids.com.br/conteudo/especiais/transito/sinalizacao/semaforo.gif – 21-01-2011
    http://www.screenhog.com/sketch/LightbulbIdea.jpg – 21-04-2010
    http://igraduatedwhatnow.files.wordpress.com/2009/11/thank_you_small.jpg – 02-05-2010
    http://etablissements.ac-amiens.fr/0801372g/matieres/anglais/images/difficult.gif – 25-05-2011
    http://3.bp.blogspot.com/-_Z2dYcXxMmA/TbLat4c6i_I/AAAAAAAAAnk/KlLdgG-dgtw/s1600/whereamigoing.jpg – adapted, 25-05-2011
    http://www.articulate.com/rapid-elearning/wp-content/uploads/2008/08/summary-objectives450.gif – adapted, 25-05-2011
    http://1.bp.blogspot.com/_C3jJLFkkSKs/Rzb7NQdztyI/AAAAAAAAA1A/H8nECRh_76A/s400/ponte.JPG – 06-06-2011
    http://www.veryhappypig.com/blog/results.jpg – 06-06-2011
    http://www.stampa.unibocconi.it/immagini/LA4_economiaq20100603145905.jpg – 06-06-2011
  • 25. Interested in our research? Get involved! Participate in the surveys and/or in the validation phase; share your experience and/or opinion. Contact: Isabel Lopes Margarido, isabel.margarido@gmail.com, http://paginas.fe.up.pt/~pro09003/ [copyright and sponsor logos]
  • 26. Acronyms
    ACQ – acquisition
    C – case study
    CAR – Causal Analysis and Resolution
    CMMI® – Capability Maturity Model Integration
    dep – department
    DEV – development
    DoD – Department of Defense (United States of America)
    FEUP – Faculty of Engineering, University of Porto
    GDM – goal driven measurement
    GG – generic goal
    GP – generic practice
    G/P – goal or practice
    ISO – International Organization for Standardization
    KLOC – thousand lines of code
    MA – Measurement and Analysis
    ML – maturity level
    n or * – many
    org – organisation
    PA – Process Area
    PI – performance indicator
    PMC – Project Monitoring and Control
    PP – Project Planning
    proj – project
    SCAMPI℠ – Standard CMMI Appraisal Method for Process Improvement
    SEI – Software Engineering Institute
    SG – specific goal
    SP – specific practice
    SVC – services
    SW – software
    TSP℠ – Team Software Process
    V – version