Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


Capgemini, as part of its CMMI ML5 programme, further enhances systematic measurement programmes to monitor, control and improve process and product quality, while keeping in mind the right trade-off between the benefit and the cost of achieving this.

The presentation emphasizes the selection of the right set of X (leading) and Y (lagging) factors, distributed over cost, time and quality, and the trade-offs between them.

Every engagement has an end goal that needs to be achieved within a certain budget, with the available resources and within a given timeline. In a perfect world, an engagement manager would have as much time, money and resources as needed to achieve the highest-quality result; this is seldom the case in reality. Think of the three factors – cost, quality and time – as legs of a stool drifting away from each other.

Once the X and Y factors are identified, the engagement manager can make an informed decision when selecting the right delivery strategy.



  1. Together. Free your energies
     Implementing a level 5 metrics programme @Capgemini Netherlands
     Selecting The Right Set...
     Niteen Kumar - 26/11/2013
  2. Measurement objectives must satisfy the SMART criteria:
     SPECIFIC: is the measurement target-oriented?
     MEASURABLE: can it be measured?
     RELEVANT: what story will it tell?
     ATTAINABLE: does it cost too much?
     TIME BOUND: by when?
     Capgemini Leading and Lagging Indicators – NESMA Presentation, Niteen Kumar
  3. Approach: measurement objective, scope, leading & lagging indicators, regression equation.
     Two KPI portfolios, Application Development and Application Maintenance, each spanning Cost, Quality and Schedule.
  4. List of X factors:
     code complexity, encapsulation, program language & tools, code review checklist, coding skills and experiences with the program languages and tools used, code review skills and experiences, % of tickets having existing solution in KEDB, quality of reused source code, requirements volatility, integration test methods and tools, integration test skills and experiences with methods and tools used, quality of reused test cases, domain, requirements volatility, quality attributes, readability of documents, architecture measures, code complexity, encapsulation, requirements methods and tools, Rightshore ratio, requirements inspection checklist, high-level design methods and tools, high-level design inspection checklist, detailed design methods and tools, detailed design review/inspection checklist, program language & tools, code review checklist usage, domain experiences, requirements skills and experiences with methods and tools used, requirement inspection skills and experiences, high-level design skills and experiences with the methods and tools used, high-level design inspection skills and experiences, detailed design skills and experiences with the methods and tools used, detailed design review/inspection skills and experiences, # of CR's rolled back, coding skills, domain, architecture measures, high-level design methods and tools, high-level design skills and experiences with methods and tools used, quality of reused high-level design documents, rework effort
  5. Application Development KPI portfolio: development project indicators.
     COST
       Y factors: % Effort Variance; Contribution Margin
       X factors: Requirements Volatility; Skill Index; Reusability; Effort by SDLC Phase; Review and Rework Effort; Resource Cost; Code Complexity/Quality; Overrun/Underrun; # of times resource changed during build
     QUALITY
       Y factors: Defect Removal Efficiency; Delivered Defect Density; Cost of Quality
       X factors: Rework Effort; Test Coverage; Testing Rate; Review Effort; Skill Level; Code Complexity/Quality; Test Preparation Effort
     SCHEDULE
       Y factors: % Schedule Variance
       X factors: Resource Availability; Requirements Volatility; Skill Index; Reusability; Rework Effort; # of times resource changed during build
     The X factors influencing the outcome of each Y were identified during the workshops; they are logical in nature and may change during statistical validation.
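The slide stresses that the candidate X factors are only logical hypotheses until they survive statistical validation. A minimal sketch of that validation step, assuming hypothetical historical engagement data (the variable names and all numbers below are invented for illustration): fit an ordinary-least-squares regression of one Y factor on candidate X factors and check how much of the variance they explain.

```python
import numpy as np

# Hypothetical historical data; a real programme would pull these
# observations from the engagement measurement repository.
rework_effort = np.array([10.0, 25.0, 5.0, 40.0, 18.0, 30.0])  # X: rework hours
skill_index = np.array([0.9, 0.6, 0.95, 0.5, 0.7, 0.55])       # X: team skill (0-1)
effort_var = np.array([2.0, 8.5, 1.0, 15.0, 6.0, 11.0])        # Y: % effort variance

# Design matrix with an intercept column, then ordinary least squares.
X = np.column_stack([np.ones_like(rework_effort), rework_effort, skill_index])
coeffs, _, _, _ = np.linalg.lstsq(X, effort_var, rcond=None)

# R^2 measures how much of the variance in Y the chosen X factors explain;
# factors with a negligible contribution would be dropped during validation.
predicted = X @ coeffs
ss_res = np.sum((effort_var - predicted) ** 2)
ss_tot = np.sum((effort_var - effort_var.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"intercept={coeffs[0]:.2f}, rework={coeffs[1]:.3f}, skill={coeffs[2]:.2f}")
print(f"R^2 = {r_squared:.3f}")
```

In practice each candidate X factor would also be tested for statistical significance before it is kept in the regression equation.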
  6. Application Development KPI portfolio: sprint-level measurement example (the original table did not survive extraction). Recoverable structure: per sprint and at engagement level, the sheet tracks scope (features/use cases planned, completed and accepted; estimated size in story points; user stories modified during the iteration), cost (planned vs. actual effort in person hours, broken down by activity: design, modelling, coding, test preparation, test execution, refactoring, scrum master, review, rework), and quality (test cases planned and executed; internal and external defects; Definition of Done performed and % of DoD steps performed; impediments reported and removed; Defect Removal Efficiency %).
  7. Application Maintenance KPI portfolio: maintenance engagement indicators.
     COST
       Y factors: % Effort Variance for KT and Release; Productivity (AET); % Backlog of Tickets; Contribution Margin
       X factors: Idle Time (under discussion); Resource Cost; Rightshore Ratio; Skill Index; Effort Spent on KT; % of tickets having an existing solution in KEDB; # of recurring modules impacted; System Downtime; % Additional Work
     QUALITY
       Y factors: % Incident Reduction; % First Time Pass; % of Systems Successfully Transitioned During KT Stage; Defect Removal Efficiency for Release; Delivered Defect Density for Release; Cost of Quality
       X factors: Rework Effort; Test Coverage; Testing Rate; Test Preparation Effort; System Downtime; # of CR's Rolled Back; % RCA Compliance; # of recurring modules impacted
     SCHEDULE
       Y factors: % Schedule Variance for KT Phase; % Response & Resolution Compliance; % Schedule Variance for Release
       X factors: Resource Availability; Skill Index; Reusability; Rework Effort; % of tickets having an existing solution in KEDB; Elapsed Time to Assign/Investigate/Test/Implement per Ticket; # of times an incident/service request is assigned within and between teams
     The X factors influencing the outcome of each Y were identified during the workshops; they are logical in nature and may change during statistical validation.
  8. Application Maintenance KPI portfolio: incident/problem management, leading & lagging indicators (the original table did not survive extraction). Recoverable structure: per reporting month and priority class (incidents P0-P5, service requests S0-S5), the sheet tracks tickets received and resolved; backlog for the current and previous month; effort spent closing tickets (person hours) and average effort per ticket; response and resolution SLA breaches and % compliance; % First Time Right; and average elapsed time to assign a ticket and to closure.
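Several of the lagging indicators in this sheet are simple ratios over ticket records. A minimal sketch, assuming a hypothetical ticket schema (the field names and sample data below are invented, not the actual repository format), of computing average effort per ticket and % SLA compliance:

```python
from dataclasses import dataclass

# Hypothetical ticket record; real data would come from the service desk tool.
@dataclass
class Ticket:
    priority: str
    effort_hours: float
    response_breached: bool
    resolution_breached: bool

tickets = [
    Ticket("P2", 3.0, False, False),
    Ticket("P2", 2.5, True, False),
    Ticket("P3", 4.5, False, True),
    Ticket("P2", 3.5, False, False),
]

def sla_compliance(ts, kind):
    """% of tickets that did NOT breach the given SLA ('response' or 'resolution')."""
    breached = sum(getattr(t, f"{kind}_breached") for t in ts)
    return 100.0 * (len(ts) - breached) / len(ts)

avg_effort = sum(t.effort_hours for t in tickets) / len(tickets)
print(f"Average effort per ticket: {avg_effort:.2f} h")
print(f"Response SLA compliance:   {sla_compliance(tickets, 'response'):.1f}%")
print(f"Resolution SLA compliance: {sla_compliance(tickets, 'resolution'):.1f}%")
```

The same aggregation would be grouped per priority class (P0-P5, S0-S5) to reproduce the per-row figures in the slide's table.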
  9. Example: Delivered Defect Density regression equation
     DDD = 3.0 - 0.05*RAE - 0.06*TPE - 0.025*CRE
     RAE = Requirement Analysis Effort
     TPE = Test Preparation Effort
     CRE = Code Review Effort
     (Example values only.)
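The example equation can serve as a small what-if tool for the engagement manager: plug in planned effort levels and compare the predicted defect density. A sketch using the slide's own illustrative coefficients (the input values below are invented for the comparison):

```python
def delivered_defect_density(rae, tpe, cre):
    """Illustrative regression model from the slide's example:
    DDD = 3.0 - 0.05*RAE - 0.06*TPE - 0.025*CRE
    (example coefficients, not a validated model)."""
    return 3.0 - 0.05 * rae - 0.06 * tpe - 0.025 * cre

# More upfront analysis, test-preparation and review effort predicts
# fewer delivered defects, which is the trade-off the manager weighs
# against schedule and cost.
low = delivered_defect_density(rae=5, tpe=5, cre=4)      # little upfront effort
high = delivered_defect_density(rae=20, tpe=15, cre=10)  # heavy upfront effort
print(f"low upfront effort:  DDD = {low:.2f}")
print(f"high upfront effort: DDD = {high:.2f}")
```

Negative coefficients on all three X factors mean every hour of upfront effort lowers the predicted delivered defect density, matching the slide's intent.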
  10. About Capgemini
      With more than 120,000 people in 40 countries, Capgemini is one of the world's foremost providers of consulting, technology and outsourcing services. The Group reported 2011 global revenues of EUR 9.7 billion. Together with its clients, Capgemini creates and delivers business and technology solutions that fit their needs and drive the results they want. A deeply multicultural organization, Capgemini has developed its own way of working, the Collaborative Business Experience™, and draws on Rightshore®, its worldwide delivery model.
      Rightshore® is a trademark belonging to Capgemini.
      www.capgemini.com
      The information contained in this presentation is proprietary. © 2012 Capgemini. All rights reserved.
