The importance of benchmarking software projects - Van Heeringen and Ogilvie
Benchmarking is a crucial management activity that enables organizations to understand how competitive they are. Using functional size measurement methods and historical data enables organizations to improve their processes and become more successful.

  1. Harold van Heeringen, ISBSG president; John Ogilvie, CEO ISBSG
  2. Topics: benchmarking; the software project industry; comparing apples to apples; 3 cases from experience (reality check of a proposal; competitiveness analysis; supplier performance measurement); other uses of the ISBSG data; data submission initiative.
  3. Benchmarking (Wikipedia): Benchmarking is the process of comparing one's business processes and performance metrics to industry bests or best practices from other industries. Benchmarking is used to measure performance using a specific indicator (cost per unit of measure, productivity per unit of measure, cycle time of x per unit of measure, or defects per unit of measure), resulting in a metric of performance that is then compared to others. This then allows organizations to develop plans on how to make improvements or adapt specific best practices, usually with the aim of increasing some aspect of performance. Benchmarking may be a one-off event, but is often treated as a continuous process in which organizations continually seek to improve their practices.
  4. Where are we now? "Even the most detailed navigation map of an area is useless if you don't know where you are."
  5. Informed decisions: senior management of IT departments and organizations need to make decisions based on 'where they are' and 'where they want to go'. Benchmarking is about determining 'where you are' compared to relevant peers, in order to make informed decisions. But how to measure and determine where you are?
  6. Software project industry: low 'performance metrics' maturity: few performance measurement processes implemented; few benchmarking processes implemented. Most organizations don't know how good or how bad they are at delivering or maintaining software. These organizations are not able to assess their competitive position, nor to make informed strategic decisions to improve it.
  7. But… Best-in-class organizations deliver software up to 30 times more productively than worst-in-class organizations: high productivity, high quality; more functionality for the users at lower cost (value); shorter time to market (competitive advantage!). Worst-in-class organizations will find themselves in trouble in an increasingly competitive market: outperformed by the competition; internal IT departments get outsourced; commercial software houses fail to win new contracts. It is important to know where you stand! Benchmarking is essential!
  8. Difficulty: low industry maturity. How to measure metrics like productivity, quality and time-to-market in such a way that a meaningful comparison is possible? Comparing apples to apples.
  9. Software is not easy to compare.
  10. Functional Size Measurement: Function Point Analysis (NESMA, IFPUG or COSMIC) measures the functional user requirements, giving a size in function points; the methods are ISO standards (objective, independent, verifiable, repeatable); there is a strong relation between functional size and the project effort needed. What to do with the results? Project effort/duration/cost estimation; benchmarking/performance measurement; use in Request for Proposal management (answering price/FP questions). What about historical data? Company data (preferred for estimation); industry data (necessary for external benchmarking).
  11. Historical data: the ISBSG repositories. The International Software Benchmarking Standards Group is independent and not-for-profit, and maintains and grows two repositories of software data: new development projects and enhancements (> 7000 projects), and maintenance and support (> 1100 applications). Everybody can submit project data: DCQ on the site; anonymous; free benchmark report in return; new rewards to be added soon!
  12. 3 cases from my experience. Case 1: telecom project reality check on the expert estimate. Case 2: assessing the competitive position of an organization. Case 3: supplier performance measurement.
  13. Case 1: A telecom company wished to develop a new Java system for the maintenance of subscription types. A team of experts studied the requirements documents and filled in a WBS-based estimation calculation (bottom-up estimate). They decided that an estimate of 5,500 hours and a duration of 6 months should be feasible. The project manager decided not to take the experts' word at face value and wished to carry out a reality check.
  14. ISBSG Reality Check: Effort. An estimated FPA comes up with the expected size: min 550 FP, likely 850 FP, max 1,300 FP. Selecting the most relevant projects in the ISBSG D&E repository (N = 89) gives the following PDR distribution (h/FP): min 3.2; 10th percentile 4.3; 25th percentile 6.2; median 8.9; 75th percentile 12.9; 90th percentile 19.8; max 34.2. Multiplying size by PDR gives the expected effort in hours:

         Effort (h)           550 FP    850 FP    1,300 FP
         25th percentile       3,410     5,270       8,060
         Median                4,895     7,565      11,570
         75th percentile       7,095    10,965      16,770

     5,500 hours seems optimistic.
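A minimal sketch of this reality check in Python (the variable names and printout are my own; the PDR percentiles and size scenarios are the ones quoted on the slide):

```python
# Reality check: effort (hours) = functional size (FP) * PDR (h/FP).
# PDR percentiles are the slide's ISBSG D&E figures (N = 89).
pdr_percentiles = {"P25": 6.2, "median": 8.9, "P75": 12.9}   # hours per FP
size_scenarios = {"min": 550, "likely": 850, "max": 1300}    # function points
expert_estimate = 5500                                       # hours, bottom-up WBS estimate

for size_name, fp in size_scenarios.items():
    for pct_name, pdr in pdr_percentiles.items():
        effort = fp * pdr
        note = "  <- expert estimate falls below this" if expert_estimate < effort else ""
        print(f"{size_name:>6} ({fp:>4} FP) at {pct_name:>6}: {effort:>6.0f} h{note}")
```

Even at the minimum size of 550 FP, the 5,500-hour estimate implies 10 h/FP, only slightly worse than the median; at the likely size of 850 FP it implies about 6.5 h/FP, close to the 25th percentile, which is why the estimate was judged optimistic.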
  15. ISBSG Reality Check: Duration. The same analysis is possible; formulas have also been published in the Practical Project Estimation book. For instance, table C-2.2 estimates project duration from software size only: Duration = C * Size^E1, with C = 0.507 and E1 = 0.429 from the table. For a functional size of 550 FP this gives 7.6 elapsed months; for 550 / 850 / 1,300 FP the durations are 7.6 / 9.2 / 11.0 months respectively.
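The duration formula is simple enough to verify directly; a quick check of the slide's numbers, using the C and E1 values quoted from table C-2.2:

```python
# Duration estimated from software size only:
# Duration = C * Size^E1 (elapsed months), with C and E1 taken
# from table C-2.2 of the Practical Project Estimation book.
C, E1 = 0.507, 0.429

for fp in (550, 850, 1300):
    print(f"{fp:>4} FP -> {C * fp ** E1:.1f} elapsed months")
# Prints 7.6, 9.2 and 11.0 months, matching the slide.
```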
  16. Result: The expert estimate was assessed as optimistic. Adjusted estimate: effort 8,000 hours; duration 10 months. This turned out to be quite accurate! The project manager now always carries out a reality check and is 'spreading the word'.
  17. Case 2: Senior management of a software company wondered how competitive they were when it comes to productivity. Many bids for projects were lost and they wished to improve, especially in their Microsoft .NET department. Analysis of that department's bids showed the following figures: number of bids 23; average PDR in bid 16.3 h/FP; average size 230 FP; average team size 6 FTE. ISBSG data analysis of comparable projects (N = 35) shows a PDR distribution (h/FP) of: min 3.2; 10th percentile 3.8; 25th percentile 5.9; median 7.6; 75th percentile 12.9; 90th percentile 18.9; max 34.2.
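To quantify how far off-market a 16.3 h/FP average bid is, one can interpolate its rank within the quoted percentile table; the linear interpolation below is my own rough approximation, not ISBSG methodology:

```python
# Approximate percentile rank of the average bid PDR within the
# ISBSG percentile table quoted on the slide (linear interpolation).
isbsg = [(0, 3.2), (10, 3.8), (25, 5.9), (50, 7.6),
         (75, 12.9), (90, 18.9), (100, 34.2)]  # (percentile, PDR h/FP)
bid_pdr = 16.3

for (p_lo, v_lo), (p_hi, v_hi) in zip(isbsg, isbsg[1:]):
    if v_lo <= bid_pdr <= v_hi:
        rank = p_lo + (p_hi - p_lo) * (bid_pdr - v_lo) / (v_hi - v_lo)
        print(f"A bid PDR of {bid_pdr} h/FP sits near the {rank:.0f}th percentile")
        break
```

By this rough estimate the department was bidding around the 84th percentile, i.e. less productive on paper than roughly five out of six comparable projects.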
  18. Case 2: Analysis of the bid phase showed a number of issues: estimates were extremely pessimistic due to severe penalties in case of overruns; in a number of stages, risk surcharges were added; they wished to work in a fixed team of 6 FTE, but the ISBSG data shows that the project size was usually too small for this team size to be efficient. With the knowledge that the department's bids were not market average (or better), the bid process was redesigned, making the company more successful!
  19. Case 3: An organization outsourced all of their IT work to one supplier. As a result, the competition was gone and the supplier could potentially charge whatever they wished. The ISBSG data was used to construct productivity targets for the supplier, and bonus/malus arrangements were agreed upon based on these targets.
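The slides do not disclose the actual contract terms, so the sketch below is purely illustrative: a hypothetical bonus/malus payout proportional to how far the measured PDR lands from its target:

```python
# Hypothetical bonus/malus scheme tied to a PDR target.
# All rates and targets here are invented for illustration;
# the real arrangement in the case is not given on the slide.
def bonus_malus(measured_pdr: float, target_pdr: float,
                fp_delivered: float, rate_eur_per_fp: float = 50.0) -> float:
    """Positive = bonus for the supplier, negative = malus (penalty)."""
    # A lower PDR than target (fewer hours per FP) earns a bonus.
    return (target_pdr - measured_pdr) / target_pdr * rate_eur_per_fp * fp_delivered

print(bonus_malus(measured_pdr=9.5, target_pdr=9.0, fp_delivered=1000))  # malus, ~ -2778 EUR
print(bonus_malus(measured_pdr=8.2, target_pdr=9.0, fp_delivered=1000))  # bonus, ~ +4444 EUR
```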
  20. Case 3: Supplier targets. [Three monthly trend charts, January through October 2013: PDR (h/FP) vs. target PDR; PCR (EUR/FP) vs. target PCR; defects per 1,000 FP vs. target.] The supplier is measured continuously and has yet to meet its targets for the first time! The organization is happy that the trends show improvement, and they feel in control.
  21. Other uses for ISBSG data: vendor selection, based on productivity, speed or quality metrics compared to the industry; definition of SLAs (or other KPIs) based on industry-average performance; establishing a baseline from which to measure future improvement; explaining to the client/business that a project was carried out in a better-than-average way, when the client may perceive otherwise.
  22. Analysis of the data: analyze the difference in productivity or quality between two (or more) types of projects: traditional vs. agile; outsourced vs. in-house; government vs. non-government; single-site vs. multi-site; reuse vs. no reuse; etc. ISBSG special analysis reports are free of charge for Nesma members.
  23. Special reports: Impact of Software Size on Productivity; Government and Non-Government Software Project Performance; ISBSG Software Industry Performance Report; The Performance of Business Application, Real-Time and Component Software Projects; Estimates: How Accurate Are They?; Planning Projects: Role Percentages; Team Size Impact on Productivity; Manage Your M&S Environment: What to Expect?; and many more.
  24. Country report: 12 countries; > 40 projects submitted. Countries: Australia, Brazil, Canada, Denmark, Finland, France, India, Italy, Japan, Netherlands, UK, USA.
  25. IFPUG / NESMA FPA data per country (Size in FP, Effort in hours, Duration in months):

         Country               N (max)   Size   Effort   Duration
         Australia                 624    140     2054       6.3
         Brazil                     74    253     2047       7
         Canada                     94    244     3207       8
         Denmark                   167    253     3476      10
         France                    464    145     1843       7
         India                     129    283     2794       6
         Italy                      18    247     3706       9
         Japan                     777    280     2108       4.2
         Netherlands (IFPUG)        26    326     3988       6
         Netherlands (NESMA)       153    192     1576       3.5
         UK                         42    268     1932       5
         United States            1435    100      865       3.9
         Total                    4003
         Median                            250     2081       6.15
  26. IFPUG / NESMA FPA metrics per country (PDR: hours/FP; Speed: FP per calendar month; Manpower DR: FP per month per person; Defect density: defects per 1,000 FP delivered):

         Country               PDR    Speed   Manpower DR   Defect density
         Australia             14.1      25           5.4              5.9
         Brazil                15.2      32           2.8              0
         Canada                15.4      26           3.3             14.3
         Denmark               14.3      26           3.2             13.5
         France                13.2      19           2.5              0
         India                  8.6      63           6.3              0
         Italy                  9.3      32           5.4              -
         Japan                  6.9      63          13.4             17.1
         Netherlands (IFPUG)    9.6      48           9.8              2.9
         Netherlands (NESMA)    6.9      61           2.8              5.2
         UK                     4.1      50          23.6             55.4
         United States         11.6      25          18.4              0.5
         Median                10.6      32           5.4              5.2
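These country figures appear to be medians of per-project metrics rather than ratios of the medians on the previous slide (otherwise Australia's PDR would be exactly 2054/140 ≈ 14.7, not 14.1). A sketch of how such metrics would be derived from project-level records, with a made-up field layout and toy data:

```python
# Country metrics as medians of per-project ratios (toy data; the
# real ISBSG extract has its own schema and many more fields).
from statistics import median

projects = [  # (size_fp, effort_hours, duration_months)
    (140, 2054, 6.3),
    (253, 2047, 7.0),
    (244, 3207, 8.0),
]

pdr = median(effort / size for size, effort, _ in projects)      # hours per FP
speed = median(size / months for size, _, months in projects)    # FP per calendar month
print(f"median PDR: {pdr:.1f} h/FP, median speed: {speed:.1f} FP/month")
```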
  27. PDR (h/FP) per country. [Bar chart: PDR per country against the median PDR; y-axis: hours per FP, 0 to 18.]
  28. Delivery speed per country. [Bar chart: speed per country against the median speed; y-axis: FP per calendar month, 0 to 70.]
  29. Manpower delivery rate per country. [Bar chart: manpower DR per country against the median; y-axis: FP per month per person, 0 to 25.]
  30. Defect density per country. [Bar chart: defect density per country against the median; y-axis: defects delivered per 1,000 FP, 0 to 60.]
  31. We need data! Everybody wants to use data, but nobody wants to submit data… Why not? Is it hard? Is there a risk? Is the reward not big enough? Does it take too much time? Are there any factors preventing you? Let's discuss! WWW.ISBSG.ORG
  32. Data collection initiatives. Now: the COSMIC initiative! Concise DCQ, with rewards for submitting data (1): a free benchmark report; a free report, The Performance of Business Application, Real-Time and Component Software Projects (March 2012); Amazon coupons up to 100 USD; ISBSG coupons up to 140 USD; 20 ISBSG portal credits. Anonymity guaranteed! (1): check the eligibility criteria at the ISBSG booth.
  33. IWSM-Mensura event offer: IWSM participants get a 20% discount; use the code at check-out. Please check out the ISBSG booth here at the IWSM! NB: Nesma members can download the ISBSG special analysis reports for free.
  34. Thank you!
