Benchmarking is a crucial management activity that enables organizations to understand how competitive they are. Using functional size measurement methods and historical data helps organizations improve their processes and become more successful.
2. Topics
Benchmarking;
Software Project Industry;
Comparing apples to apples;
3 cases from experience:
Reality check proposal;
Competitiveness analysis;
Supplier performance measurement.
Other use of the ISBSG data;
Data submission initiative.
3. Benchmarking (Wikipedia)
Benchmarking is the process of comparing one's business processes
and performance metrics to industry bests or best practices from other
industries.
Benchmarking is used to measure performance using a
specific indicator (cost per unit of measure, productivity per unit of
measure, cycle time of x per unit of measure or defects per unit of
measure) resulting in a metric of performance that is then compared
to others.
This then allows organizations to develop plans on how to make
improvements or adapt specific best practices, usually with the aim of
increasing some aspect of performance. Benchmarking may be a
one-off event, but is often treated as a continuous process in which
organizations continually seek to improve their practices.
4. Where are we now?
“Even the most detailed navigation map of an area is
useless if you don’t know where you are”
5. Informed decisions
Senior management of IT departments/organizations need
to make decisions based on
‘where they are’ and ‘where they want to go’.
Benchmarking is about determining ‘where you are’
compared to relevant peers, in order to make informed
decisions.
But how to measure and determine where you are?
6. Software project industry
Low ‘performance metrics’ maturity:
Few performance measurement processes implemented;
Few Benchmarking processes implemented.
Most organizations don’t know how good or how bad they
are in delivering or maintaining software.
These organizations are not able to assess their
competitive position, nor able to make informed strategic
decisions to improve their competitive position.
7. But…
Best in Class organizations deliver software up to 30 times
more productively than Worst in Class organizations
High Productivity, High Quality;
More functionality for the users against lower costs – value;
Shorter Time to Market – competitive advantage!
Worst in Class organizations will find themselves in trouble in
an increasingly competitive market
Outperformed by competition;
Internal IT departments get outsourced;
Commercial software houses fail to win new contracts.
Important to know where you stand!
Benchmarking is essential!
8. Difficulty – low industry maturity
How to measure metrics like productivity, quality,
time-to-market in such a way that a meaningful comparison
is possible?
Comparing apples to apples
10. Functional Size Measurement
Function Point Analysis (NESMA, IFPUG or COSMIC)
Measure the functional user requirements – size in function points;
ISO standards – objective, independent, verifiable, repeatable;
Strong relation between functional size and project effort needed.
What to do with the results?
Project effort/duration/cost estimation;
Benchmarking/performance measurement;
Use in Request for Proposal management (answer price/FP questions).
What about historical data?
Company data (preferably for estimation);
Industry data (necessary for external benchmarking).
11. Historical data: ISBSG
repositories
International Software Benchmarking Standards Group;
Independent and not-for-profit;
Maintains and grows two repositories of software data:
New development projects and enhancements (> 7000 projects);
Maintenance and support (> 1100 applications).
Everybody can submit project data
Data Collection Questionnaire (DCQ) on the site;
Anonymous;
Free benchmark report in return;
New rewards to be added soon!
12. 3 Cases from my experience
Case 1: Telecom project reality check on the expert
estimate
Case 2: Assessing the competitive position of an
organization
Case 3: Supplier Performance Measurement
13. Case 1
A telecom company wished to develop a new Java
system for the maintenance of subscription types;
A team of experts studied the requirements documents
and filled in the WBS-based estimation calculation
(bottom-up estimate);
They decided that an estimate of 5,500 hours and a
duration of 6 months should be feasible;
The project manager decided not to take the experts’
word on trust alone and wished to carry out a reality
check.
14. ISBSG Reality Check: Effort
An estimated FPA comes up with the expected size:
Min: 550 FP, likely: 850 FP, max: 1,300 FP.
Selecting the most relevant projects in the ISBSG D&E
repository shows the following results (N = 89):

PDR (h/FP): Min. 3.2 | P10 4.3 | P25 6.2 | Median 8.9 | P75 12.9 | P90 19.8 | Max. 34.2

Expected effort (hours) = functional size x PDR:

                    550 FP   850 FP   1,300 FP
P25 (6.2 h/FP)      3,410    5,270    8,060
Median (8.9 h/FP)   4,895    7,565    11,570
P75 (12.9 h/FP)     7,095    10,965   16,770

5,500 hours seems optimistic.
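A minimal sketch of this cross-check (PDR percentiles and size scenarios taken from the tables above; variable names are my own):

```python
# Reality check: expected effort (hours) = functional size (FP) x PDR (h/FP).
# PDR percentiles from the selected ISBSG D&E projects (N = 89).
pdr = {"P25": 6.2, "Median": 8.9, "P75": 12.9}
sizes = {"min": 550, "likely": 850, "max": 1300}

for label, rate in pdr.items():
    effort = {name: round(fp * rate) for name, fp in sizes.items()}
    print(label, effort)

# The 5,500-hour expert estimate sits between the P25 scenario (5,270 h)
# and the median scenario (7,565 h) for the likely size of 850 FP,
# so it looks optimistic rather than realistic.
```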
15. ISBSG Reality Check: Duration
The same analysis is possible;
Also, formulas have been published in the Practical Project
Estimation book;
For instance, table C-2.2 (project duration, estimated from software size only):

Functional size: 550 FP
C (from table): 0.507
E1 (from table): 0.429
Duration = C * Size^E1 = 7.6 elapsed months

                   550 FP   850 FP   1,300 FP
Duration (months)  7.6      9.2      11.0
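The duration formula can be checked in a few lines (C and E1 as read from table C-2.2 above):

```python
# Project duration estimated from software size only:
# Duration (elapsed months) = C * Size^E1
# (Practical Project Estimation, table C-2.2)
C, E1 = 0.507, 0.429

def duration_months(size_fp: float) -> float:
    return C * size_fp ** E1

for size in (550, 850, 1300):
    print(f"{size} FP -> {duration_months(size):.1f} elapsed months")
```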
16. Result
The expert estimate was assessed as optimistic;
Adjusted estimate:
Effort: 8,000 hours;
Duration: 10 months.
This turned out to be quite accurate!
The project manager now always carries out a reality
check and is ‘spreading the word’.
17. Case 2
Senior management of a software company wondered how
competitive they were when it comes to productivity.
Many bids for projects were lost and they wished to improve,
especially their Microsoft .Net department.
Analysis of the bids by department showed the following figures:

Nr. of bids: 23
Average PDR in bid: 16.3 h/FP
Average size: 230 FP
Average team size: 6 FTE

ISBSG data analysis, PDR (h/FP), N = 35:
Min. 3.2 | P10 3.8 | P25 5.9 | Median 7.6 | P75 12.9 | P90 18.9 | Max. 34.2
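To see how far the bids sit from the market, the 16.3 h/FP bid average can be placed against the ISBSG percentiles. A rough sketch, assuming linear interpolation between the published percentile points (the helper function is my own):

```python
# Published ISBSG PDR percentiles as (percentile, h/FP) pairs, N = 35.
points = [(0, 3.2), (10, 3.8), (25, 5.9), (50, 7.6),
          (75, 12.9), (90, 18.9), (100, 34.2)]

def approx_percentile(pdr: float) -> float:
    """Linearly interpolate the percentile of a PDR value."""
    for (p_lo, v_lo), (p_hi, v_hi) in zip(points, points[1:]):
        if v_lo <= pdr <= v_hi:
            return p_lo + (p_hi - p_lo) * (pdr - v_lo) / (v_hi - v_lo)
    return 100.0

# The 16.3 h/FP bid average lands between P75 (12.9) and P90 (18.9),
# i.e. roughly the 80th-85th percentile: far above the 7.6 h/FP median,
# so the department was bidding well below market-average productivity.
print(approx_percentile(16.3))
```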
18. Case 2
Analysis of the bid phase showed a number of issues:
Estimates were extremely pessimistic due to severe
penalties in case of overruns;
In a number of stages, risk surcharges were added;
They wished to work in a fixed team of 6 FTE, but ISBSG data
shows that the project size was usually too small for this
team size to be efficient.
Because of the knowledge that the department bids were not
market average (or better), the bid process was redesigned,
making the company more successful!
19. Case 3
An organization outsourced all of their IT work to one
supplier;
Therefore, the competition was gone and the supplier
could potentially charge whatever they wished;
The ISBSG data was used to construct the productivity
targets for the supplier;
Bonus/malus arrangements were agreed upon based on
these targets.
20. Case 3: Supplier targets
[Charts: monthly actuals vs. targets, January–October 2013: PDR (h/FP) vs. target PDR, PCR (EUR/FP) vs. target PCR, and defects per 1,000 FP vs. target.]

The supplier is measured continuously
and has yet to meet its targets for the
first time!
The organization is happy that the
trends show improvement and they feel
in control.
21. Other uses for ISBSG data
Vendor selection, based on productivity, speed or quality
metrics, compared to the industry.
Definition of SLAs (or other KPIs) based on
industry average performance.
Establish a baseline from which to measure future
improvement.
Explain to the client/business that a project was carried out
in a ‘better-than-average’ way, while the client may perceive
otherwise.
22. Analysis of the data
Analyze the difference in productivity or quality
between two (or more) types of projects:
Traditional vs. Agile;
Outsourced vs. In-house;
Government vs. Non-government;
One site, multi site;
Reuse vs. no reuse;
Etcetera.
ISBSG Special Analysis reports
free of charge for Nesma members
23. Special reports
Impact of Software Size on Productivity;
Government and Non-Government Software Project
Performance;
ISBSG Software Industry Performance report;
ISBSG The Performance of Business Application, Real-Time
and Component Software Projects;
Estimates – How accurate are they?
Planning Projects – Role Percentages;
Team size impact on productivity;
Manage your M&S environment – what to expect?
Many more.
25. Country Report
- 12 countries with >40 projects submitted:
- Australia
- Brazil
- Canada
- Denmark
- Finland
- France
- India
- Italy
- Japan
- Netherlands
- UK
- USA
26. IFPUG / NESMA FPA data
Country             N (max)  Size (FP)  Effort (h)  Duration (months)
Australia           624      140        2,054       6.3
Brazil              74       253        2,047       7
Canada              94       244        3,207       8
Denmark             167      253        3,476       10
France              464      145        1,843       7
India               129      283        2,794       6
Italy               18       247        3,706       9
Japan               777      280        2,108       4.2
Netherlands IFPUG   26       326        3,988       6
Netherlands NESMA   153      192        1,576       3.5
UK                  42       268        1,932       5
United States       1,435    100        865         3.9
Total               4,003
Median                       250        2,081       6.15
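The per-country charts that follow are derived from size, effort, and duration figures like those above. A sketch of the derivations, using the Netherlands NESMA row as an example (note the actual charts are computed per project, not from row medians, and manpower delivery rate additionally divides by team size, which is not in the table):

```python
# Metrics behind the following charts, illustrated with the
# Netherlands NESMA medians: 192 FP, 1,576 hours, 3.5 months.
size_fp, effort_h, duration_m = 192, 1576, 3.5

pdr = effort_h / size_fp        # PDR: hours per FP (lower is better)
speed = size_fp / duration_m    # delivery speed: FP per calendar month

print(f"PDR = {pdr:.1f} h/FP, speed = {speed:.1f} FP/month")
```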
28. PDR (h/FP) per country
[Chart: PDR and median PDR per country, in hours per FP.]
29. Delivery Speed per country
[Chart: delivery speed and median speed per country, in FP per calendar month.]
30. Manpower Delivery Rate per country
[Chart: manpower delivery rate and median manpower DR per country, in FP per month per person.]
31. Defect density per country
[Chart: defect density and median defect density per country, in defects delivered per 1,000 FP.]
32. We need data!
Everybody wants to use data
But nobody wants to submit data… Why not?
Is it hard?
Is there a risk?
Is the reward not big enough?
Does it take too much time?
Are there any factors preventing you? Let’s discuss!
WWW.ISBSG.ORG
33. Data collection initiatives
Now: COSMIC initiative!!
Concise DCQ, rewards for submitting data¹:
Free benchmark report;
Free report: The performance of business application,
real-time and component software projects (March
2012);
Amazon coupons up to 100 USD;
ISBSG coupons up to 140 USD;
20 ISBSG portal credits.
Anonymity guaranteed!
1: check the Eligibility criteria at the ISBSG booth
34. IWSM-Mensura Event offer
IWSM participants get a
20% discount. Use the code
at check-out.
Please check out the ISBSG
booth here at the IWSM!
NB: Nesma members can
download the ISBSG special
analysis reports for free.