Van Heeringen - Seminar Software Metrics Network Sweden 18-04-2013

Software project estimation: the differences between expert estimates and parametric estimates. The results of a study on the accuracy (effort, schedule and cost) of the two types of estimates are part of the presentation.


  1. Software Estimation and Performance Measurement
     Harold van Heeringen
     Sogeti Sizing, Estimating & Control (SEC)
     Sogeti Nederland B.V.
     Stockholm, April 18, 2013
     @haroldveendam
  2. Topics
     • Software projects
     • Software estimation
     • Expert vs. parametric estimates
     • Challenge for parametric estimates
     • E&PM process
     • Accuracy of the different estimation types
     • The Swedish software industry?
  3. Software projects
     • Software project industry: low maturity
       − Low estimation maturity
       − Little or no formal estimation process
       − Little or no use of historical data
     • Lots of schedule and cost overruns
       − Standish Chaos reports: most projects fail or are at least unsuccessful
     • Low customer satisfaction rates
       − In Europe: only slightly higher than the financial sector
  4. Software project estimation
     • Most projects are estimated by 'experts'
       − Bottom-up, task-by-task effort estimation
     • Usually very optimistic (>30%)
       − Experts estimate, but other people (juniors) do the job
       − Forgotten activities (e.g. test script reviews)
       − No feedback loop with past projects: experts don't learn from past estimates and actuals
       − No scenarios: duration, team size, etc.
       − Not objective, transparent, verifiable or repeatable
     • Not defensible!
       − 'Easy' for stakeholders to push back on
     • No risk assessment (no distribution of the estimate)
  5. Expert estimation
     [Chart: probability of success vs. effort for the task 'Code and unit test of module XYZ'.
      The expert estimate sits at the first possibility of success; a realistic estimate sits
      at the 50/50 median result.]
     • 0% chance of success: 16 hours
     • 10%: 18 hours
     • 50% (median): 20 hours
     • 75%: 22 hours
     • 90%: 24 hours
  6. Project estimates
     • Two types of project estimation:
       − Expert estimation
       − Parametric estimation
     • Expert
       − Knowledge and experience of experts
       − Assign effort hours to tasks (bottom-up)
       − Subjective, but always applicable
     • Parametric
       − Size measurement, historical data and tooling
       − Size measurement methods: NESMA FPA, COSMIC, IFPUG
       − Objective, but well-documented specifications required
  7. Comparing the two types
     • Expert estimates
       − Bottom-up estimation
       − Usually optimistic (up to 30% underestimation is common)
       − Forgotten activities
       − Hard to defend
       − The expert is not going to do all the work
       − The expert may not be an expert on the new project
       − Are the real experts available?
     • Parametric estimates
       − Top-down estimation
       − Estimating & Performance Measurement process needed
       − Effort = size * productivity
         ◦ Size is objectively measurable (COSMIC, FPA)
         ◦ Productivity from historical data (organization / ISBSG)
         ◦ Scenarios through tools (QSM / SEER-SEM / ISBSG)
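The top-down formula above (effort = size * productivity) can be sketched in a few lines of Python. The 6.5 hours/FP figure below is purely illustrative, not a number from the deck; in practice the rate would come from an organization's historical data or the ISBSG repository.

```python
# Minimal sketch of a parametric (top-down) estimate.
# Size is measured in function points (FP); productivity is expressed
# as hours per FP, taken from historical data (illustrative value here).

def parametric_effort(size_fp: float, hours_per_fp: float) -> float:
    """Top-down effort estimate in hours: effort = size * productivity."""
    return size_fp * hours_per_fp

# Example: a 367 FP project (the size used in the Estimating Wizard input
# later in the deck) at a hypothetical 6.5 hours/FP:
print(parametric_effort(367, 6.5))  # 2385.5 hours
```

Because both inputs are externally verifiable (a documented size measurement and a recorded productivity rate), the resulting estimate is objective and repeatable, unlike a bottom-up expert guess.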
  8. Estimate & Performance Measurement
     [Diagram: a Plan-Do-Check-Act (Deming) cycle for estimation, reconstructed below.]
     • PLAN — Estimate (start: estimate request)
       − Size measurement (FPA), historical data, estimation tools, expert estimate
       − Results: parametric estimation, expert estimation
     • DO — Administrate (start: project start)
       − Continuous data collection: effort hours registration, defect registration,
         change measurement, project characteristics
       − Result: project data
     • CHECK — Evaluate (start: project completed)
       − Data collection and administration: collect project data, measure size,
         benchmark the project
       − Results: growing project DB, performance measurement, updated expert knowledge
     • ACT — Adjust & report (start: periodically)
       − Fine-tune the estimation model; analyse and report productivity
       − Results: management report, adjusted model
  9. Challenge: 'selling' parametric estimates
     • Project/bid management still believe experts more than parametrics:
       − 'More detail must mean more accurate, right?'
       − 'This project is very different from past projects'
       − 'I see that we don't get any function points for module XYZ, but we have to do
         a lot of work for that!'
       − 'I think the project is quite easy, and the parametric estimate overestimates
         the effort'
       − But what about: team size, duration, forgotten activities, past performance?
     • How can we convince project management?
  10. Software equation (Putnam)

      Size / Productivity = Effort^(1/3) * Duration^(4/3)

      [Chart: effort vs. duration tradeoff]
      • Plan A: 6 months, 4,500 hours
      • Plan B: 7 months, 2,400 hours
  11. Project at different durations
      [Chart: effort (hours) vs. duration, with size and productivity held constant]
      • Plan A: duration 6 months, effort 4,500 hours, max. team size 5.8 FTE,
        MTTD 1.764 days
      • Plan B: duration 7 months, effort 2,400 hours, max. team size 2.7 FTE,
        MTTD 2.816 days
      • Which duration do the experts have in mind?
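The tradeoff behind the two plans can be sketched numerically. With size and productivity held constant, Putnam's equation makes Effort^(1/3) * Duration^(4/3) a constant, so effort falls with the fourth power of duration. The sketch below calibrates on Plan A; the function name is mine, not from the deck.

```python
# Sketch of the Putnam effort/duration tradeoff.
# From Size/Productivity = Effort^(1/3) * Duration^(4/3), holding the
# left-hand side constant gives Effort ∝ Duration^-4.

def effort_at_duration(ref_effort: float, ref_months: float, months: float) -> float:
    """Rescale a reference plan's effort to a new duration (same size and
    productivity): effort scales with (ref_duration / duration)^4."""
    return ref_effort * (ref_months / months) ** 4

# Stretching Plan A (6 months, 4,500 hours) to 7 months:
print(round(effort_at_duration(4500, 6, 7)))  # 2429 hours
```

The result is within about 1% of Plan B's 2,400 hours, which is the point of the slide: adding one month of schedule roughly halves the effort, so an expert estimate is meaningless unless you know which duration the expert had in mind.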
  12. Sogeti SEC
      • Sizing, Estimating & Control
        − Certified (COSMIC) function point analysts
        − Metrics consultants
      • Responsible for the metrics part of a quotation:
        − Size: FPA / COSMIC
        − Estimation: SEER-SEM / QSM / Sogeti tool / ISBSG
        − Product: methodical estimation report (scenarios)
        − Pricing: EUR/FP
        − Quality: defects/FP
      • Centers of Excellence: MS .NET, Java, Oracle, mobile, Drupal, SharePoint, BI, etc.
  13. Assignment 2005
      • Build an estimation instrument
        − Save time and effort in estimating bids
        − Accurate enough to depend and rely on
        − Flexible:
          ◦ Estimate onshore / offshore and hybrids
          ◦ Calculate different test strategies
          ◦ Take complexity into account
          ◦ Implement the Deming cycle (PDCA)
      • Give scenarios for duration!
  14. First version of the Estimating Wizard (2005)
      • Try to capture the duration/effort tradeoff in a model
        − Tuned with experience data
      [Table: hours/FP at average complexity, by size band and duration (columns span
       3½ to 8 months). Each size band lists five hours/FP values, decreasing as
       duration grows:
       0-250 FP:      10.1, 8.9, 8.1, 7.7, 6.9
       250-500 FP:     9.1, 8.0, 7.3, 6.9, 6.2
       500-750 FP:     8.6, 7.6, 6.9, 6.5, 5.9
       750-1000 FP:    8.3, 7.3, 6.6, 6.3, 5.6
       1000-1250 FP:   8.1, 7.1, 6.5, 6.2, 5.5
       1250-1500 FP:   7.9, 6.9, 6.3, 6.0, 5.4]
  15. Estimating Wizard 2013 — input
      [Screenshot of the input form (model version m20v20120224); field grouping
       reconstructed:]
      • Functional design parameters: functional design: yes; overlap: yes, manually, 5;
        language: English; availability of key users: low; location: on site
      • Build and test parameters: development tool: Java; construction: 30% onshore /
        70% offshore; translation of FD required: no; system test approach: TMap Heavy;
        system test strategy: scripting and design in NL, execution in India;
        tools/methodologies: 5 (average); complexity: 6; development team: 5 (average);
        reuse: 0 (none), 5%
      • General parameters: size: 367 FP; start date: 01-05-13; project control:
        2 hrs/week; risk surcharge: 10%; warranty: 4%; organization type: banking;
        quality documentation: 6; non-functional requirements: average (0);
        scenario interval: 2.0 weeks
  16. EW output
      • Functional design — no schedule scenarios
        − Duration: 17.6 weeks; design complete 4-05-11
        − Total effort: 1,975 hours; effort per FP: 2.53
        − Effort cost: €208,531; additional cost: €14,815; total cost: €223,346
          (€286 per FP)
        − Average team size: 2.80
      (Data altered for company security reasons)
  17. EW output
      • Main build — 7 schedule scenarios (data altered for company security reasons)

        Build and test phase (start phase 18-02-11 in all scenarios)
        Duration (weeks)      20.0     22.0     24.0     26.0     28.0     30.0     32.0
        Effort (hours)       9,794    6,690    4,723    3,429    2,550    1,935    1,495
        Effort per FP        28.06    19.17    13.53     9.83     7.31     5.54     4.28
        Effort cost (€)    550,720  376,152  265,590  192,826  143,360  108,787   84,036
        Additional cost (€) 56,446   41,090   31,364   24,963   20,611   17,570   15,393
        Total cost (€)     607,167  417,242  296,954  217,789  163,972  126,357   99,428
        Cost per FP (€)      1,740    1,196      851      624      470      362      285
        Average team size    12.24     7.60     4.92     3.30     2.28     1.61     1.17

        Risk and warranty
        Risk hours             834      586      429      325      255      205      170
        Risk cost (€)       54,301   39,107   29,484   23,150   18,845   15,836   13,682
        Warranty hours         209      146      107       81       64       51       43
        Warranty cost (€)   13,575    9,777    7,371    5,788    4,711    3,959    3,420

        Total
        Duration (weeks)      27.0     29.0     31.0     33.0     35.0     37.0     39.0
        Delivery for
        acceptance         8-07-11 22-07-11  5-08-11 19-08-11  2-09-11 16-09-11 30-09-11
        Total effort (hrs)  11,470    8,055    5,892    4,469    3,501    2,825    2,340
        Effort per FP        32.87    23.08    16.88    12.80    10.03     8.09     6.71
        Total cost (€)     746,634  537,717  405,400  318,319  259,120  217,744  188,122
        Cost per FP (€)      2,139    1,541    1,162      912      742      624      539
        Average team size    10.64     6.96     4.76     3.39     2.50     1.91     1.50
        Avg hourly rate (€)     65       67       69       71       74       77       80
  18. Calibrating the model
      • Collect historical data
      • Use ISBSG data
  19. Calibrating the model (continued)
      • Collect historical data
      • Use ISBSG data
  20. Estimation accuracy
      • Study of 10 completed projects of different sizes, programming languages, etc.
        − Expert estimate
        − Estimating Wizard estimate
        − Estimates vs. actual results
      • Calculate
        − Effort accuracy (effort estimate / actual effort)
        − Duration accuracy (duration estimate / actual duration)
        − Cost accuracy (cost estimate / actual cost)
  21. Results
      • Accuracy = estimate / actual, for effort, duration and cost;
        time spent on making the estimate in hours

                            Expert estimate                Estimating Wizard
        Project   Size (FP) Effort Duration Cost   Time   Effort Duration Cost   Time
        Project 1    277    0.675  1.545   0.467    30    0.477  1.204   0.501    17
        Project 2    359    0.579  0.951   1.139    35    0.707  0.775   1.170    26
        Project 3    347    0.589  0.142   0.615    40    1.067  0.996   1.283    14
        Project 4  1,178    0.414  0.557   0.312    60    0.774  0.590   0.862    55
        Project 5    951    1.430  0.997   0.946    34    1.067  0.877   1.718    24
        Project 6    295    0.763  0.857   0.619    26    0.881  1.200   0.845     6
        Project 7    790    0.717  0.850   0.976    34    0.926  0.865   1.132    27
        Project 8    350    1.258  0.800   1.309    28    1.203  1.096   1.318    20
        Project 9    746    0.586  0.296   0.545    34    0.826  0.385   1.953    22
        Project 10 2,293    0.766  0.421   0.797    40    0.931  0.632   1.058    14
  22. Estimation accuracy results
      • Average time spent
        − Expert: 36.1 hours
        − EW (including FPA / COSMIC measurement): 22.5 hours

                            Expert estimate   Est. Wizard estimate
        Effort accuracy
          Average                0.778              0.886
          St. dev.               0.319              0.207
          Median                 0.696              0.904
        Duration accuracy
          Average                0.742              0.862
          St. dev.               0.405              0.272
          Median                 0.825              0.871
        Cost accuracy
          Average                0.772              1.184
          St. dev.               0.316              0.423
          Median                 0.708              1.151
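The summary statistics on this slide follow directly from the per-project table on slide 21. As a sketch, the expert effort-accuracy column reproduces the slide's figures (note that a sample standard deviation, not a population one, matches the reported 0.319):

```python
# Reproduce the expert effort-accuracy summary from the slide-21 data.
from statistics import mean, median, stdev

# Effort accuracy (effort estimate / actual effort), expert estimates,
# projects 1-10 from the results table:
expert_effort_accuracy = [0.675, 0.579, 0.589, 0.414, 1.430,
                          0.763, 0.717, 1.258, 0.586, 0.766]

print(round(mean(expert_effort_accuracy), 3))    # 0.778
print(round(median(expert_effort_accuracy), 3))  # 0.696
print(round(stdev(expert_effort_accuracy), 3))   # 0.319 (sample st. dev.)
```

Values below 1.0 mean the estimate was lower than the actual, so an average of 0.778 says expert effort estimates ran about 22% optimistic across the ten projects.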
  23. Results
      • Expert estimates take a lot of time too!
        − On average 60% more than parametric estimates
        − More than one expert has to read through all the documentation
        − Discussions and agreements take time
      • Parametric estimates are more accurate!
        − Effort and duration estimates on average still optimistic, but less optimistic
          than expert estimates
        − Cost estimate pessimistic, but still closer to actuals than the expert estimate
      • Experts might win the project, but the result will be overruns!
  24. Cost of high and low estimates
      [Chart: cost impact of estimating too low vs. too high]
      • Estimating too low: non-linear extra costs
        − Planning errors
        − Team enlargement is more expensive, not faster
        − Extra management attention / overhead
        − Stress: more defects, lower maintainability!
      • Estimating too high: linear extra costs
        − The extra hours will be used
  25. Conclusions of the study
      • Estimating Wizard
        − Higher effort estimation accuracy
        − Higher duration estimation accuracy
        − Higher cost estimation accuracy (although >1)
        − Fewer hours spent than on expert estimates
        − A standard WBS, historical data collection and parameter calibration improve
          the maturity of the process
      • Next steps
        − Analyse why costs are overestimated
        − Try to identify the projects where an EW estimate alone is enough
        − Use the results to convince project management and bid management to use
          parametric estimating more
  26. Do we have time for some other thoughts?
      • What about the maturity in the Swedish market?
  27. ISBSG 'Country Analysis Report'
      • ISBSG repository (>6,000 projects)
  28. ISBSG 'Country Analysis Report'
      [chart]
  29. Where are the Swedish projects?
      • Please submit project data to ISBSG!
      • Independent, anonymous
      • Free benchmark report
      • ISBSG flyer available
      • www.isbsg.org
  30. Thank you for your attention!
      Harold van Heeringen
      Senior Consultant Software Metrics / Software Cost Engineer
      Sogeti Sizing, Estimating & Control (SEC)
      Harold.van.heeringen@sogeti.nl
      @haroldveendam
      • President, ISBSG (International Software Benchmarking Standards Group,
        www.isbsg.org)
      • Board member, NESMA (Netherlands Software Metrics Association, www.nesma.nl)
      • IAC member, COSMIC (www.cosmicon.com)
