Software Estimation - Better Information, Better Decisions

This presentation highlights what measure may be missing from a manager's toolbox. Once we establish what measures are necessary, we will learn more about the missing measure and how to apply it to manage projects, manage performance, and even manage the customer.

  • Speaker note: Estimating is part of the decision-making process
  • Transcript

    • 1. Software Estimation: Better Information, Better Decisions. David Herron, d.herron@davidconsultinggroup.com. ©2013 David Consulting Group
    • 2. Decisions, Decisions
      - What decisions need to be made?
      - Who makes those decisions?
      - What information is available?
      - How effectively do we make decisions?
      - What is the impact of poorly made decisions?
    • 3. Business and Technical Decisions (diagram): strategic positioning (business and technical) drives goals such as continuous process improvement, delivering value, improving productivity, satisfying the customer, reducing time to market, improving competitive position, increasing quality, lowering costs, increasing market share, improving margins, increasing revenues, and improving shareholder value.
    • 4. Establish an Information Framework (dashboard graphic): an enterprise dashboard for executive management that combines business decision measures (milestones with baseline, plan, actual, and percent variance), performance measures (cumulative planned vs. actual effort against an earned-value baseline, project defect status, requirements growth and stability), process measures (a measurement baseline repository with per-project profile scores for management, requirements, design, build, test, and environment), and project measures (a historical project data repository).
    • 5. Why Are We Estimating? (survey chart): planning, budgeting, managing projects, allocating resources, managing expectations, and project portfolio management.
    • 6. Decisions Relevant to Project Managers
      - Shifting priorities: scope, schedule
      - Managing expectations: customer, senior management
      - Lack of performance: productivity, skill levels
      - Controlling costs
    • 7. Estimating Best Practices. The Software Engineering Institute's (SEI) requirements for good estimating:
      - Corporate historical database
      - Structured processes for estimating product size and reuse
      - Mechanisms for extrapolating benchmark characteristics of past projects
      - Audit trails
      - Integrity in dealing with dictated costs and schedules
      - Data collection and feedback processes that foster correct data interpretation
    • 8. Basis of Estimates (survey chart): prior experience, historical data, commercial databases, industry benchmark data, other.
    • 9. Establishing a Level of Performance (diagram): collect quantitative data (size, effort, duration, cost, quality) and qualitative data (process, methods, skills, tools, management); analysis of both yields measured performance and capability profiles, which together establish the baseline performance.
    • 10. Developing a Performance Profile (diagram): each product deliverable is characterized by performance indicators (time to market, level of effort, quality/defects) and risk factors (management, definition, design, build, test, environment), producing delivery profiles with proficiencies and inadequacies. Example delivery rates by size:
      - 21 FP: 19 FP/PM
      - 36 FP: 13 FP/PM
      - 58 FP: 16 FP/PM
      - 110 FP: 10 FP/PM
      - 550 FP: 5 FP/PM
    • 11. Quantitative Performance Evaluation. Quantitative assessment: perform functional sizing on all selected projects; collect data on project level of effort, cost, duration, and quality; calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered. Measured baseline productivity results:
      - Average project size: 133 FP
      - Average FP per staff month: 10.7
      - Average time to market: 6.9 months
      - Average cost per FP: $939
      - Delivered defects per FP: 0.0301
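A minimal sketch (Python) of how the baseline figures above can be computed from per-project size, effort, cost, duration, and defect data; the field names and the sample project records are illustrative assumptions, not values from the deck.

```python
from dataclasses import dataclass

@dataclass
class Project:
    size_fp: float          # functional size in function points
    effort_sm: float        # effort in staff months
    cost: float             # total project cost
    duration_months: float  # time to market
    delivered_defects: int

def baseline(projects: list[Project]) -> dict[str, float]:
    """Average productivity figures across a set of completed projects."""
    n = len(projects)
    return {
        "avg_size_fp": sum(p.size_fp for p in projects) / n,
        "avg_fp_per_sm": sum(p.size_fp / p.effort_sm for p in projects) / n,
        "avg_time_to_market": sum(p.duration_months for p in projects) / n,
        "avg_cost_per_fp": sum(p.cost / p.size_fp for p in projects) / n,
        "delivered_defects_per_fp": sum(p.delivered_defects for p in projects)
                                    / sum(p.size_fp for p in projects),
    }

# Hypothetical sample data, just to show the shape of the calculation.
print(baseline([Project(120, 11, 105_000, 6, 4), Project(146, 14, 150_000, 8, 5)]))
```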
    • 12. Create a Profile of Key Variables
      - Management: team dynamics, morale, project tracking, project planning, automation, management skills
      - Definition: clearly stated requirements, formal process, customer involvement, experience levels, business impact
      - Design: formal process, rigorous reviews, design reuse, customer involvement, experienced development staff, automation
      - Build: code reviews, source code tracking, code reuse, data administration, experienced staff, automation
      - Test: formal testing methods, test plans, staff testing experience, effective test tools, customer involvement
      - Environment: new technology, automated process, adequate training, organizational dynamics, certification
    • 13. Qualitative Performance Evaluation. Qualitative assessment: conduct interviews with members of each project team, collect project profile information, and develop performance profiles to display strengths and weaknesses among the selected projects. Example profile scores:
        Project Name          Score  Mgmt   Defn   Design  Build  Test   Envmt
        Accounts Payable       55.3  47.73  82.05  50.00   46.15  43.75  50.00
        Priority One           27.6  50.00  48.72  11.36   38.46   0.00  42.31
        HR Enhancements        32.3  29.55  48.72   0.00   42.31  37.50  42.31
        Client Accounts        29.5  31.82  43.59   0.00   30.77  37.50  42.31
        ABC Release            44.1  31.82  53.85  34.09   38.46  53.13  42.31
        Screen Redesign        17.0  22.73  43.59   0.00   15.38   0.00  30.77
        Customer Web           40.2  45.45  23.08  38.64   53.85  50.00  34.62
        Whole Life             29.2  56.82  28.21  22.73   26.92  18.75  53.85
        Regional - East        22.7  36.36  43.59   0.00   30.77   9.38  30.77
        Regional - West        17.6  43.18  23.08   0.00   26.92   9.38  26.92
        Cashflow               40.6  56.82  71.79   0.00   38.46  43.75  38.46
        Credit Automation      23.5  29.55  48.72   0.00   38.46   6.25  26.92
        NISE                   49.0  38.64  56.41  52.27   30.77  53.13  53.85
        Help Desk Automation   49.3  54.55  74.36  20.45   53.85  50.00  38.46
        Formula One Upgrade    22.8  31.82  38.46   0.00   11.54  25.00  46.15
    • 14. Overall Information Framework (dashboard graphic): the same enterprise dashboard as slide 4 (business decision measures, performance measures, process measures, and project measures), now fed by end-user project estimates alongside the historical project data.
    • 15. Benefits of Good Estimating
      - Reduce risk
      - Reduce costs
      - Gain credibility
      - Manage expectations
      - Resource capacity planning
      - Improve decision-making capability
    • 16. Barriers to Successful Estimating (survey chart): availability of data, lack of skills, no documented procedure, lack of automation, lack of senior management support, estimating not identified as a problem, other, no perceived value.
    • 17. Basic Estimating Model
      - Quantify the size
      - Assess the complexity
      - Understand the capacity to deliver
      (Diagram: requirement -> definition (project size and complexity) -> capability (capacity to deliver) -> estimate (schedule, costs, effort).)
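As a rough illustration of the model on this slide, the sketch below (Python) takes a quantified size, an assessed complexity, and the capacity to deliver, and returns effort, schedule, and cost. The complexity factor, team size, and cost rate are illustrative assumptions, not values from the deck.

```python
def basic_estimate(size_fp: float,
                   complexity_factor: float,        # 1.0 = nominal; >1.0 = harder than nominal (assumption)
                   delivery_rate_fp_per_pm: float,  # capacity to deliver, in FP per person-month
                   team_size: int,
                   cost_per_pm: float) -> dict[str, float]:
    """Turn size, complexity, and delivery capacity into effort, schedule, and cost."""
    effort_pm = (size_fp * complexity_factor) / delivery_rate_fp_per_pm
    return {
        "effort_person_months": effort_pm,
        "schedule_months": effort_pm / team_size,   # naive even spread over the team (assumption)
        "cost": effort_pm * cost_per_pm,
    }

# Illustrative inputs only.
print(basic_estimate(size_fp=58, complexity_factor=1.0,
                     delivery_rate_fp_per_pm=16, team_size=2, cost_per_pm=10_000))
```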
    • 18. Why Is Sizing Important?
      - Finding: nine out of ten projects that fail have not been properly sized.
      - Consider: when you build a house, you specify all the functions and features you want; these are your requirements. The builder then generates an estimate based on the size (square footage) of your requirements.
      - Size is the key to effectively managing software projects.
    • 19. Sizing Options (diagram): sizing measures range from organizational, internally defined units (modules, lines of code, hours, days, test cases, story points, use case points) to industry-defined, externally comparable units (COSMIC, NESMA FP, IFPUG Function Points, Mark II). Moving along that spectrum trades ease of use and fewer rules for more rules, greater consistency and accuracy, and more power; the easier-to-learn measures tend to be less accurate, the harder-to-learn ones more accurate.
    • 20. Why Function Points? Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.
      - Consistent method
      - Easy to learn
      - Available early in the lifecycle
      - Acceptable level of accuracy
      - Meaningful internally and externally
      - Function point counts have replaced lines-of-code counts as a sizing metric that can be used consistently and with a high degree of accuracy
    • 21. The Function Point Methodology. The software deliverable is sized based upon the functionality delivered. Five key components are identified based on the logical user view:
      - Inputs
      - Outputs
      - Inquiries
      - Data stores
      - Interface files
      (Diagram: an application that receives inputs and inquiries, produces outputs, maintains data stores, and reads interface files.)
    • 22. Simplifying the Methodology. The formal process: 1. identify components, 2. assess complexity, 3. apply weightings, 4. compute function points. Weighting worksheet (count x weight per complexity level):
        Component     Low    Avg    High
        Data Stores   x 7    x 10   x 15
        Interfaces    x 5    x 7    x 10
        Inputs        x 3    x 4    x 6
        Outputs       x 4    x 5    x 7
        Inquiries     x 3    x 4    x 6
        Total function points = sum of all weighted counts
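A minimal sketch (Python) of the worksheet above: counts of each component at each complexity level are multiplied by the slide's weights and summed to give the total function point count. The example counts at the bottom are hypothetical.

```python
WEIGHTS = {                      # weights from the slide: (Low, Avg, High)
    "data_stores": (7, 10, 15),
    "interfaces":  (5, 7, 10),
    "inputs":      (3, 4, 6),
    "outputs":     (4, 5, 7),
    "inquiries":   (3, 4, 6),
}
LEVELS = {"low": 0, "avg": 1, "high": 2}

def function_points(counts: dict[str, dict[str, int]]) -> int:
    """counts maps component type -> {complexity level -> number of components}."""
    total = 0
    for component, by_level in counts.items():
        for level, n in by_level.items():
            total += n * WEIGHTS[component][LEVELS[level]]
    return total

# Hypothetical example: 2 average data stores, 1 low input, 3 average inputs.
print(function_points({
    "data_stores": {"avg": 2},
    "inputs": {"low": 1, "avg": 3},
}))  # 2*10 + 1*3 + 3*4 = 35
```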
    • 23. Common Criticisms of Function Points
      - FP methodology terms are confusing
      - Too long to learn; need an expert
      - Need too much detailed data
      - Does not reflect the complexity of the application
      - Takes too much time
      - We tried it before
    • 24. Simplifying the Methodology: assume complexity to be average, so each component count is multiplied only by the middle weight (data stores x 10, interfaces x 7, inputs x 4, outputs x 5, inquiries x 4).
    • 25. Exercise: Identify the Functionality (diagram): an Accounts Payable application. Users enter purchase order adds/changes, payments, and invoices (inputs); the Purchase Order System supplies purchase order info (interface file); users query payment status (inquiry); the application maintains payments, invoices, and vendor data (data stores) and reports paid invoices to the user (output).
    • 26. Determine the Functional Size. The FP Lite (TM) process: 1. identify components, 2. assess complexity, 3. apply weightings, 4. compute function points. For the Accounts Payable exercise, with all components rated average:
        Component     Count  Weight  Total
        Data Stores     3    x 10     30
        Interfaces      1    x 7       7
        Inputs          3    x 4      12
        Outputs         1    x 5       5
        Inquiries       1    x 4       4
        Function Point Size           58
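The FP Lite shortcut reduces to multiplying each component count by the average weight only. A minimal sketch (Python) reproducing the slide's Accounts Payable result of 58 FP:

```python
AVG_WEIGHTS = {"data_stores": 10, "interfaces": 7, "inputs": 4, "outputs": 5, "inquiries": 4}

def fp_lite(counts: dict[str, int]) -> int:
    """Unadjusted function point size assuming average complexity throughout."""
    return sum(n * AVG_WEIGHTS[kind] for kind, n in counts.items())

accounts_payable = {"data_stores": 3, "interfaces": 1, "inputs": 3, "outputs": 1, "inquiries": 1}
print(fp_lite(accounts_payable))  # 30 + 7 + 12 + 5 + 4 = 58 FP, matching the slide
```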
    • 27. Estimating Techniques
      - Manual, using organizational baseline data
      - Manual, using standard industry data
      - Automated, using a commercial estimating software package
      (Diagram: project size and complexity plus the capacity to deliver are the inputs to the estimating model; the estimate is the output.)
    • 28. Estimating Using Delivery Rates (diagram): the requirement is sized (definition: project functional size and complexity), a rate of delivery is selected from the matching profile (capability), and the estimate of schedule, costs, and effort follows.
    • 29. A Comprehensive Measurement of Capability (diagram): collect quantitative data (size, effort, duration, defects) and qualitative data (process, methods, skills, tools, management); analysis yields measured performance and capability profiles; combined with benchmark data, the results produce delivery rates and best-practice actions.
    • 30. Developing a Performance Profile (diagram, repeated from slide 10): performance indicators here are duration (months), cost (effort), and quality (defects); the rate-of-delivery profile by size is the same (21 FP: 19 FP/PM, 36 FP: 13 FP/PM, 58 FP: 16 FP/PM, 110 FP: 10 FP/PM, 550 FP: 5 FP/PM).
    • 31. Estimating Using Delivery Rates (worked example): the project's functional size is 58 FP and the matching profile's rate of delivery is 16 FP per person-month, so the effort estimate is 58 / 16 ≈ 3.6 person-months; schedule and cost follow from that effort.
    • 32. Estimating Techniques (continued): manual, using organizational baseline data; manual, using standard industry data. (Same estimating-model diagram: size and complexity plus capacity to deliver in, estimate out.)
    • 33. Industry Data Reveals Best Practices (diagram): research gathers measures (software size, level of effort, time to market, delivered defects, cost) and characteristics (skill levels, automation, process, management, user involvement, environment); analysis relates performance levels to profiles; the results correlate performance levels to characteristics, substantiate the impact of those characteristics, and identify best practices.
    • 34. Delivery Rates. Productivity in function points delivered per person month (one person month = 130 hours), by application release category and size range (FP):
        Category                  1-150  151-300  301-500  501-750  751+
        New Dev Mainframe          12.2    10.6      9.3      8.1    6.7
        New Dev C-S                17.0    15.9     13.8     11.8    9.9
        Enh Mainframe Internal     15.7    14.7     12.3     10.8    8.8
        Enh Mainframe Package      16.3    17.5     14.7     12.9   10.1
        Enh C-S Internal           16.6    15.9     13.3     11.4    9.3
      Delivery cycle time in calendar months to deliver a project within the specified function point range (work stoppage excluded):
        Category                  1-150  151-300  301-500  501-750  751+
        New Dev Mainframe           4.7     6.3      8.3     11.9   14.8
        New Dev C-S                 3.8     6.3      8.7     10.4   12.7
        Enh Mainframe Internal      3.9     7.0      9.6     12.5   16.6
        Enh Mainframe Package       3.8     6.6      8.6     11.5   16.1
        Enh C-S Internal            3.8     6.8      9.2     12.4   16.4
    • 35. Estimating Using Delivery Rates (worked example with industry data): using the industry rate of 15.7 FP per person-month (mainframe internal enhancement, 1-150 FP range), the 58 FP project is estimated at 58 / 15.7 ≈ 3.7 person-months.
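A minimal sketch (Python) of applying the industry table from slide 34: look up the delivery rate for the project's release category and size range, then divide the functional size by that rate, reproducing the 3.7 person-month figure above. The bin boundaries follow the table's column headings; the values are transcribed from the slide.

```python
import bisect

# Upper bounds of the size bins on slide 34: 1-150, 151-300, 301-500, 501-750, 751+
BIN_UPPER = [150, 300, 500, 750]

PRODUCTIVITY_FP_PER_PM = {   # values transcribed from slide 34
    "New Dev Mainframe":      [12.2, 10.6, 9.3, 8.1, 6.7],
    "New Dev C-S":            [17.0, 15.9, 13.8, 11.8, 9.9],
    "Enh Mainframe Internal": [15.7, 14.7, 12.3, 10.8, 8.8],
    "Enh Mainframe Package":  [16.3, 17.5, 14.7, 12.9, 10.1],
    "Enh C-S Internal":       [16.6, 15.9, 13.3, 11.4, 9.3],
}

def delivery_rate(category: str, size_fp: float) -> float:
    """Industry delivery rate for a release category and functional size."""
    column = bisect.bisect_left(BIN_UPPER, size_fp)   # index of the size bin
    return PRODUCTIVITY_FP_PER_PM[category][column]

rate = delivery_rate("Enh Mainframe Internal", 58)    # 15.7 FP/PM (1-150 bin)
print(round(58 / rate, 1))                            # ~3.7 person-months, as on the slide
```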
    • 36. Estimating Techniques (continued): manual, using organizational baseline data; manual, using standard industry data; automated, using a commercial estimating software package. (Same estimating-model diagram.)
    • 37. Benefits of Automation
      - Sophisticated analysis
      - Information displays: charts, graphs, reports
      - Interfaces to PM systems and others
      - Simulation and modeling capabilities
      - Multi-variable modeling
      - Calibration based on actuals
    • 38. SEER-SEM User Interface (screenshot): inputs via the menu bar, toolbar, parameter window, and WBS window; outputs via the reports window, charts window, and views window.
    • 39. SEER: Risk-Driven Estimates. The engine for project evaluation: least, likely, and most inputs provide a range of cost and schedule outcomes.
      - SEER predicts outcomes
      - SEER uses inputs to develop probability distributions
      - The result is a probabilistic estimate
      - SEER will predict a likely range of outcomes
      - Monte Carlo provides project-level assessments of risk; confidence (probability) can be set and displayed for any estimated item
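SEER's internal models are proprietary, so the sketch below (Python) is only a generic illustration of the risk-driven idea described on this slide: least/likely/most inputs define a distribution, repeated sampling yields a range of outcomes, and confidence levels are read off the resulting percentiles. All numbers are illustrative assumptions.

```python
import random

def simulate_effort(least: float, likely: float, most: float, trials: int = 10_000) -> list[float]:
    """Sample effort (person-months) from a triangular distribution over (least, likely, most)."""
    return sorted(random.triangular(least, most, likely) for _ in range(trials))

def confidence(outcomes: list[float], level: float) -> float:
    """Effort value that 'level' (e.g. 0.8) of the simulated outcomes fall under."""
    return outcomes[int(level * (len(outcomes) - 1))]

outcomes = simulate_effort(least=3.0, likely=3.7, most=6.0)   # illustrative inputs
print(f"50% confidence: {confidence(outcomes, 0.5):.1f} person-months")
print(f"80% confidence: {confidence(outcomes, 0.8):.1f} person-months")
```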
    • 40. SEER for Software Example (screenshot): goals, risks, and outcome probabilities.
    • 41. What Can a Parametric Model Tell You? (chart): it contrasts a firm fixed price and a "feel lucky" estimate with what is likely to happen.
    • 42. SEER for Software Example (screenshot): staffing.
    • 43. Metrics & Analysis. Benchmarking: use it to substantiate performance and benchmark against others.
    • 44. The Estimating Process (diagram): the estimate is based on the best available information; a poor requirements document will result in a poor estimate. Flow: size the requirement, select the matching profile, generate the estimate and what-if analysis, then establish actuals. The measurement specialist, project manager, and PM/user work from a software metrics database that feeds a plan-vs-actual report. Accurate estimating is a function of using historical data with an effective estimating process.
    • 45. What Is "Estimating on Demand"?
      - Estimating on Demand is an estimating service
      - It provides organizations with the information they need in order to make important decisions regarding the status of selected projects
      - Establish an Estimating Center of Excellence
    • 46. The On-demand Approach
      - Takes advantage of acquired knowledge
      - An available estimating resource when and where you need it
      - Similar to a Software as a Service model
      - The cost of the service is often less than 2% of the total project cost
    • 47. On-demand Process (flowchart):
      - Step 1: Project estimation specification - determine the type of project, understand the basis for the estimate, and set expectations as to the deliverables.
      - Step 2: Measure size, complexity, and risk - assess the size and the complexity, and provide input to the required project variables (staffing levels and experience, requirements stability, confidence level, target/host systems, schedule considerations, reusability, integration, labor rates, maintenance levels).
      - Step 3: Generate estimate - produce the required estimates, including level of effort, duration, risk analysis, and quality.
      - Step 4: Readout - review the estimate with the project team, make necessary adjustments, and re-estimate as necessary until the estimate is acceptable.
      - Step 5: Final estimate - generate the final estimate.
    • 48. Estimating: Which One Is Right for You?
      - Manual, using organizational baseline data: baseline and commitment, rigorous data collection, accurately reflects the organization
      - Manual, using standard industry data: FP-centric, values may not be representative, quick start-up
      - Automated, using a commercial estimating software package: investment in software, need to develop expertise, increasing accuracy with calibration
    • 49. Contact Us
      - Email: d.herron@davidconsultinggroup.com
      - Phone: 1-610-644-2856, ext. 21
      - Web: http://www.davidconsultinggroup.com
      - Social: @DavidConsultGrp, /DavidConsultGrp, /company/David-Consulting-Group
