Software Estimation - Better Information, Better Decisions


This presentation highlights which measures may be missing from a manager's toolbox. Once we establish which measures are necessary, we will learn more about the missing measure and how to apply it to manage projects, manage performance, and even manage the customer.


1. Software Estimation: Better Information, Better Decisions
   David Herron | d.herron@davidconsultinggroup.com | 2013
2. Decisions, Decisions
   • What decisions need to be made?
   • Who makes those decisions?
   • What information is available?
   • How effectively do we make decisions?
   • What is the impact of poorly made decisions?
3. Business and Technical Decisions
   [Diagram: strategic positioning (business and technical) at the center, connecting continuous process improvement, delivering value, satisfying the customer, improving productivity, reducing time to market, and increasing quality to improved competitive position, increased market share, lower costs, improved margins, increased revenues, and greater shareholder value.]
4. Establish an Information Framework
   [Diagram: an enterprise dashboard supports executive management and business decisions with four panels: milestone status (baseline, plan, actual, % variance for checkpoints from charter and kickoff through deploy and close), project resource and effort status ("earned value": cumulative planned vs. actual hours), project defect status (total defects discovered vs. closed), and requirements growth and stability (added, changed, deleted, total). The performance measures flow up from a process management measurement baseline repository of enterprise and project-level historical measures (project PAL), which in turn drives process improvement and control.]
5. Why Are We Estimating?
   [Survey chart, 0-100% scale: planning, budgeting, managing projects, allocating resources, managing expectations, and project portfolio management.]
6. Decisions Relevant to Project Managers
   • Shifting priorities
     – Scope
     – Schedule
   • Managing expectations
     – Customer
     – Senior management
   • Lack of performance
     – Productivity
     – Skill levels
   • Controlling costs
7. Estimating Best Practices
   The Software Engineering Institute's (SEI) requirements for good estimating:
   • Corporate historical database
   • Structured processes for estimating product size and reuse
   • Mechanisms for extrapolating benchmark characteristics of past projects
   • Audit trails
   • Integrity in dealing with dictated costs and schedules
   • Data collection and feedback processes that foster correct data interpretation
8. Basis of Estimates
   [Survey chart, 0-100% scale: prior experience, historical data, commercial databases, industry benchmark data, and other.]
9. Establishing a Level of Performance
   [Diagram: collection gathers quantitative data (size, effort, duration, cost, quality) and qualitative data (process, methods, skills, tools, management); analysis turns these into measured performance results and capability profiles, which together form the performance baseline.]
10. Developing a Performance Profile
    [Diagram: each product deliverable is scored on performance indicators (time to market, level of effort, quality/defects) and risk factors (management, definition, design, build, test, environment) to surface proficiencies and inadequacies.]

    Delivery profiles (measured size and delivery rate):

    Profile   Size (FP)   Rate (FP/PM)
    A         21          19
    B         36          13
    C         58          16
    D         110         10
    …         550         5
11. Quantitative Performance Evaluation
    Quantitative assessment:
    • Perform functional sizing on all selected projects.
    • Collect data on project level of effort, cost, duration, and quality.
    • Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered.

    Measured results, baseline productivity:

    Average Project Size               133
    Average FP/SM                      10.7
    Average Time-to-Market (Months)    6.9
    Average Cost/FP                    $939
    Delivered Defects/FP               0.0301
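To make the calculation concrete, the sketch below averages per-project rates over a historical portfolio, as the slide describes. It is a minimal illustration in Python; the Project record and its field names are assumptions for the example, not DCG tooling.

```python
from dataclasses import dataclass

@dataclass
class Project:
    """One completed project from the baseline (hypothetical fields)."""
    size_fp: float          # functional size, in function points
    staff_months: float     # total level of effort
    cost: float             # total cost, in dollars
    duration_months: float  # elapsed time to market
    delivered_defects: int  # defects found after delivery

def baseline_productivity(projects: list[Project]) -> dict[str, float]:
    """Average the productivity rates named on the slide across projects."""
    n = len(projects)
    return {
        "avg_project_size_fp": sum(p.size_fp for p in projects) / n,
        "avg_fp_per_staff_month": sum(p.size_fp / p.staff_months for p in projects) / n,
        "avg_time_to_market_months": sum(p.duration_months for p in projects) / n,
        "avg_cost_per_fp": sum(p.cost / p.size_fp for p in projects) / n,
        "delivered_defects_per_fp": (sum(p.delivered_defects for p in projects)
                                     / sum(p.size_fp for p in projects)),
    }
```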
12. Create a Profile of Key Variables
    • Management: team dynamics, morale, project tracking, project planning, automation, management skills
    • Definition: clearly stated requirements, formal process, customer involvement, experience levels, business impact
    • Design: formal process, rigorous reviews, design reuse, customer involvement, experienced development staff, automation
    • Build: code reviews, source code tracking, code reuse, data administration, experienced staff, automation
    • Test: formal testing methods, test plans, staff testing experience, effective test tools, customer involvement
    • Environment: new technology, automated process, adequate training, organizational dynamics, certification
13. Qualitative Performance Evaluation
    Qualitative assessment:
    • Conduct interviews with members of each project team.
    • Collect project profile information.
    • Develop performance profiles to display strengths and weaknesses among the selected projects.

    Project Name           Score   Mgmt    Defn    Design  Build   Test    Envir
    Accounts Payable       55.3    47.73   82.05   50.00   46.15   43.75   50.00
    Priority One           27.6    50.00   48.72   11.36   38.46    0.00   42.31
    HR Enhancements        32.3    29.55   48.72    0.00   42.31   37.50   42.31
    Client Accounts        29.5    31.82   43.59    0.00   30.77   37.50   42.31
    ABC Release            44.1    31.82   53.85   34.09   38.46   53.13   42.31
    Screen Redesign        17.0    22.73   43.59    0.00   15.38    0.00   30.77
    Customer Web           40.2    45.45   23.08   38.64   53.85   50.00   34.62
    Whole Life             29.2    56.82   28.21   22.73   26.92   18.75   53.85
    Regional - East        22.7    36.36   43.59    0.00   30.77    9.38   30.77
    Regional - West        17.6    43.18   23.08    0.00   26.92    9.38   26.92
    Cashflow               40.6    56.82   71.79    0.00   38.46   43.75   38.46
    Credit Automation      23.5    29.55   48.72    0.00   38.46    6.25   26.92
    NISE                   49.0    38.64   56.41   52.27   30.77   53.13   53.85
    Help Desk Automation   49.3    54.55   74.36   20.45   53.85   50.00   38.46
    Formula One Upgrade    22.8    31.82   38.46    0.00   11.54   25.00   46.15
14. Overall Information Framework
    [Diagram: the enterprise dashboard from slide 4, extended so that end-user estimates feed the project-level historical measures (project PAL) alongside project data.]
15. Benefits of Good Estimating
    • Reduce risk
    • Reduce costs
    • Gain credibility
    • Manage expectations
    • Resource capacity planning
    • Improve decision-making capability
16. Barriers to Successful Estimating
    [Survey chart, 0-100% scale: availability of data, lack of skills, no documented procedure, lack of automation, lack of senior management support, estimating not identified as a problem, other, and no perceived value.]
17. Basic Estimating Model
    • Quantify the size
    • Assess the complexity
    • Understand the capacity to deliver
    [Diagram: requirements for project X (definition) are sized and assessed for complexity; combined with the capacity to deliver (capability), the estimating model produces the schedule, cost, and effort estimate.]
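Read as pseudocode, the diagram is a function from three inputs to three outputs. Here is a minimal sketch of that shape; the complexity multipliers and the staffing rule for turning effort into a schedule are invented for illustration and are not DCG's model.

```python
def basic_estimate(size_fp: float, complexity: str,
                   delivery_rate_fp_pm: float,
                   cost_per_pm: float, team_size: int) -> dict[str, float]:
    """Size + complexity + capacity to deliver -> effort, schedule, cost."""
    # Assumed adjustment: higher complexity lowers the effective delivery rate.
    factor = {"low": 1.15, "average": 1.00, "high": 0.85}[complexity]
    effort_pm = size_fp / (delivery_rate_fp_pm * factor)
    return {
        "effort_person_months": effort_pm,
        "schedule_months": effort_pm / team_size,  # naive even-staffing rule
        "cost": effort_pm * cost_per_pm,
    }
```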
18. Why Is Sizing Important?
    Finding: nine out of ten projects that fail have not been properly sized.
    Consider: when you build a house, you specify all the functions and features you want; these are your requirements. The builder then generates an estimate based on the size (square footage) of your requirements.
    • Size is the key to effectively managing software projects.
19. Sizing Options
    [Chart comparing sizing methods along two axes: organizational-specific vs. industry-defined definitions, and ease of use vs. power/accuracy. Fewer rules, easier to learn, less accurate: hours/days, story points, lines of code, modules, use case points, test cases. More rules, harder to learn, more accurate: COSMIC, NESMA FP, IFPUG Function Points, Mark II. Power increases along the consistency/accuracy axis.]
20. Why Function Points?
    Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.
    • Consistent method
    • Easy to learn
    • Available early in the lifecycle
    • Acceptable level of accuracy
    • Meaningful internally and externally
    • Function point counts have replaced line-of-code counts as a sizing metric that can be used consistently and with a high degree of accuracy.
21. The Function Point Methodology
    The software deliverable is sized based upon the functionality delivered. Five key components are identified based on the logical user view:
    • Inputs
    • Outputs
    • Inquiries
    • Data Stores
    • Interface Files
    [Diagram: inputs, inquiries, and outputs cross the application boundary; data stores sit inside it; an interface file connects to another application.]
22. Simplifying the Methodology
    The formal process:
    1. Identify components
    2. Assess complexity
    3. Apply weightings
    4. Compute function points

    Components    Low    Avg    High
    Data Stores   x 7    x 10   x 15
    Interfaces    x 5    x 7    x 10
    Inputs        x 3    x 4    x 6
    Outputs       x 4    x 5    x 7
    Inquiries     x 3    x 4    x 6
    Total function points = sum of the weighted counts
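The four steps reduce to a weighted sum, shown below as a short Python sketch that uses the slide's weighting table verbatim; the function itself and its input layout are illustrative.

```python
# Weighting table from the slide: component -> complexity -> weight.
WEIGHTS = {
    "data_stores": {"low": 7, "avg": 10, "high": 15},
    "interfaces":  {"low": 5, "avg": 7,  "high": 10},
    "inputs":      {"low": 3, "avg": 4,  "high": 6},
    "outputs":     {"low": 4, "avg": 5,  "high": 7},
    "inquiries":   {"low": 3, "avg": 4,  "high": 6},
}

def function_points(counts: dict[str, dict[str, int]]) -> int:
    """counts maps component -> complexity -> number of components identified."""
    return sum(
        n * WEIGHTS[component][complexity]
        for component, by_complexity in counts.items()
        for complexity, n in by_complexity.items()
    )
```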
23. Common Criticisms of Function Points
    • FP methodology terms are confusing
    • Takes too long to learn; you need an expert
    • Needs too much detailed data
    • Does not reflect the complexity of the application
    • Takes too much time
    • "We tried it before"
24. Simplifying the Methodology
    • Assume complexity to be average, so only the middle column of the weighting table applies: data stores x 10, interfaces x 7, inputs x 4, outputs x 5, inquiries x 4.
25. Exercise: Identify the Functionality
    [Diagram of an accounts payable application: users enter adds and changes to payments and invoices (inputs); purchase order info arrives from the purchase order system (interface); users query payment status (inquiry); vendor and accounts payable files are data stores; users receive paid invoices (output).]
26. Determine the Functional Size
    The FP Lite™ process:
    1. Identify components
    2. Assess complexity
    3. Apply weightings
    4. Compute function points

    Components    Count   Avg Weight   Total
    Data Stores   3       x 10         30
    Interfaces    1       x 7          7
    Inputs        3       x 4          12
    Outputs       1       x 5          5
    Inquiries     1       x 4          4
    Function point size: 58
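Running the function_points() sketch from slide 22 with every component rated average, as FP Lite assumes, reproduces the slide's count:

```python
counts = {
    "data_stores": {"avg": 3},
    "interfaces":  {"avg": 1},
    "inputs":      {"avg": 3},
    "outputs":     {"avg": 1},
    "inquiries":   {"avg": 1},
}
print(function_points(counts))  # 3*10 + 1*7 + 3*4 + 1*5 + 1*4 = 58
```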
27. Estimating Techniques
    • Manual, using organizational baseline data
    • Manual, using standard industry data
    • Automated, using a commercial estimating software package
    [Diagram: size, complexity, and capacity to deliver for project X enter the estimating model as inputs; the model produces the estimate as output.]
28. Estimating Using Delivery Rates
    [Diagram: the requirements for project X (definition) are functionally sized and assessed for complexity; the functional size and profile select a rate of delivery (capability), and the estimating model converts size and rate into the schedule, cost, and effort estimate.]
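As a worked example of rate-based estimating, the sketch below divides functional size by a delivery rate taken from the baseline and converts effort to cost; the $9,000 fully loaded cost per person-month is a hypothetical figure, not from the deck.

```python
def rate_based_estimate(size_fp: float, rate_fp_per_pm: float,
                        cost_per_pm: float = 9_000.0) -> dict[str, float]:
    """Effort = size / delivery rate; cost = effort * loaded labor rate."""
    effort_pm = size_fp / rate_fp_per_pm
    return {
        "effort_person_months": round(effort_pm, 1),
        "cost": round(effort_pm * cost_per_pm, 2),
    }

# The 58 FP purchase order exercise at the baseline rate of 10.7 FP per
# staff month (slide 11):
print(rate_based_estimate(58, 10.7))
# {'effort_person_months': 5.4, 'cost': 48785.05}
```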
29. A Comprehensive Measurement of Capability
    [Diagram: the collection and analysis flow from slide 9, extended: quantitative data (size, effort, duration, defects) and qualitative data (process, methods, skills, tools, management) produce measured performance results and capability profiles, which combine with benchmark data to yield delivery rates and best-practice actions.]
