Presentation on using Earned Value in Software Development Projects at the Pittsburgh SPIN, September 2010

    Presentation Transcript

    • Earned Value for Software Development Is NOT a Myth! Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213. William Nichols and James McHale, September 2010. © 2010 Carnegie Mellon University
    • Agenda: Why EV won't work (for software development projects); How to make EV work; Pulling it all together (TSP). Earned Value for Software Development, SPIN 08-Sept 2010, © 2008-09 Carnegie Mellon University
    • Why EV won’t work? Earned Value for Software Development SPIN 08-Sept 2010 3 © 2008-09 Carnegie Mellon University
    • Why EV won’t work for softwareSoftware development work is hard to estimate with sufficient accuracy. Earned Value for Software Development SPIN 08-Sept 2010 4 © 2008-09 Carnegie Mellon University
    • Why EV won’t work for softwareSoftware project work is hard to estimate with sufficient accuracy.You can’t get an accurate progress report from a software developer. Earned Value for Software Development SPIN 08-Sept 2010 5 © 2008-09 Carnegie Mellon University
    • Why EV won’t work for softwareSoftware development work is hard to estimate with sufficient accuracy.You can’t get an accurate progress report from a software developer.Until the software tests successfully we don’t know that we are done. Earned Value for Software Development SPIN 08-Sept 2010 6 © 2008-09 Carnegie Mellon University
    • Why EV won’t work for softwareSoftware development work is hard to estimate with sufficient accuracy.You can’t get an accurate progress report from a software developer.Until the software tests successfully we don’t know that we are done.Software is iterative, work is revised a number of times before complete Earned Value for Software Development SPIN 08-Sept 2010 7 © 2008-09 Carnegie Mellon University
    • Why EV won’t work for softwareSoftware development work is hard to estimate with sufficient accuracy.You can’t get an accurate progress report from a software developer.Until the software tests successfully we don’t know that we are done.Software is iterative, work is revised a number of times before completeSoftware project have imprecise requirements, scope isn’t fixed Earned Value for Software Development SPIN 08-Sept 2010 8 © 2008-09 Carnegie Mellon University
    • So what do you do? The 95% rule for ETC? Developers are always 95% done; just ask them. So how much time remains?
    • So what do you do? The 95% rule for ETC? It takes "95% of the estimated schedule/cost to complete 95% of the work and ANOTHER 95% to finish it!"
    • So what do you do? The 90% rule for ETC? It takes "90% of the estimated schedule/cost to complete 90% of the work and ANOTHER 90% to finish it!" DON'T do this!
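    The slide's rule of thumb can be made concrete with one line of arithmetic. A minimal sketch; the 100-hour estimate is an invented figure:

    ```python
    # The "90% rule" worked out: spending 90% of the estimate to get
    # "90% done" and then ANOTHER 90% to finish means the real cost is
    # 1.8x the plan. The 100-hour estimate is purely illustrative.
    estimate = 100.0                           # planned hours
    spent_to_reach_90_percent = 0.9 * estimate
    spent_to_actually_finish = 0.9 * estimate  # "ANOTHER 90%"
    actual = spent_to_reach_90_percent + spent_to_actually_finish
    overrun = actual / estimate
    print(overrun)                             # 1.8, i.e. an 80% overrun
    ```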
    • How to make EV work?
    • The Four Core Requirements for Earned Value†: a credible schedule of the planned work; a time-phased budget for the planned work; a means of collecting progress to plan of the work performed; a means of collecting cost information for the work performed. † The Earned Value Management Maturity Model®, Ray W. Stratton, Management Concepts, 2006.
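    Given those four inputs, the standard EV indices fall out directly. A hedged sketch: the variable names follow the common BCWS/BCWP/ACWP convention, and the sample numbers are invented, not taken from the slides.

    ```python
    def ev_metrics(bcws, bcwp, acwp):
        """Return (cost variance, schedule variance, CPI, SPI).

        bcws: budgeted cost of work scheduled (planned value)
        bcwp: budgeted cost of work performed (earned value)
        acwp: actual cost of work performed
        """
        cv = bcwp - acwp    # cost variance: negative means over budget
        sv = bcwp - bcws    # schedule variance: negative means behind plan
        cpi = bcwp / acwp   # cost performance index
        spi = bcwp / bcws   # schedule performance index
        return cv, sv, cpi, spi

    # Illustrative project: planned 100 task hours of work, earned 80, spent 90.
    cv, sv, cpi, spi = ev_metrics(bcws=100.0, bcwp=80.0, acwp=90.0)
    print(cv, sv, spi)      # -10.0 -20.0 0.8
    ```

    A CPI or SPI below 1.0 signals trouble early, which is the whole point of collecting the four inputs in the first place.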
    • Software development work is hard to estimate with sufficient accuracy.
    • Estimate Accurately. Software development work is hard to estimate accurately? Start by defining what "done" looks like. Decompose the work into work packages. Learn how to estimate a work package. Use history as a guide.
    • [Chart] Example: text pages versus writing time (hours); linear fit y = 2.4366x + 4.1297, R² = 0.6094.
    • [Chart] Example: C++ LOC versus development time (minutes); linear fit y = 6.4537x - 252.94, R² = 0.9582.
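    The fitted lines on these charts are ordinary least-squares regressions of effort on size, which is what "use history as a guide" amounts to in practice. A minimal sketch; the historical (LOC, minutes) points below are invented, not the data behind the slide:

    ```python
    def fit_line(xs, ys):
        """Least-squares fit: returns (intercept, slope) for y ~ a + b*x."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        a = my - b * mx
        return a, b

    # Invented history: size (C++ LOC) vs. actual development minutes.
    loc     = [120, 250, 400, 530, 700]
    minutes = [520, 1300, 2300, 3100, 4200]
    a, b = fit_line(loc, minutes)

    # Project the effort for a new 300-LOC work package from history.
    estimate = a + b * 300
    print(round(estimate))   # about 1648 minutes
    ```

    The regression also makes the estimate auditable: when the next actual comes in, it becomes one more history point and the line is refit.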
    • [Chart] PSP Estimating Accuracy: effort-estimation-accuracy histograms by PSP level. At PSP0 the majority under-estimate; at PSP1 over-estimates and under-estimates balance; at PSP2 the distribution is much tighter around zero.
    • [Chart] Improving Estimating Accuracy: mean effort misestimation, (Estimated Minutes - Actual Minutes) / Estimated Minutes, averaged by PSP level, trends toward zero from PSP0 through PSP2 across program numbers; 298 developers.
    • You can’t get an accurateprogress report from asoftware developer? Earned Value for Software Development SPIN 08-Sept 2010 20 © 2008-09 Carnegie Mellon University
    • You can’t get an accurate progress report from asoftware developer?Make a credible plan, and track to the plan.Break the work package into smaller steps.For each step, now what DONE looks like. Have a standard.Use history to estimate the effort in each step. Earned Value for Software Development SPIN 08-Sept 2010 21 © 2008-09 Carnegie Mellon University
    • You can’t get an accurate progress report from asoftware developer?Make a credible plan, and track to the plan.Break the work package into smaller steps.For each step, now what DONE looks like. Have a standard.Use history to estimate the effort in each step. Earned Value for Software Development SPIN 08-Sept 2010 22 © 2008-09 Carnegie Mellon University
    • How do you break a work package into steps? Use a process: Plan, Build, Personal review, Team inspection, Test, Field use.
    • What is a Process? A process is a defined and measured set of steps for doing a job. A process guides your work. A process is usually defined for a job that is done multiple times. A process provides a foundation for planning. • A process is a template, a generic set of steps. • A plan is a set of steps for a specific job, plus other information such as effort, costs, and dates.
    • TSP Process Elements. Scripts document the process entry criteria, phases/steps, and exit criteria; their purpose is to provide expert-level guidance as you use the process. Measures measure the process and the product; they provide insight into how the process is working and the status of the work. Forms provide a convenient and consistent framework for gathering and retaining data. Standards provide consistent definitions that guide the work and the gathering of data. [The slide illustrates these with a PSP planning script, a project plan summary form, and time and defect recording logs.]
    • Example Process Script: Requirements.
    • Until the software tests successfully we don't know that we are done? Then you are done when the tests complete successfully! But test is highly variable. So make a quality plan: plan to remove the defects you've put in. Defects can result in • incorrect functionality • poor operation • improper installation • confusing or incorrect documentation • error-prone modification and enhancement
    • Why Focus on Quality? The fastest way to finish is to do it right the first time! Do it right, or do it over. Completed tasks should be verified with performance measures. This links to TPM, Technical Performance Measures.
    • Why Focus on Defects? In most engineering organizations, a significant share of resources is dedicated to fixing defects, often more than 40% of cost and schedule. Defects are very costly; it is beneficial to find and remove them early in the process. The reasons for managing defects are to • produce better products • improve your ability to develop products on time and within budget
    • Quality Measures. The TSP uses three quality measures for planning and tracking: 1. defect injection and removal rates, 2. defect density, 3. review rates.
    • Defect Injection and Removal Rates. Defect injection rate is defined as the number of defects injected per hour while performing activities in a process phase. Defect removal rate is defined as the number of defects removed per hour while performing activities in a process phase. Typical defect injection phases include requirements and design. Typical defect removal phases include reviews, inspections, and testing.
    • Defect Density. Defect density is the ratio of the number of defects removed to the product size. A common defect density measure in software projects is the number of defects found per thousand lines of code (defects/KLOC). An example of another defect density measure, used by the SEI when developing training slides, is the number of defects found per slide. Defect density is also a good product planning, tracking, and predictive measure.
    • Review Rates. Review rate is the ratio between the size of a product and the time spent reviewing or inspecting it. A common example in software projects is lines of code reviewed per hour (LOC/hr). Another example is requirements pages reviewed per hour (Req Pages/hr). Review rate is the control variable for inspections and reviews. It is used to • allocate appropriate time during planning • predict the effectiveness of the review or inspection
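    All three quality measures are simple ratios, so they come straight from the time and defect logs. A sketch using the slides' definitions; the phase data below is invented for illustration:

    ```python
    def injection_rate(defects_injected, phase_hours):
        """Defects injected per hour while working in a phase."""
        return defects_injected / phase_hours

    def removal_rate(defects_removed, phase_hours):
        """Defects removed per hour while working in a phase."""
        return defects_removed / phase_hours

    def defect_density(defects_removed, kloc):
        """Defects removed per thousand lines of code."""
        return defects_removed / kloc

    def review_rate(loc_reviewed, review_hours):
        """LOC reviewed per hour: the control variable for reviews."""
        return loc_reviewed / review_hours

    # Invented example: design injects 9 defects in 6 hours; a code review
    # covers 1,200 LOC in 6 hours and removes 18 defects; product is 1.2 KLOC.
    print(injection_rate(9, 6))      # 1.5 defects/hour
    print(removal_rate(18, 6))       # 3.0 defects/hour
    print(defect_density(18, 1.2))   # 15.0 defects/KLOC
    print(review_rate(1200, 6))      # 200.0 LOC/hour
    ```

    In planning, these ratios run the other way: a planned review of 1,200 LOC at 200 LOC/hr implies budgeting 6 hours for it.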
    • Defect Injection and Removal. In your process, you will have activities that • inject defects • remove defects. Defect injection typically occurs when you • determine the job requirements • specify the product • build the product. Defect removal typically occurs when you • review the products • test the products • use the products. [Diagram: Plan, Build, Personal review, Team inspection, Test, Field use.]
    • Defect Removal Techniques. Reviews (inspections, walkthroughs, personal reviews) • examine the product or interim development artifacts of the product • find and eliminate defects. Testing • exercises the product or parts of the product • proves that the product works correctly • identifies potential defects or symptoms. In many cases, projects rely on testing to find and fix defects; when this happens, many defects are found in the field by the customer.
    • [Chart] PSP Quality Results: mean defects per KLOC removed in compile and test, by program number and PSP level. The mean falls steadily from PSP0 through PSP2; 298 developers.
    • The ‘Worst’ as Good as the ‘Best’? Compile and Test Defects - from PSP Training 810 developers 250 200Defects/KLOC 1st Quartile 150 2nd Quartile 100 3rd Quartile 4th Quartile 50 Defect 0 reduction 1Q: 80.4% 10 1 2 3 4 5 6 7 8 9 2Q: 79.0% og og og og og og og og og og Pr Pr Pr Pr Pr Pr Pr Pr Pr 3Q: 78.5% Pr 4Q: 77.6% PSP Assignment Number Earned Value for Software Development SPIN 08-Sept 2010 37 © 2008-09 Carnegie Mellon University
    • Software is iterative; work is revised a number of times before complete?
    • Software is iterative, work is revised a number of times before complete? Incremental and iterative development isn't a bug, it's a feature! In DoD 5000.02's procurement cycle, incremental and iterative approaches are used. This is a fact on almost any project (ever change a leaky faucet?). Apply the appropriate method to deal with this.
    • TSP Cyclic Development Strategy. TSP favors an iterative or cyclic development strategy: • develop in increments • use multiple cycles • work ahead. Projects can start on any phase or cycle. Each phase or cycle starts with a launch or re-launch. TSP permits whatever process structure makes the most business and technical sense. [Diagram: Launch, Re-launch, Development phase or cycle, Phase or cycle postmortem, Project postmortem.]
    • Core Success Criteria for Earned Value and Software Development. Define the outcomes of the work effort in some tangible way. Define the way progress is going to be measured: 0/100% for each work task. Either it's done or it's not done; there is no "we're trying real hard". Define the tangible evidence, the date the tangible evidence is expected, and the associated costs.
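    Under the 0/100 rule, earned value is just the sum of planned hours for completed tasks; partially done tasks contribute nothing. A minimal sketch with hypothetical tasks:

    ```python
    def earned_value_percent(tasks):
        """Percent of planned value earned under the 0/100 rule.

        tasks: list of (planned_hours, done) pairs. A task earns its full
        planned hours only when done; there is no partial credit.
        """
        total = sum(hours for hours, _ in tasks)
        earned = sum(hours for hours, done in tasks if done)
        return 100.0 * earned / total

    # Hypothetical plan: the two 10-hour tasks are "95% coded" but not
    # yet through test, so they earn nothing yet.
    tasks = [(8.0, True), (12.0, True), (10.0, False), (10.0, False)]
    print(earned_value_percent(tasks))   # 50.0
    ```

    The binary rule trades granularity for honesty: progress can only be claimed against tangible, completed evidence.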
    • Software projects have imprecise requirements; scope isn't fixed.
    • Use a Planning Process. Apply the appropriate method to deal with this. Usually most portions of the project can be planned. A common software development project mistake is staffing up too soon!
    • TSP: Pulling it all together
    • Plan Execution -1. Tasks in personal task plans are ordered according to team priorities. Developers adjust the order and work ahead as needed. Developers select a task planned for the current week and begin tracking time.
    • Plan Execution -2. While working, developers also record • any defects they find • any risks they identify • any process improvement ideas. When they are done, they • stop time tracking • mark the task completed if done • record size if the process step calls for it.
    • Monitor and Control the Plan: Script WEEK. Team members meet each week to assess progress. • Role managers present their evaluation of the team's plan and data. • Goal owners present status on product and business objectives. • Risk owners present status on risk mitigation plans and new risks. • Team members present status on their plans. Plan deviations are addressed each week; significant deviations like new requirements trigger a replan. Performance data reviewed: baseline plan value, plan value, earned value, predicted earned value, earned value trend, plan task hours, actual task hours, tasks/milestones completed, tasks/milestones past due, tasks/milestones due in the next 2 weeks, effort against incomplete tasks, estimation accuracy, review and inspection rates, injection rates, removal rates, time-in-phase ratios, phase and process yield, defect density, quality profile (QP), QP index, percent defect free, defect removal profile, and plan-to-actual defects injected/removed.
    • Form Week -1. Teams use Form Week to review status at their weekly meetings. [Screenshot: TSP Week Summary for team Voyager, status for week 11 (1/22/2007). Baseline 660.8 task hours vs. current plan 745.5 (+12.8%); earned value this week 0.7 actual vs. 5.6 planned; earned value this cycle to date 22.0 vs. 43.8 planned; project end dates: baseline 3/19/2007, plan 3/26/2007, predicted 7/30/2007.]
    • Form Week -2. The top of Form Week displays a summary of current status for any week up to the current week. [Screenshot: the same week-11 summary block shown on the previous slide.]
    • Form Week -3. The bottom half of Form Week displays the status of open milestones, tasks completed in the selected week, and tasks due in the next two weeks. [Screenshot: the milestone and task detail rows of Form WEEK.]
    • Schedule Management -1. TSP teams routinely meet their schedule commitments. They use earned value management, task hour management, and quality management at the team and personal level to help manage schedule. Teams monitor earned value per week and cumulative earned value for the cycle. [Form WEEK excerpt: earned value this week 0.7 actual vs. 5.6 planned; this cycle to date 22.0 vs. 43.8 planned; predicted end date 7/30/2007 vs. plan 3/26/2007.]
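    The "predicted" end date on the form can be derived by projecting the remaining earned value forward at the historical EV-per-week rate. A hedged sketch; this function and its numbers are illustrative, not the TSP tool's actual algorithm:

    ```python
    def predicted_finish_week(current_week, ev_to_date, total_ev=100.0):
        """Project the week the plan completes, assuming the historical
        earned-value rate continues unchanged."""
        rate = ev_to_date / current_week     # EV points earned per week
        remaining = total_ev - ev_to_date
        return current_week + remaining / rate

    # Mirroring the form's style: 22 EV points earned by week 11 is a
    # rate of 2 EV/week, so the remaining 78 points need 39 more weeks.
    print(predicted_finish_week(current_week=11, ev_to_date=22.0))   # 50.0
    ```

    This is why a team at week 11 of a 20-week plan with only 22% earned already knows, weeks early, that the committed date is not credible without a replan.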
    • Schedule Management -2. Intuit's 2007 release of QuickBooks met every major milestone and delivered 33% more functionality than planned. First-time TSP projects at Microsoft had a 10-times better mean schedule error than non-TSP projects at Microsoft, as reflected in the following table.

      Microsoft Schedule Results   Non-TSP Projects   TSP Projects
      Released on time             42%                66%
      Average days late            25                 6
      Mean schedule error          10%                1%
      Sample size                  80                 15
      Source: Microsoft
    • Reporting to Higher-Level Management: Script STATUS. The team presents status to management and the customer at specified intervals • weekly, bi-weekly, or monthly. Project status: • earned value and projection • task hours • milestones planned/completed • product quality indicators. Project risks: • status of existing risks • newly identified risks.
    • Questions?
    • Contact Information. Jim McHale: jdm@sei.cmu.edu. Bill Nichols: wrn@sei.cmu.edu.