  • 1. EARNED VALUE IN SOFTWARE: Using Software Metrics & Measures for Earned Value
    Developed by: Naval Air Systems Command; AIR-4.2 Cost Department; AIR-4.2.3 Earned Value Management Division; AIR-4.2.1.4 Advanced Concepts/Special Studies & Databases Branch; Patuxent River, MD. May 2004.
    Presented by: Brenda Bizier, AIR-4.2.3 Integrated Project Management Division & IPM Process Owner, Earned Value Management; Amy Houle Caruso, E-6 Block I Modification Team Lead, Program Management/Software Management.
    NAVAIR Public Release 09-33. Distribution Statement A – "Approved for public release; distribution is unlimited."
  • 2. Reality
    For more and more DOD systems, software development will consume the majority of resources, schedule, and cost while generating the bulk of program risk.
  • 3. Outline
    – Current Issues
    – WBS
    – Software Measures
    – Rework
    – Conclusion: Final Thoughts
  • 4. Current Challenges in S/W Earned Value
    – Excessive use of Level of Effort (LOE)
    – Crediting full earned value for tasks and requirements even though not all of them have been completed
    – Basing earned value on metrics and measures that do not directly relate to implementation of the software requirement
    – Basing earned value on metrics and measures that are obsolete or inaccurate
    – Using EVM in isolation rather than in conjunction with other software measurements and metrics to evaluate program status
    – Failure to consider rework in developing the Performance Measurement Baseline (PMB)
    – Failure to correlate earned value with Technical Performance Measurement (TPM)
  • 5. Needing to Know What Is Important (Example)
    Basing earned value on metrics and measures that do not directly relate to implementation of the software requirement.
    – Requirement: Build 100 miles of highway in 10 months for $10M.
    – Contractor estimate: 10,000 loads of fill, concrete, and other material required to complete the requirement.
    – Status: After 5 months, 30 miles have been completed and 5,000 truckloads have been used.
    How is the project doing? (Or: how do we know we are "x%" complete, and what are the "artifacts" to prove it?)
  • 6. Needing to Know What Is Important (Example, Continued)
    Same requirement, estimate, and status as slide 5. If the measure for EV was defined as follows, earned value would be reported as:
    – Customer measure: number of miles completed. BCWP shows the project 67% over cost and 40% behind schedule.
    – Contractor measure: number of truckloads used. BCWP shows the project on cost and on schedule.
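A small worked sketch (Python) reproduces the divergence using the slide's figures; the $5M actual cost at the five-month point is an assumption, since the slide does not state actuals:

```python
# Highway example from slides 5-6: identical project status, two EV measures.
BUDGET = 10_000_000   # BAC: $10M for 100 miles over 10 months
BCWS = 5_000_000      # planned value at month 5 (half the scheduled work)
ACWP = 5_000_000      # assumed actual cost at month 5 (not stated on the slide)

def report(measure: str, pct_complete: float) -> None:
    bcwp = BUDGET * pct_complete   # earned value under this measure
    cv = (bcwp - ACWP) / bcwp      # cost variance as a fraction of earned value
    sv = (bcwp - BCWS) / BCWS      # schedule variance as a fraction of planned value
    print(f"{measure}: BCWP=${bcwp:,.0f}  CV={cv:+.0%}  SV={sv:+.0%}")

report("miles completed (30/100)", 30 / 100)          # ~67% over cost, 40% behind schedule
report("truckloads used (5000/10000)", 5000 / 10000)  # on cost, on schedule
```

The same project looks healthy or deeply troubled depending solely on whether the measure driving BCWP relates directly to the deliverable (miles) or only to consumed inputs (truckloads).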
  • 7. Things You Want to Know Sooner Rather than Later
    – Requirements are the primary cost driver of software development efforts.
      – Software requirements tend to grow by 1–5% per month between the end of requirements analysis and the start of system and integration testing, with the national average being about 2% (illustrated in the sketch after this slide).
      – Sometimes requirements changes continue after testing begins.
      – According to Capers Jones, approximately 20% of all defects in software are caused by poorly defined and contradictory requirements.
    – Size is often underestimated, so EV results rest on overoptimistic costs and schedules.
    – Just because a developer's earned value system is compliant with EVMS does not mean that the base and derived measures driving it will provide adequate information on program status.
      – It is essential that the customer contractually establish a measurement program ensuring that measures capable of identifying deviations from the program's cost, schedule, and technical objectives are delivered to the customer.
      – The contract should be implemented so that the Measurement IPT is able to modify the program measures as the information needs of the program change over its life cycle.
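A back-of-the-envelope sketch of the growth figures above; the 1,000-requirement baseline and the 18-month gap between requirements analysis and integration test are hypothetical, chosen only to make the compounding concrete:

```python
# Compound requirements creep at the slide's 1-5%/month rates (national avg ~2%).
baseline = 1000   # hypothetical requirement count at end of requirements analysis
months = 18       # assumed gap until start of system/integration testing

for monthly_growth in (0.01, 0.02, 0.05):
    final = baseline * (1 + monthly_growth) ** months
    print(f"{monthly_growth:.0%}/month -> {final:,.0f} requirements "
          f"(+{final / baseline - 1:.0%})")
# At the ~2% national average, scope grows roughly 43% before testing even
# starts, quietly invalidating a PMB sized to the original requirement count.
```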
  • 8. Easier Said Than Done (a.k.a. DUH!)
    – An EV system based on a robust, flexible measurement program that adapts to current program needs will be much more effective than one that is not.
    – When selecting measures and setting up an earned value system, it is also essential to determine the benefit to the program of tracking each specific risk or information need.
    – All of these measures are directly or indirectly required of organizations that have achieved SW-CMM® or CMMI® Level III. NAVAIR requires that developers working on ACAT I, II, III, and IV software-intensive systems have achieved Level III certification. Thus asking the developer to change the measures driving its earned value system should have minimal or no impact on the cost of implementing its earned value and measurement program.
    – A software development program should contractually establish quality criteria defining the maximum number of defects of each priority for the effort (sketched below).
    – A reasonable amount of change should be built into the project plan based on the developer's and acquisition organization's prior history.
    – Tasks being reviewed must be broken down so that they can be completed in less than a month, and preferably less than a week.
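As an illustration of the quality-criteria point, a minimal sketch of checking defect counts against contractually established ceilings; the priority names, ceilings, and counts are all hypothetical:

```python
# Hypothetical contractual quality criteria: maximum open defects per priority.
MAX_OPEN = {"priority_1": 0, "priority_2": 5, "priority_3": 25}

def meets_quality_criteria(open_defects: dict) -> bool:
    """True only if every priority class is at or under its contractual ceiling."""
    return all(open_defects.get(p, 0) <= ceiling for p, ceiling in MAX_OPEN.items())

# Illustrative status as it might come from a defect tracker:
status = {"priority_1": 0, "priority_2": 7, "priority_3": 12}
print(meets_quality_criteria(status))  # False: the priority-2 ceiling is exceeded
```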
  • 9. EV Is a Part of Risk Management
    – If effective risk management, measurement, and earned value programs are combined, they can identify the occurrence of risks earlier, when effective corrective action that minimizes perturbation to the program plan is still likely to be possible.
    – Requirements deferral is always the result of a risk occurring.
    – The more effective a program is at managing and tracking its risks, the less functionality (the fewer requirements) will be deferred.
  • 10. Types of Code and EV Issues
    – New code
      1. Size is often underestimated. This results in overoptimistic costs and schedules, with resulting low Cost Performance Index (CPI) and Schedule Performance Index (SPI); see the sketch after this list.
      2. Generally the most expensive type of code to produce: every phase of software development must be implemented.
    – Reuse code
      1. The amount of functionality that can be gained through software reuse is often overestimated. This reduces the amount of reuse and increases new and/or modified code, resulting in higher cost and longer schedules. CPI and SPI degrade.
      2. Cost and schedule overruns in development efforts based on software reuse integration can often be attributed to developer unfamiliarity with the code, poor documentation, and low code quality.
    – Modified code: The amount of modified code that can be used in a system is often overestimated, and/or the amount of modification the code will require is underestimated. This increases the cost and schedule for implementing and integrating the modified code (or additional new code), with subsequent degradation in CPI and SPI.
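The new-code point can be made concrete. A sketch with hypothetical numbers, assuming effort scales roughly with delivered SLOC and that earned work is measured against the true (discovered) scope rather than the original estimate:

```python
# Hypothetical build: effort planned against 100 KSLOC of new code that turns
# out to require 130 KSLOC.
bac = 4_000_000   # budget at completion, sized to the 100 KSLOC estimate
true_ksloc = 130  # actual size discovered during development
done_ksloc = 50   # produced by the mid-schedule point

bcws = bac * 0.5                        # planned value at mid-schedule
acwp = bac * 0.5                        # actuals, assuming on-plan spending
bcwp = bac * (done_ksloc / true_ksloc)  # value earned against the true scope

cpi, spi = bcwp / acwp, bcwp / bcws
print(f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC=${bac / cpi:,.0f}")
# CPI = SPI ~= 0.77: the 30% size growth surfaces directly as degraded indices
# and a ~$5.2M estimate at completion against the $4M budget.
```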
  • 11. Types of Code and EV Issues (Continued)
    – Deleted code: Similar to the problems with modified code. Often the amount of code to be deleted, and the amount of testing needed after the deletions, is underestimated as part of the modification effort. This leads to higher cost and schedule and lower CPI and SPI.
    – Automatically generated code: Automatic generation does not produce free code. Requirements analysis, design, and testing are still necessary for automatically generated code.
    – Converted/ported code: Automatically converted/translated/ported code may require extensive manual correction to get it to run in the new environment. If it was automatically generated without comments, it is likely to be much more difficult, costly, and time-consuming to understand and modify or correct.
    – Commercial Off-the-Shelf (COTS)
      1. Selection of a COTS product must consider not only the technical merit of the product but also the commercial viability of the vendor. If the vendor's long-term prospects are questionable, a mitigation plan for replacing the product must be developed and additional funding to cover the risk built into the project.
      2. The project plan must include plans for upgrading any COTS tools. Failure to do so is likely to result in unplanned costs for product upgrades, integration, and testing.
  • 12. Software WBS Challenges
    – Obscuring the software development effort by burying it deep within the system WBS, often as a subcomponent of much cheaper, lower-risk hardware efforts, only aggravates these problems.
    – The deeper the software is buried in the WBS, the deeper the contractor must report, and the greater the burden placed on the contractor's financial tracking and reporting system.
    – It is important to keep these issues in mind when developing the Program and Contract WBS.
  • 13. WBS Challenges (WBS Example)
    [Figure: two Contract WBS trees for a Fire Control radar, levels 1(3) through 5(7). In the first, Applications Software and Systems Software (each with Build 1, Build 2, and Integration Testing) are buried as subcomponents of the Receiver; in the second, Receiver Applications Software and Receiver Systems Software sit at the same level as the Receiver, Transmitter, Antenna, Radar Integration, and Platform Integration elements.]
    Software could be placed on a level parallel with the hardware on which it executes, or as a subcomponent of a higher-level element of the entire system, as shown in the figure above.
  • 14. Comparison of Software Life Cycle Phases vs. Development Functions
    Phases (columns): System Definition; Software Requirements Analysis; Design; Code & Unit Test; Integration & Test; Operation & Maintenance.
    – Software Project Management: formulate initial plan → update plan → execute plan → manage software changes.
    – Software Development: define system concept, allocate requirements to software → analyze & refine software requirements → define how the software is structured and how it works → write the code, test individual units of code → combine units & check out → change the software.
    – Software Integration & Test: analyze testability (during definition and requirements analysis) → plan for tests, define tests → develop test procedures → test integrated units → retest as required.
    – Software Development Facilities: plan for facilities, identify long-lead items → execute plan → maintain facilities.
    – Software QA/CM: plan for QA & CM → execute QA & CM plans → maintain quality & configuration control.
  • 15. Additional Insight
    Breaking the development effort/WBS into phases results in tasks of shorter duration, contributing to EVM accuracy and early warning of project problems.
  • 16. Base vs. Derived Measures
    – Base measures can be thought of as raw data. On their own, they provide little information on the status of the project.
    – A derived measure combines data from two or more base measures to provide insight into the actual status of the project, and the basis upon which alternative corrective courses of action can be developed and program management decisions made (see the sketch after this slide).
    – Earned value is a formally defined derived measure. It is one of many measures that can be used to evaluate the status of a project during the software life cycle. Some measures are useful throughout the life cycle; others apply only to specific tasks within specific development phases. In the majority of cases, earned value is determined from other derived software measures. No single measure should ever be used as the sole basis for evaluating status or making program management decisions.
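A minimal sketch of the base/derived distinction; the counts and hours are hypothetical:

```python
# Base measures: raw counts collected directly from project data sources.
requirements_total = 200     # from the requirements database
requirements_verified = 80   # requirements that have passed verification
hours_budgeted = 10_000      # budgeted hours for the full requirement set
hours_spent = 5_500          # actuals from the time-keeping system

# Derived measures: combinations of base measures that carry real status.
bcwp = hours_budgeted * (requirements_verified / requirements_total)  # earned hours
cpi = bcwp / hours_spent
print(f"Earned {bcwp:,.0f} of {hours_budgeted:,} budgeted hours; CPI={cpi:.2f}")
# No single base measure above says whether the project is in trouble;
# the derived CPI of ~0.73 does.
```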
  • 17. Software Measures
    – Level of Effort (LOE)
    – Schedule Milestones
    – Software Defects
    – Test Procedures/Cases
    – Modules
    – Source Lines of Code (SLOC) / Equivalent SLOC (ESLOC)
    – Function Points
    – Technical Performance Measures (TPMs)
    – Requirements
  • 18. Requirements Measure Effectiveness – GOOD
    – Since requirements are the ultimate driver of software cost and schedule, they are also an excellent choice for determining earned value.
    – Requirements:
      – Are applicable to all phases of system and software development.
      – Are directly related to producing the functionality the customer wants in a new system.
    – By contrast, metrics/measures that are only indirectly related to implementing the desired functionality can inject errors into the earned value calculations.
    If requirements are not considered when determining earned value, then earned value will not reflect actual progress in meeting the requirements. (A sketch of requirements-based earning follows.)
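A sketch of one way requirements-based earning could work, crediting each requirement's budget only on verification; the IDs, weights, and statuses are illustrative, not from the briefing. This also avoids the "full credit for incomplete tasks" problem from slide 4:

```python
# Each requirement carries a budget weight; value is earned only on verification.
requirements = [
    {"id": "SRS-001", "budget_hours": 120, "verified": True},
    {"id": "SRS-002", "budget_hours": 300, "verified": True},
    {"id": "SRS-003", "budget_hours": 80,  "verified": False},  # coded, not yet tested
    {"id": "SRS-004", "budget_hours": 200, "verified": False},
]

total_budget = sum(r["budget_hours"] for r in requirements)
bcwp = sum(r["budget_hours"] for r in requirements if r["verified"])
print(f"Earned {bcwp}/{total_budget} hours ({bcwp / total_budget:.0%}) "
      f"on verified requirements")
# SRS-003 earns nothing until it passes test: no credit for "done except testing".
```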
  • 19. S/W Measures Summary [summary table not captured in the transcript]
  • 20. Measures for Phases [table not captured in the transcript]
  • 21. Rework
    – It is unreasonable to assume that no defects will be found in any of the requirements, design, or code.
    – Time and effort for rework is usually estimated from the developer's expected number of defects and the average time required to correct such defects.
    – If rework phases are not planned for, the earned value system can be severely disrupted when the team tries to improvise rework tracking on the spur of the moment.
    – Programmatically, any project plan that does not include time for rework is unexecutable and calls into question the maturity of the developing organization.
    – The developer must take into consideration that some percentage of the requirements will not pass testing.
  • 22. Rework (Cont.)
    – Rework must include not only time to correct the flaw in the requirements, design, and/or code that caused the problem, but also time to retest the corrected software. In a multi-release/build development, this may mean that some or all of the failed requirements are rolled into the next build/release.
    – Rework should be planned and tracked in work packages separate from the initial development of requirements, design, and code. In planning incremental builds, every build must include budget and schedule for rework of requirements, design, and code to correct defects found in the current and previous builds.
    – To ensure adequate budget and period of performance, the planning assumptions for rework should include the planned rate or number of expected defects and the budgeted resources to fix them. Failure to establish a baseline plan for rework and to objectively measure rework progress has caused many projects to spin out of control.
    – All of this must be accounted for in the project plan (a budgeting sketch follows).
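A sketch of baselining rework as its own work packages, following the slides' logic of expected defect counts times average fix-and-retest effort; the counts, hours, and labor rate are hypothetical stand-ins for values a developer would draw from its own defect history:

```python
# Plan rework explicitly as its own work packages, not via management reserve.
expected_defects = {"build_1": 120, "build_2": 90}  # assumed, from defect history
avg_fix_and_retest_hours = 6.0                      # assumed average per defect
labor_rate = 150.0                                  # $/hour, hypothetical

for build, defects in expected_defects.items():
    hours = defects * avg_fix_and_retest_hours
    print(f"{build} rework package: {hours:,.0f} h, ${hours * labor_rate:,.0f}")
# Tracking actual defect arrivals against these planned counts gives an
# objective basis for earning value inside the rework packages themselves.
```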
  • 23. Conclusion: Final Thoughts
    – The ability to manage software is dictated by visibility into the effort.
      – Where is the software in the WBS?
      – Is it reported by phase and build, to optimize the ability to track requirements and functionality?
    – Software measures not ideal for taking EV still help answer "Where are we?", "How much will it cost?", and "How long will it take?"
      – Defects (finding and burning down)
      – Headcounts (look for sharp drop-offs in support)
    – Maintain visibility into subcontracted and "black" software efforts.
    – Rework is part of the plan (baseline), not managed through Management Reserve.
  • 24. For More Information
    – Brenda Bizier, (301) 757-2432, Brenda.bizier@navy.mil
    – Amy Houle Caruso, (301) 757-4005, amyhoule.caruso@navy.mil
    – NAVAIR Using Software Metrics and Measurements for Earned Value Toolkit: https://acc.dau.mil/CommunityBrowser.aspx?id=19591&lang=en-US