Software Management Metrics Herman P. Schultz
Transcript

  • 1. Software Management Metrics Herman P. Schultz 1988 EEL6887: Software Engineering Chi-Hwa J Marcos 3/29/2006
  • 2. Reference Sources
    • Herman P. Schultz, “Software Management Metrics”, Hanscom AFB, MA 1988
    • David L. Hallowell, “Six Sigma Software Metrics”, iSixSigma LLC, http://software.isixsigma.com/library/content/c030910a.asp
    • Karl E. Wiegers, “A Software Metrics Primer”, Software Development, July 1999, http://www.processimpact.com/articles/metrics_primer.pdf
  • 3. Overview
    • Introduction
    • Coverage
    • Reporting
    • Analysis
      • Correlation
      • Extrapolation
    • Software Size Metric
    • Software Personnel Metric
    • Software Volatility Metric
    • Computer Resource Utilization Metric
    • Schedule Progress Metric
    • Metric Tool Examples
    • Summary
    • Conclusion
  • 4. Introduction
    • “metric -- A quantitative measure of the degree to which a system, component, or process possesses a given attribute.” [IEEE Std 610.12-1990]
    • This report is the result of approximately three years of government and industry experience using and analyzing metrics. Metrics data can detect potential problems in a software project while there is still time to resolve them. These potential problems may impact cost and schedule.
  • 5. Coverage
    • Metrics should cover all phases of software development.
    • Metrics can cover some development phases more than once.
    • Multiple coverage provides better visibility into each development phase.
    • Covering multiple phases allows consistency checks among metrics.
    • Metrics address two aspects of software development:
      • Progress metrics track deviations between planned and actual progress.
      • Planning metrics track factors that affect software development progress.
  • 6. Reporting
    • Recommends pre-Program Management Review (PMR) screening
    • For example:
      • Deliver metrics to government at least one week prior to PMR.
      • Discuss metrics during Technical Interchange Meeting (TIM).
      • Discuss TIM results with System Program Office (SPO) to separate issues between PMR and TIMs.
    • Metrics presented at PMR provide management with visibility into potential cost and schedule problems.
  • 7. Analysis
    • Metrics provide a means of evaluating the credibility of the software plan.
    • Metrics identify trends.
    • Two analysis methods:
      • Correlation
        • Identifying strong relationships between reported metrics.
        • Look for inconsistencies within a group of related metrics.
      • Extrapolation
        • Identifies trends in the reported data.
        • Shows the potential impact on schedules (a minimal extrapolation sketch follows this slide).
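The extrapolation idea can be illustrated with a short sketch. The following minimal Python example (not from the original report; all figures are hypothetical) fits a straight line through the metric values reported so far and projects when a planned target would be reached.

```python
# Minimal extrapolation sketch: fit a line through reported values and
# project forward. Data values are hypothetical, for illustration only.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [1, 2, 3, 4, 5, 6]            # reporting periods to date
completed = [4, 9, 13, 18, 22, 27]     # cumulative work units completed
planned_total = 60                     # hypothetical planned total

slope, intercept = linear_fit(months, completed)
projected_finish = (planned_total - intercept) / slope
print(f"At the current rate, completion is projected around month {projected_finish:.1f}")
```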
  • 8. Correlation Example
  • 9. Extrapolation Example
  • 10. Extrapolation Example
  • 11. Extrapolation Example
  • 12. Extrapolation Example
  • 13. Extrapolation Example
  • 14. Software Size Metric
    • Purpose
      • Track magnitude changes in software development effort.
      • SLOC
    • Behavior - A lack of understanding and appreciation of the requirements can cause SLOC estimates to increase or decrease.
      • SLOC increases due to:
        • Better understanding of the requirements.
        • Better understanding of design implications and complexity.
        • An optimistic original estimate.
      • SLOC decreases due to:
        • An overestimate at the beginning of the program.
    • Data Inputs
      • Estimated new SLOC
      • Estimated Reused SLOC
      • Estimated modified SLOC
      • Estimated total SLOC
  • 15. Software Size Metric Cont.
    • Tailoring Ideas
      • Delete SLOC types that are not applicable.
      • Separate data reporting for each coding language used.
      • Require separate reporting for each processor and/or CSCI.
      • Report object code size.
    • Interpretation Notes
      • Total SLOC should not vary from the previous reporting period by more than 5%.
      • If SLOC varies by more than 5% from the previous reporting period (see the sketch after this slide):
        • The software developer provides a detailed explanation.
        • Related discussion regarding cost and schedule impacts.
      • Total SLOC does not relate linearly to effort.
      • New SLOC requires a different effort than reused or modified SLOC.
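As a simple illustration of the 5% interpretation rule above, the following Python sketch (hypothetical SLOC figures, not from the report) compares total SLOC between two reporting periods and flags a change larger than 5%.

```python
# Hypothetical SLOC estimates for two consecutive reporting periods.
previous = {"new": 42_000, "reused": 15_000, "modified": 8_000}
current = {"new": 47_500, "reused": 15_000, "modified": 9_000}

prev_total = sum(previous.values())
curr_total = sum(current.values())
change = (curr_total - prev_total) / prev_total

print(f"Total SLOC: {prev_total} -> {curr_total} ({change:+.1%})")
if abs(change) > 0.05:
    print("Variation exceeds 5%: ask the developer for a detailed explanation.")
```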
  • 16. Software Size Metric Example
  • 17. Software Personnel Metric
    • Purpose
      • Tracks staffing against the plan to maintain sufficient personnel to complete the task on schedule.
    • Behavior
      • A project with too few experienced personnel will experience difficulties.
      • Bringing on many personnel in later phases will also cause difficulties.
      • Normal shape of the total staffing profile:
        • Grows through the design phases.
        • Peaks through the coding and testing phases.
        • Gradually tapers off through the integration phases.
      • Normal shape of the experienced-staff profile:
        • High during the initial stages of the project.
        • Dips slightly during CSU development.
        • Grows during testing.
  • 18. Software Personnel Metric Cont.
    • Data Input
      • Initial
        • Planned total personnel level for each month of the contract.
        • Planned experienced personnel level for each month of the contract.
        • Expected attrition rate.
      • Each reporting period
        • Total personnel.
        • Experienced personnel.
        • Unplanned personnel losses.
    • Tailoring Ideas
      • Report staffing separately for each development task.
      • Report staffing separately for special development skills needed.
      • Report staffing separately for each development organization.
  • 19. Software Personnel Metric Cont.
    • Interpretation Notes
      • Understaffing results in schedule slippage.
      • Adding staff too late seldom improves the schedule and often causes more delays.
      • A high turnover rate among experienced personnel will cause schedule delays.
      • The initial staffing level should be at least 25% of the average staffing level (see the sketch after this slide).
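The 25% guideline above can be checked mechanically. This minimal Python sketch (the monthly staffing plan is hypothetical) compares the initial staffing level with the average over the plan.

```python
# Hypothetical planned staffing, in people per month of the contract.
planned_staff = [6, 8, 12, 16, 20, 22, 22, 20, 16, 12, 8, 6]

average = sum(planned_staff) / len(planned_staff)
initial = planned_staff[0]

print(f"Initial staffing: {initial}, average staffing: {average:.1f}")
if initial < 0.25 * average:
    print("Warning: initial staffing is below 25% of the average level.")
else:
    print("Initial staffing meets the 25%-of-average guideline.")
```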
  • 20. Software Personnel Metric Example
  • 21. Software Volatility Metric
    • Purpose
      • Tracks changes in requirements.
      • Tracks developers understanding of the requirements.
        • Software Action Items (SAI).
    • Behavior
      • Requirements change most during the requirements analysis and preliminary design phases.
      • Changes after CDR may have a significant impact on schedule.
      • SAIs are expected to rise at each review and then taper off exponentially.
        • A clear and complete specification produces a smaller rise at each review.
        • Good communication between developers and customers produces a faster rate of decay.
  • 22. Software Volatility Metric Cont.
    • Data Inputs
      • Current total number of requirements.
      • Cumulative number of requirements changes (addition, deletion and modification).
      • Number of new SAIs
      • Cumulative number of open SAIs
    • Tailoring Ideas
      • Track longevity of SAIs.
      • Track open SAIs by priority.
    • Interpretation Notes
      • Requirements volatility between CDR and TRR could cause significant schedule impact.
      • SAIs open for more than 60 days should be closely examined (see the sketch after this slide).
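A small sketch of the 60-day rule above, with hypothetical SAI records, might look like this in Python.

```python
from datetime import date

# Hypothetical open Software Action Items (SAIs) and the date each was opened.
open_sais = [
    {"id": "SAI-101", "opened": date(2006, 1, 10)},
    {"id": "SAI-115", "opened": date(2006, 3, 2)},
    {"id": "SAI-118", "opened": date(2006, 3, 20)},
]

today = date(2006, 3, 29)
for sai in open_sais:
    age = (today - sai["opened"]).days
    if age > 60:
        print(f"{sai['id']} has been open for {age} days -- examine closely.")
```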
  • 23. Software Volatility Metric Example
  • 24. Software Volatility Metric Example
  • 25. Computer Resource Utilization Metric
    • Purpose
      • Tracks changes in estimated and actual utilization of target machine resources.
        • CPU, memory, and I/O
    • Behavior
      • Most systems experience upward creep in resource utilization.
      • Large systems typically reserve 50% of each resource for growth.
      • Dependencies among resources result in parallel movement of resource utilization.
  • 26. Computer Resource Utilization Metric Cont.
    • Data Inputs
      • Initial
        • Planned spare for each resource.
      • Each reporting period
        • Estimated/actual percentage of CPU utilization.
        • Estimated/actual percentage of memory utilization.
        • Estimated/actual percentage of I/O channel utilization.
  • 27. Computer Resource Utilization Metric Cont.
    • Tailoring Ideas
      • Report combined utilization in a multi-resource architecture that uses a load-leveling operating system.
      • Report utilization separately in a multi-resource architecture that has dedicated functions.
      • Report average and worst-case utilization.
      • Report separately for development and target processors.
      • Consider memory addressing limit of the architecture when establishing utilization limits.
    • Interpretation Notes
      • Performance deteriorates rapidly when utilization exceeds 70 percent for real-time applications.
      • Plan for resource expansion.
      • Optimization made necessary by resource usage approaching its limit will increase cost and schedule (see the sketch after this slide).
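The two thresholds mentioned above (a 50% growth reserve and the 70% real-time guideline) can be checked with a short sketch. The utilization figures below are hypothetical.

```python
planned_spare = 50    # percent of each resource reserved for growth
realtime_limit = 70   # percent; real-time performance degrades rapidly beyond this

# Hypothetical reported utilization for the current period, in percent.
utilization = {"CPU": 64, "Memory": 55, "I/O": 38}

for resource, used in utilization.items():
    if used > realtime_limit:
        print(f"{resource}: {used}% exceeds the {realtime_limit}% real-time limit.")
    elif used > 100 - planned_spare:
        print(f"{resource}: {used}% is eating into the planned {planned_spare}% growth reserve.")
    else:
        print(f"{resource}: {used}% is within plan.")
```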
  • 28. Computer Resource Utilization Metric Example
  • 29. Schedule Progress Metric
    • Purpose
      • Tracks delivery of software work packages defined in the Work Breakdown Structure (WBS) against their scheduled delivery.
      • Estimated Schedule (months) = Program Schedule (months) / (BCWP / BCWS) (a worked sketch of this formula follows this slide)
        • BCWP - budgeted cost of work performed
        • BCWS - budgeted cost of work scheduled
    • Behavior
      • Tends to fall behind initially due to insufficient time allocated to the design process.
      • Likely to fall behind during testing due to inadequate test planning and testing at the CSU and CSC levels.
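A worked example of the estimated-schedule formula from this slide, using hypothetical BCWP/BCWS figures and a hypothetical 24-month program schedule:

```python
# Estimated Schedule (months) = Program Schedule (months) / (BCWP / BCWS)
program_schedule_months = 24
bcwp = 450_000   # budgeted cost of work performed to date
bcws = 500_000   # budgeted cost of work scheduled to date

schedule_performance = bcwp / bcws                   # 0.90 -> behind schedule
estimated_schedule = program_schedule_months / schedule_performance

print(f"Schedule performance (BCWP/BCWS): {schedule_performance:.2f}")
print(f"Estimated schedule: {estimated_schedule:.1f} months vs. {program_schedule_months} planned")
```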
  • 30. Schedule Progress Metric Cont.
    • Data Inputs
      • Initial
        • Number of months in program schedule.
      • Each reporting period
        • BCWP for software.
        • BCWS for software.
        • Number of months in program schedule if revised.
    • Tailoring Ideas
      • Track progress separately for each CSCI.
    • Interpretation Notes
      • Can be extrapolated to identify trends.
      • If the trend is up, it indicates a worsening condition.
      • If the trend is down, productivity is under control and improving.
  • 31. Schedule Progress Metric Example
  • 32. Imagix Metric Tool
  • 33. Slim-Metric
  • 34. Slim-Metric Cont.
  • 35. Essential Metric
  • 36. SEER SEM
  • 37. Summary
    • Metrics are a valuable management tool, allowing management to exercise control during each phase of the software development process.
    • Metrics provide control by giving different views of, and visibility into, each phase of the development process.
    • Metric analysis can identify trends that may impact cost and schedule. Early detection of trends allows for effective recovery planning.
  • 38. Conclusions
    • Metric tools are used by both project managers and software developers. Project managers are more interested in planning and progress metrics, while software developers mainly focus on software-specific metrics such as defects, cyclomatic complexity, and SLOC.
    • Metric Tools
      • Essential Metric - http://www.powersoftware.com/em/screenshot.html
      • Imagix - http://www.imagix.com/products/metrics.html
      • McCabeIQ - http://www.mccabe.com/iq_qa.htm
      • SEER-SEM - http://www.gaseer.com/tools_sem.html
      • Semantic Design - http://www.semdesigns.com/Products/Metrics/index.html?Home=SoftwareMetrics
      • Slim Metric - http://www.qsm.com/slim_metrics.html
      • (Site with various metrics tools) - http://measurement.fetcke.de/products.html
  • 39. Conclusions Cont.
    • At CMM Level 2, basic management control is in place and software costs, schedules, and functionality are tracked, so only a limited set of metrics is gathered. It is not until CMM Level 4, where processes are measured and quality is quantified, that a complete set of metrics is gathered and trends are identified.
  • 40. Acronyms
    • SRR - System Requirements Review
    • SDR - System Design Review
    • SSR - Software Specification Review
    • PDR - Preliminary Design Review
    • CDR - Critical Design Review
    • TRR - Test Readiness Review
    • http://sparc.airtime.co.uk/users/wysywig/semp39.htm (Descriptions for each review)