Metrics for Model-Based Systems Development
This presentation describes the value of metrics, key concepts for the effective use of metrics, and some common metrics for project management, model-based design, and quality assurance. Created by Dr. Bruce Powel Douglass, Ph.D.

Slide notes
  • The image below shows reports on the development team within a project dashboard. As work items are updated, the reports reflect the activity and trends of the team. Go to any of the project dashboards on jazz.net to see how we do it: http://jazz.net/jazz/web/projects/Jazz%20Collaborative%20ALM#action=jazz.viewPage&id=com.ibm.team.dashboard
  • The objective of cost per unit of work is to monitor the efficiency of effort/cost spent per unit of work delivered. The team can use this metric to monitor whether they are getting better at spending the budget. In a typical Agile project, measuring cost in terms of cost per unit of work may not be desirable; the team can instead use a release burndown or release burnup to control cost/effort spending during a release. Cost per unit of work for a given iteration = cost of the team for the iteration / velocity.
  • It is common to track defects by type/severity and by the lifecycle phase during which each defect was found. Defects are often recorded during the transition/release phase of delivery and once the solution has been deployed into production. Agile teams at scale may have an independent test team that reports defects back to the development team in parallel with their construction efforts. See http://www.ambysoft.com/essays/agileTesting.html#IndependentParallelTesting for more on independent testing.
  • Not all metrics are appropriate for all team members; a different set of metrics can be used at each management level. Executives and middle management are interested in measuring the capacity and capability of the team to deliver solutions against committed plans. Team members are interested in detailed measures that help them get their day-to-day job done. This table shows example metrics appropriate for each management level for steering project execution. In the rest of the presentation, we will focus on a set of metrics for middle management and executives.
  • Transcript

    • 1. ® IBM Software Group © 2013 IBM Corporation Innovation for a smarter planet Metrics: You can’t control what you don’t measure Dr. Bruce Powel Douglass, Ph.D. Chief Evangelist Global Technology Ambassador Bruce.Douglass@us.ibm.com Twitter: @BruceDouglass Yahoo: http://tech.groups.yahoo.com/group/RT-UML
    • 2. IBM Software Group | Rational software Innovation for a smarter planet Agenda  Introductions  On the importance of being metric  Types of metrics Design Metrics Quality Metrics Project Metrics  Q&A
    • 3. IBM Software Group | Rational software Innovation for a smarter planet Dr. Bruce Powel Douglass Chief Evangelist Global Technology Ambassador IBM Rational
    • 4. IBM Software Group | Rational software Innovation for a smarter planet On the Importance of Being Metric
      Companies that measure vs. companies that don’t measure:
      On-time projects: 75% vs. 45%
      Late projects: 20% vs. 40%
      Cancelled projects: 5% vs. 15%
      Defect removal: >95% vs. unknown
      Cost estimates: accurate vs. optimistic
      User satisfaction: high vs. low
      Software status: high vs. low
      Staff morale: high vs. low
      Source: Applied Software Measurement, 3rd Edition, by Capers Jones, 2008. Results improve when measurements are used.
    • 5. IBM Software Group | Rational software Innovation for a smarter planet On the Importance of Being Metric  A metric is a measurement of something important to you  A metric includes  A scalar value which is measured  Range  Units  A consistent technique for measurement acquisition  A standard analysis of outcome  Metrics should be  Easy to measure  Correlated with the actual information desired  Easy for decision makers to interpret appropriately  Example: BMI  BMI = weight in kilograms / (height in meters)²  What is the BMI for a body builder?  Example: LOC  Lines of code as a measure of work performed  If I optimized from 700K lines to 500K lines, have I done negative work?
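The BMI example above can be sketched directly; a minimal illustration (the body-builder figures are assumptions chosen for demonstration):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# A muscular 90 kg, 1.80 m body builder lands in the "overweight" band (25-30),
# showing how a metric can fail to correlate with the information actually desired.
print(round(bmi(90, 1.80), 1))  # 27.8
```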
    • 6. IBM Software Group | Rational software Innovation for a smarter planet On the Importance of Metrics  Metrics are used To understand an aspect of a project or system  A static metric measures a static property of a system (e.g. size, complexity, defect density)  A dynamic metric measures something that changes (e.g. velocity, iteration burndown, defect rate) To reduce risk To answer a question To enable informed decision making  Metrics are best used to change from open-loop (ballistic) decision making to closed-loop (dynamic, evidence-based) decision making
    • 7. IBM Software Group | Rational software Innovation for a smarter planet Goal-Based versus Plan-Based Metrics  A goal is a desired outcome  Plans try to achieve goals but may themselves be suboptimal or in error  Ergo, goal-based metrics are generally preferred to plan-based metrics  Plan-based metrics On planned schedule  Metric: number of hours worked on project  Metric: number of lines of code generated  Metric: weight lost  Goal-based metrics Progressing towards system delivery  Metric: requirements delivered/designed/implemented/verified/validated  Metric: amount of verified functionality delivered  Metric: body fat percentage
    • 8. IBM Software Group | Rational software Innovation for a smarter planet When are metrics useful? When there is a consensus on what to measure When all relevant measurements are made When the measurement is timely When the measurement correlates to the desired information When the measurement is precise and accurate enough When it allows you to make an informed correct decision When all performers know how to make the measurement When the measurement is properly analyzed When the analysis leads to proper action
    • 9. IBM Software Group | Rational software Innovation for a smarter planet Key Metric Success Factors Clear and easy performance of measurement Training for relevant personnel for measurement performance Support for metrics at all project levels Governance and enforcement of metric policies Clear assignment of roles and responsibilities Outcomes displayed in ways meaningful to consumers Retention of metrics for historical reference Consistent use of outcomes for decision making Early establishment of metrics
    • 10. IBM Software Group | Rational software Innovation for a smarter planet Breaking Bad (metrics) Lack of consensus and buy in Measurements inconsistently gathered Measurements inaccurate or lack consistent accuracy Measurements gathered too late Analysis inappropriate for metric Analysis not applied or is ignored Metric not retained or referenced Metric not highly correlated with desired information
    • 11. IBM Software Group | Rational software Innovation for a smarter planet Use dashboards to provide summary views
    • 12. IBM Software Group | Rational software Innovation for a smarter planet Best Metric Practices  Explicitly link metrics to goals  Define what will be done with the results  Train staff in metrics gathering, analysis, and interpretation (as appropriate)  Prefer trends over statics  Use short plan  act  measure  analyze  plan… cycles  Audit metric acquisition to ensure consistency of data source, methods, frequency, and analysis  Change metrics when they stop driving improvements  Automate metric acquisition and analysis where possible  Render the analytic results in a form useful to the consumers of the metric  Apply intelligence to interpretation, don’t just blindly accept the obvious conclusion  Use previously captured metrics as guidelines for future planning but don’t slavishly follow them
    • 13. IBM Software Group | Rational software Innovation for a smarter planet Types of metrics: Project, Quality, Design
    • 14. IBM Software Group | Rational software Innovation for a smarter planet Project Metrics  Project metrics measure project concerns Earned Value Schedule variance Velocity Iteration burndown Release burndown Requirements churn / enhancement request trend Age of enhancement (responsiveness metric) Cost per unit work Project Quality Design
    • 15. IBM Software Group | Rational software Innovation for a smarter planet Earned Value  Planned value (AKA Budgeted Cost of Work Scheduled BCWS)  Earned Value is how much value has been created so far:  Application Define the work as a set of mutually exclusive tasks Assign a Planned Value to each task (when complete) Define earning rules (0/100 simplest rule)  Schedule Variance  Schedule Performance Index (>1 is ahead of plan) Project Quality Design
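The earned-value quantities on this slide can be worked through numerically; a minimal sketch using the 0/100 earning rule the slide names, where the tasks, their planned values, and their completion status are all illustrative assumptions:

```python
# Earned-value sketch with the 0/100 earning rule: a task earns its full
# planned value only when complete. Assume all three tasks were scheduled
# to be finished by the measurement date.
tasks = [
    ("requirements",   40, True),    # (name, planned value, complete?)
    ("design",         60, True),
    ("implementation", 80, False),
]

planned_value = sum(pv for _, pv, _ in tasks)            # PV (BCWS) = 180
earned_value = sum(pv for _, pv, done in tasks if done)  # EV = 100

schedule_variance = earned_value - planned_value         # SV = -80: behind plan
spi = earned_value / planned_value                       # SPI < 1: behind plan
print(schedule_variance, round(spi, 2))  # -80 0.56
```

With the 0/100 rule the incomplete implementation task earns nothing, which is what makes SV negative here.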
    • 16. IBM Software Group | Rational software Innovation for a smarter planet 16 Velocity  A team’s velocity is the amount of functionality it completed in the previous iteration If something isn’t done, its points aren’t counted toward the team’s velocity for that iteration. Therefore a 5-point story that is 80% done counts as 0 points, not 4, for that iteration.  Velocity is typically measured in a point system unique to the individual team You cannot compare teams using points because Team A measures using a different point system than Team B. Points are typically called story points or user story points by teams that have adopted a user-story-driven approach to development Project Quality Design
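The all-or-nothing counting rule above can be sketched in a few lines; the stories and point values are illustrative assumptions:

```python
# Velocity sketch: only fully completed stories count toward velocity.
# The 80%-done 5-point story contributes zero, exactly as described above.
stories = [("login", 5, True), ("search", 3, True), ("reporting", 5, False)]

velocity = sum(points for _, points, done in stories if done)
print(velocity)  # 8
```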
    • 17. IBM Software Group | Rational software Innovation for a smarter planet 17 Iteration burndown  Shows amount of work taken on by the team for a single iteration and how much work is left to do  Usage:  Enables the team to identify where to adjust scope or resources to finish the iteration successfully  Provides delivery progress for the iteration Project Quality Design
    • 18. IBM Software Group | Rational software Innovation for a smarter planet 18 Release burndown  Shows the estimated functionality remaining to complete the current release  Usage: Track actual progress Estimate release date based on remaining work/velocity Project Quality Design
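The release-date estimate mentioned above follows from remaining work divided by observed velocity; a minimal sketch in which every number is an illustrative assumption:

```python
import math

# Release-date estimate from the release burndown: how many iterations of
# work remain at the team's recent velocity?
remaining_points = 120   # estimated functionality left in the release
velocity = 18            # points completed per iteration, from recent history
iteration_weeks = 2      # calendar length of one iteration

iterations_left = math.ceil(remaining_points / velocity)
weeks_left = iterations_left * iteration_weeks
print(iterations_left, "iterations,", weeks_left, "weeks")  # 7 iterations, 14 weeks
```

Rounding up with `ceil` reflects that a partially filled final iteration still occupies a full iteration on the calendar.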
    • 19. IBM Software Group | Rational software Innovation for a smarter planet 19 Enhancement Request Trend  Many agile teams work on a new release of a solution which is already deployed into production  Shows the trend of enhancement requests received, approved, and closed during the project lifecycle Usage: Few enhancement requests may indicate lack of interest in the current production release OR may indicate satisfaction in the current release A high number of enhancement requests can indicate that the current production release is not functioning as stakeholders expect A growing backlog of enhancement requests may indicate an inability of the team to respond to changing requirements Project Quality Design
    • 20. IBM Software Group | Rational software Innovation for a smarter planet 20 Age of enhancement requests  Tracks the length of time stakeholder enhancement requests remain open  Usage: Indicates responsiveness of delivery team Unaddressed requests can impact the stakeholders' perception of value Project Quality Design
    • 21. IBM Software Group | Rational software Innovation for a smarter planet 21 Cost per unit of work  Tracks the cost of delivering a single unit of work (such as a user story point or use case point) across iterations.  Usage: Used to monitor costs throughout the project lifecycle based on the cost of the team and their velocity. Monitoring this metric in each iteration helps the team understand if their spending is sustainable. Project Quality Design
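The cost-per-unit-of-work calculation described above (and defined in the slide notes) is a single division; the cost and velocity figures below are illustrative assumptions:

```python
# Cost per unit of work for an iteration: cost of the team for the
# iteration divided by the team's velocity for that iteration.
def cost_per_unit_of_work(iteration_cost: float, velocity: float) -> float:
    return iteration_cost / velocity

# Illustrative numbers: a 30,000 (currency units) iteration delivering 20 points.
print(cost_per_unit_of_work(30_000, 20))  # 1500.0 per story point
```

Tracking this value per iteration, rather than once per release, is what lets the team see whether their spending trend is sustainable.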
    • 22. IBM Software Group | Rational software Innovation for a smarter planet Design Metrics  Organizational Complexity  Requirements Complexity  Model Architectural Complexity  Model Semantic Complexity  Model Design Complexity Project Quality Design These metrics are instrumented in a Rhapsody wizard that is available at Merlin’s Cave: http://merlinscave.info/Merlins_Cave/Wizards/Entries/2010/1/8_Model_Metrics.html
    • 23. IBM Software Group | Rational software Innovation for a smarter planet Design Metrics: Organizational Project Quality Design
      NP (Number of Packages): Identifies the number of “work” or “configuration” units in the model
      DPC (Depth of Package Containment): A measure of the depth of the model organization
      EP (Number of Elements in Package): The number of classes and other elements (such as use cases and types) in a specified package; a measure of the size and granularity of the package
      AEP (Average Number of Elements per Package): A measure of the overall granularity of the model organization
      MEPP (Maximum Number of Elements per Package): A measure of the maximum complexity of model organizational elements
      PU (Package Utility): The number of developers with read or usage access to a package divided by the number of developers who write elements of the package
    • 24. IBM Software Group | Rational software Innovation for a smarter planet Requirements Complexity Project Quality Design
      NUC (Number of Use Cases): A measure of the number of independent capabilities of the system
      FP/UCP (Function Points / Use Case Points): An estimate of the complexity of the problem to be solved; maps well to the NUC metric
      NA (Number of Actors): The number of actors associated with a given use case
      NUCA (Number of Use Cases per Actor): Given an actor, the number of use cases associated with it
      NUCSD (Number of Use Case Sequence Diagrams): The total number of black-box sequence diagrams used as exemplars for use cases
      AUCSD (Average Number of Use Case Sequence Diagrams): NUCSD / NUC; a measure of the average scope of a use case
      NUCS (Number of Use Case States): Total number of states + activities used to specify the use cases
      UCDC (Use Case Decomposition Complexity): The number of use cases derived from a single use case, including generalization and dependencies (both «includes» and «extends» relations)
      IIC (Information Item Count): Total number of information items in the use case model
      IICUC (Information Item Count per Use Case): IIC / NUC
    • 25. IBM Software Group | Rational software Innovation for a smarter planet Model Architecture Complexity Project Quality Design
      NS (Number of Subsystems): The number of large-scale architectural units of a system
      NT (Number of Tasks): Number of «active» objects in a system
      NAS (Number of Address Spaces): A measure of the scope of the distribution of a model across address spaces or computers
      CASC (Cross-Address Space Coupling): A measure of cohesion within address spaces versus cohesion across address spaces
      RAS (Redundant Architecture Scope): Number of redundant architectural units used in the safety and reliability architecture (either homogeneous or heterogeneous)
      NP (Number of Processors): Number of processor nodes in the system; may (or may not) be identical to the NAS metric
      NCP (Number of Components per Processor): Measures the cohesion of functionality within a processor node, assuming that a component provides a coherent set of functionality
      NUCS (Number of Use Cases per Subsystem): For systems that decompose system use cases into subsystem-level use cases
    • 26. IBM Software Group | Rational software Innovation for a smarter planet Model Semantic Complexity Project Quality Design
      NoC (Number of Classes): A measure of model size
      CC (Class Coupling): Measures the coupling of a class by counting the number of associations it has with its peers
      TSC (Total Number of Subclasses): Measures the global use of generalization within a model
      CID (Class Inheritance Depth): The maximum length of a given class generalization taxonomy
      NC (Number of Children): Number of direct descendants (subclasses); a measure of class reuse
      NM (Number of Methods): Number of methods within a class
      NP (Number of Ports): Number of uniquely identifiable connection points of a class
      EF (Encapsulation Factor): Number of publicly visible class features (attributes, methods, and event receptions) divided by the total number of such features; a measure of the degree to which the internal structure of a class is encapsulated
      SF (Specialization Factor): The number of operations and statechart action sets that are specialized in subclasses
    • 27. IBM Software Group | Rational software Innovation for a smarter planet Behavioral Semantics Complexity Project Quality Design
      SDS (Sequence Diagram Size): Number of messages × number of lifelines
      DCC (Douglass Cyclomatic Complexity): Modified McCabe cyclomatic complexity that accounts for nesting and and-states
      WMC (Weighted Methods per Class): A measure of (non-reactive) class complexity: the sum of the complexities of all methods; for classes without activity diagrams, method complexity can be estimated by lines of code in the method
      ND (Nesting Depth): State and activity nesting depth; the number of levels of nesting
      NE (Nesting Encapsulation): Number of transitions (other than default) that cross levels of nesting
      NAS (Number of And-States): Total number of and-states within a statechart
      SCN (Statechart Completeness): Number of events received by a state machine / number of transitions
      NPS (Number of Pseudostates): Number of non-default pseudostates such as history, conditional, fork, join, junction, and stubs
    • 28. IBM Software Group | Rational software Innovation for a smarter planet Quality Metrics  Quality metrics focus on Compliance to plans Deviation from expected functionality and correctness  Compliance to standards AKA “Syntactic correctness” Audit / Review Performance Percentage Audit / Review Pass Percentage  Correctness AKA “Semantic correctness” Defect Density Defect Trend Requirements Coverage Trace Completeness Project Quality Design
    • 29. IBM Software Group | Rational software Innovation for a smarter planet 29 Defect Density  Tracks the number of defects found, fixed, and still remaining during a given period of time per thousand source lines of code (KSLOC), per model, or design unit  Usage: Indication of the quality of the product Indication of the effectiveness of testing efforts Project Quality Design
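The per-KSLOC normalization described above is a single division; a minimal sketch with illustrative defect and size counts:

```python
# Defect density per thousand source lines of code (KSLOC); the same idea
# applies per model or per design unit by swapping the denominator.
def defect_density_per_ksloc(defects: int, sloc: int) -> float:
    return defects / (sloc / 1000)

print(defect_density_per_ksloc(45, 30_000))  # 1.5 defects per KSLOC
```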
    • 30. IBM Software Group | Rational software Innovation for a smarter planet 30 Defect Trend Chart  Shows defect arrival and closure rates, indicates the remaining defect backlog, projects the future defect arrival/close rate up to and post-ship  Usage: – Indication of the quality of the product – Indication of the effectiveness of testing efforts Project Quality Design
    • 31. IBM Software Group | Rational software Innovation for a smarter planet 31 Test coverage of requirements  Indicates the percentage of requirements linked to validating tests  For agile teams, this is often the percentage of user stories which have one or more acceptance tests associated with them  Usage: When the coverage isn’t 100% it indicates that the solution isn’t fully tested When the coverage is 100% we need other metrics to determine sufficiency of testing Project Quality Design
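The coverage percentage described above can be sketched as a count over a requirements-to-tests mapping; the mapping itself is an illustrative assumption:

```python
# Test coverage of requirements: the percentage of requirements (or user
# stories) with at least one linked acceptance test.
tests_for = {"R1": ["T1", "T2"], "R2": [], "R3": ["T3"], "R4": []}

covered = sum(1 for linked in tests_for.values() if linked)
coverage_pct = 100.0 * covered / len(tests_for)
print(coverage_pct)  # 50.0, so the solution is not fully tested
```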
    • 32. IBM Software Group | Rational software Innovation for a smarter planet 32 Requirements-based Code Coverage  Indicates the completeness of the coverage of the code versus the requirements  This metric is required for evidence by some safety standards (e.g. DO-178)  Usage: When the coverage isn’t 100% it indicates that the solution isn’t fully tested or that there is code for which there are no requirements Project Quality Design
    • 33. IBM Software Group | Rational software Innovation for a smarter planet Trace Completeness  Indicates the degree of consistency among engineering work products  Typically traces among elements from requirements, architecture, design, code, test, and safety analysis (required by some safety standards, e.g. DO-178, ISO 26262)  Usage When it is important to ensure consistency of the design due to high consequences of failure Project Quality Design [Slide figure: a trace matrix of requirements R1..R4 against design/implementation elements D1..D5; a design element traced to no requirement suggests gold plating, while a requirement with no traces is unimplemented]
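The two trace-matrix checks on this slide can be sketched as set operations; the requirement names, element names, and trace pairs below are illustrative assumptions, not the slide's actual data:

```python
# Trace-completeness sketch: flag requirements with no trace to a design or
# implementation element (unimplemented) and elements tracing back to no
# requirement (possible gold plating).
requirements = ["R1", "R2", "R3", "R4"]
elements = ["D1", "D2", "D3", "D4", "D5"]
traces = {("R1", "D1"), ("R1", "D3"), ("R3", "D2"), ("R4", "D4")}

traced_reqs = {r for r, _ in traces}
traced_elems = {d for _, d in traces}
unimplemented = [r for r in requirements if r not in traced_reqs]
gold_plating = [d for d in elements if d not in traced_elems]
print(unimplemented, gold_plating)  # ['R2'] ['D5']
```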
    • 34. IBM Software Group | Rational software Innovation for a smarter planet From In-Process (Team) to Executive Value: Appropriate Metrics for Each Management Level
      Dimension | Team (In Process) | Middle Management (Development Mgmt.) | Development Executive (VP Development)
      Time-to-Delivery / Schedule | User Story Points / Use Case Points | Iteration Burndown, Blocking Work Items | Release Burndown
      Product Value | Iteration Velocity | Stakeholder Feedback, # of Enhancement Requests, Age of Enhancement Requests | Tested and Delivered Requirements, Business Value Velocity, Customer Satisfaction
      Product Cost / Expense | Effort (Man-hours) | Cost / Unit of Work | Development / Maintenance Costs
      Product Quality | Technical Debt (Defect Trend, Defect Density) | Test Status, Test Coverage of Requirements, Test Execution Status | Quality at Ship
      Predictability | User Story Points / Use Case Points | Planned/Actual Cost and Velocity Trend Variance | Likelihood of On-Time Delivery
      Note: Bold indicates an out-of-the-box report supported by Rational tools
    • 35. IBM Software Group | Rational software Innovation for a smarter planet Adding Metrics into your process: Define your metrics Concerns • Identify your concerns Goals • Specify your goals Select • Define your metrics Implement • Define when and how metrics will be gathered Specify analytics • Define how metrics outcomes will be used
    • 36. IBM Software Group | Rational software Innovation for a smarter planet Adding Metrics into your process: Update your process Gathering • Identify how the metric data will be captured Instrumenting • Tool up for metrics Process update • Add into your process definition Train • Train workers to properly capture and/or use metric data Deploy • Instrument your project
    • 37. IBM Software Group | Rational software Innovation for a smarter planet Summary: Metric Guidance  Clearly define goals and explicitly link metrics to them  Prefer goal-based over plan-based metrics  Obtain buy in and consensus  Train and allocate resources  Use short enough cycles that metrics can positively affect outcomes  Collect only a few key measurements  Automate where possible  Metrics that do not result in changed actions are worse than useless  Understand Metrics are not the goal Most issues require more than a single metric Metrics augment but do not replace intelligent judgment
    • 38. IBM Software Group | Rational software Innovation for a smarter planet Final Thoughts
