ESTABLISHING REQUIREMENTS METRICS

Transcript

  • 1. ESTABLISHING REQUIREMENTS METRICS. Campbell Ferenbach, Global Business Analysis Capability. OCTOBER 2008 / EDS INTERNAL
  • 2. Objective. This presentation outlines: 1) why we aren't all collecting metrics for Business Analysis, 2) what to count and how to count it, and 3) implementation: the challenges ahead. The aim is to share information and gain new ideas.
  • 3. Agenda
      – Why do we need Metrics?
      – Requirements for a Metrics Strategy
      – Approach
      – Current State
      – What to measure?
      – Defining the BA Measurements
      – Sample Reports
      – Recap
      – Questions
  • 4. Why do we need Metrics? Performance measures are needed to:
      – Enable analysis of Business Analysis productivity and quality, to drive productivity savings
      – Determine the effectiveness of, and justify investments in, training, process and tool improvements
      – Provide a foundation for estimating
      – Demonstrate professionalism to clients and the BA industry
      – Enable competitive benchmarking
  • 5. Cost of Requirements Defects
      – "Deficient requirements are the single biggest cause of software project failure. From studying several hundred organizations, Capers Jones discovered that RE is deficient in more than 75 percent of all enterprises. In other words, getting requirements right might be the single most important and difficult part of a software project. Despite its importance, we know surprisingly little about the actual process of specifying software." (Hubert F. Hofmann and Franz Lehner, "Requirements Engineering as a Success Factor in Software Projects", IEEE Software, July-Aug 2001)
      – "A requirements defect that is left undetected until construction or maintenance will cost 50 to 200 times as much as it would have cost to fix at requirements time." (Steve McConnell, Taming Wild Software Schedules)
      – "Of the IT projects that are initiated, 5% to 15% will be abandoned before or shortly after delivery as hopelessly inadequate. Many others will arrive late and over budget or require massive reworking. Few IT projects, in other words, truly succeed… The most common reasons for software failures include: 1. unrealistic or unarticulated project goals, 2. inaccurate estimates of needed resources, 3. badly defined system requirements…" (Robert N. Charette, "Why Software Fails", IEEE Spectrum, Sept 2005)
      – "Analysts report that as many as 71% of software projects that fail do so because of poor requirements management, making it the single biggest reason for project failure." (Christopher Lindquist, "Required: Fixing the Requirements Mess", CIO, Nov 15, 2005)
      – "Use of experienced analysts versus inexperienced analysts can reduce project cost by 33%. Use of experienced and capable analysts can reduce project cost by 50%." (Karl Wiegers, Software Requirements)
    [Chart: Defect Correction Cost, rising from roughly $75 in the Requirements phase, through Design, Code, Integration Test, System Test and Client Acceptance Test, to about $10,000 in Production. The graph assumes a cost of $75.00 to fix a defect in the phase in which it was inserted.]
  • 6. Defining Performance Measures: Capability
      – Performance metrics at the project level are no longer adequate: projects are increasingly executed by "virtual teams" whose members are typically geographically dispersed, and some projects are staffed by "Centers of Expertise", where staff specialize in specific technologies and functions in the delivery lifecycle.
      – Capability performance provides a finer level of detail than project metrics. Development has SLOC / Staff Month or Function Points / Staff Month; Testing has Test Cases executed / Staff Month and pre- & post-release defects.
      – Performance metrics need to be developed for the different types of capabilities. To improve the performance of a capability we need to understand its current baseline, and measurements must address both the quality and the quantity of the work products the capability creates.
  • 7. Requirements for a BA Metrics Strategy. Key requirements:
      – Enable measurement of Business Analysis performance
      – Ensure measurements cover both the "quality" and the "quantity" of the work products
      – Use data collected by standard EDS/HP tools
      – Enable meaningful comparison across different project profiles (methodologies, target technologies, etc.)
      – Integrate with and complement existing metrics programmes
  • 8. Approach
      – Scope the problem: research existing productivity metrics internally and externally; establish requirements for the Business Analysis productivity baseline
      – Define the productivity model: define the "Business Analysis" metrics; define the productivity model and data requirements
      – Collect baseline data: develop the metrics collection tools and process; collect initial data
      – Analyse data and calibrate the model: analyse the data, calibrate the model, refine the assumptions and the model
      – Document baselines: formally document the baselines from which improvements will be measured; separate by project type, scale, methodology, more?
  • 9. Industry Information. What does industry information suggest?
      – The industry in general measures project and product productivity and quality. Very little data is available on Business Analysis performance, yet there is widespread recognition that poor-quality requirements are the reason for many project failures.
      – The industry is starting to look at quality criteria for requirement statements, e.g. analyzing word and grammar usage for clarity and ambiguity.
      – Various tools are available to enhance the quality of requirement statements. Among them:
        » NASA's Software Assurance Technology Center uses a method of measuring the quality of a requirements document by looking at completeness, consistency, correctness, modifiability, ranking, traceability, non-ambiguity and verifiability, and normalizing these metrics against size (lines of text). (http://satc.gsfc.nasa.gov/support/SEW_DEC96/sel.html)
        » Some requirements management tools highlight questionable text and propose alternative text; the list of questionable terms is maintained by the tool administrator. Such a tool does not provide an overall score for a document, but it does help the user improve the document's quality.
        » Some tools identify certain types of incomplete requirements, such as missing alternatives following an 'If' statement.
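The administrator-maintained "questionable terms" approach described above can be sketched in a few lines. The term list and function name here are illustrative assumptions, not any specific tool's vocabulary or API:

```python
import re

# Illustrative list of weak or ambiguous terms; a real tool's list would be
# maintained by the tool administrator, not hard-coded.
WEAK_TERMS = ["as appropriate", "etc", "fast", "user-friendly",
              "flexible", "adequate", "if possible", "tbd"]

def flag_ambiguous(requirement: str) -> list[str]:
    """Return the questionable terms found in a requirement statement."""
    text = requirement.lower()
    return [t for t in WEAK_TERMS
            if re.search(r"\b" + re.escape(t) + r"\b", text)]

print(flag_ambiguous("The system shall respond fast and be user-friendly."))
# → ['fast', 'user-friendly']
```

As the slide notes, flagging terms does not score the document overall, but it gives the author concrete text to tighten.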
  • 10. Goal-Question-Metric (GQM)
      – Decide on your goals. What do you want to fix/improve/optimize? Improve BA productivity! Improve the quality of requirements!
      – Ask your questions. What do you need to know to meet the goals? Did we get more or less productive after a change? Did we make more or fewer mistakes after a change?
      – Identify your metrics. What do you need to measure to answer the questions? Let's look at those…
  • 11. What can we possibly count? What phenomena do we have to measure?
      – Requirements produced and effort, i.e. requirements written per unit of time. How to count? Not all requirements are equal… What to include in effort: just BAs? Reviewers? Draft only? Rework? Other changes?
      – We looked at our process and requirements:
        » Scope? No: it is part of contracting, highly variable, and the BA portion is not very quantifiable.
        » High-Level Requirements? Yes: BA-specific work that produces approved, countable requirements.
        » Detailed Requirements? Yes: BA-specific work that produces approved, countable requirements.
        » Tracking BA effort only? No: we need to include reviewers and other contributors, but not the client's time.
  • 12. A Consistent, Quantitative Measure
      – A requirements management tool can tell us how many requirements are in the current project. But simply counting all requirements is too simplistic: we only want the requirements produced for the current release/milestone, plus those changed or deleted in the current release.
      – And not all requirements are equal. One use case involves much more effort than a single textual requirement, and we wouldn't want to compare them as equals. What about the requirements embedded in models such as class (or data) models and business process models?
      – So: count text-based requirements and "measure" the size and complexity of use cases. That means either a lot of counting of lines in MS Word-based use cases, or coming up with an intelligent counting tool. We need a streamlined, low-effort way of estimating key elements that gives a reliable indication of scale and complexity (later).
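A rough sketch of the weighted-counting idea follows. The complexity classes and weights are illustrative assumptions; the deck deliberately leaves the actual weighting scheme open:

```python
from dataclasses import dataclass

# Assumed complexity weights; the presentation does not fix these values.
UC_WEIGHTS = {"simple": 1, "average": 3, "complex": 5}

@dataclass
class UseCase:
    name: str
    complexity: str  # "simple" | "average" | "complex"

def requirement_count(text_reqs: list[str],
                      use_cases: list[UseCase]) -> int:
    """Each textual requirement counts as 1; use cases are weighted
    by complexity so they are not compared to text requirements as equals."""
    return len(text_reqs) + sum(UC_WEIGHTS[uc.complexity] for uc in use_cases)

total = requirement_count(
    ["REQ-1", "REQ-2"],
    [UseCase("Login", "simple"), UseCase("Checkout", "complex")],
)
print(total)  # → 8  (2 text requirements + 1 + 5)
```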
  • 13. Quantity: What are we going to count?
      – Number of High Level Requirements, including business processes, high-level requirement statements and business policies
      – Number of Detailed Requirement Statements, including functional requirements and use cases, non-functional requirements, business rules and the class model
      – Reworked and Changed Requirements: the number of requirements changed due to defects, and the number changed due to external influences
      – Function Points: the function point count when Detailed Requirements are approved, and the count at the end of the project
      – Effort: the effort to produce High Level and Detailed Requirements, and the effort to rework (due to defects) or change (due to CRs) previously approved requirements
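A minimal sketch of tallying these countable items from a requirements-tool export. The category tag names are hypothetical, not those of any specific tool:

```python
from collections import Counter

# Assumed category tags for each countable item; real repositories will
# use their own taxonomy.
HIGH_LEVEL = {"business process", "high level requirement", "business policy"}
DETAILED = {"functional requirement", "use case", "non-functional requirement",
            "business rule", "class model"}

def tally(items: list[tuple[str, str]]) -> Counter:
    """items: (id, category) pairs exported from a requirements tool.
    Returns counts of high-level vs. detailed requirements."""
    counts: Counter = Counter()
    for _id, category in items:
        if category in HIGH_LEVEL:
            counts["high_level"] += 1
        elif category in DETAILED:
            counts["detailed"] += 1
    return counts

repo = [("R1", "business policy"), ("R2", "use case"), ("R3", "business rule")]
print(tally(repo))  # → Counter({'detailed': 2, 'high_level': 1})
```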
  • 14. Quality: What to Measure? High-quality requirements are:
      – Correct: requirements as stated are 100% correct
      – Complete: requirements as stated have no missing elements; all requirements are captured; requirements have sufficient detail to be the foundation of the design, coding and test phases
      – Consistent: requirements as stated do not contradict other requirements
      – Unambiguous: requirements as stated are not open to interpretation
      – Verifiable: requirements as stated can be fully tested (through both the normal and the abnormal paths)
      – Ranked for business priority: requirements have a defined business priority (e.g. a numeric priority from 1, highest, to 5, lowest)
      – Strategic: requirements are aligned with the client's strategic vision
    We will measure the quality of the approved version of the requirements.
  • 15. Performance Metrics for the BA Capability. To measure Business Analysis in terms of the quantity and quality criteria, three BA performance metrics have been identified:
      – Productivity (quantity): answers "How many requirements were produced, for a given amount of effort?"
      – Rework (quality): answers "Did we produce high-quality requirements?"
      – Volatility (stuff happens): the percentage of requirement statements and use cases modified, added or deleted due to new client requests; answers "How much churn occurred after the requirements were approved?"
  • 16. Defining Productivity. Productivity = Outputs / Inputs. The outputs: the number of Requirement Statements and weighted Use Cases. The inputs: staff months of effort.
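The Productivity definition reduces to a one-line calculation; the sample figures below are made up purely for illustration:

```python
def productivity(requirement_statements: int, weighted_use_cases: float,
                 staff_months: float) -> float:
    """Productivity = Outputs / Inputs: requirements produced
    (statements plus weighted use cases) per staff month of effort."""
    return (requirement_statements + weighted_use_cases) / staff_months

# Example: 120 requirement statements and 30 weighted use-case points
# produced in 6 staff months of effort.
print(productivity(120, 30, 6))  # → 25.0 requirements per staff month
```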
  • 17. Summary: Business Analysis Performance Metrics. [Diagram: a requirements timeline running from client elicitation through analysis & documentation to client sign-off. The Productivity metric measures requirements per staff month against the baseline. The Rework metric measures the percentage of requirements reworked due to defects ("Oops, that's not right!") between the approved and corrected baselines. The Volatility metric measures the percentage of requirements changed or added through client scope-change requests ("Client forgot the CEO's requirements and asked: can you still add those?"), producing a new baseline.]
  • 18. Performance Metrics Definitions
      – Productivity (two measurement points):
        » Initial Requirements Productivity: number of Requirement Statements and Use Cases / Requirements Effort in staff months. Requirements Effort includes all effort expended to achieve customer approval of the High Level and Detailed Requirements.
        » Overall Project Productivity: number of Requirement Statements and Use Cases / Business Analysis Effort in staff months. Business Analysis Effort includes all Requirements Effort during the High Level and Detailed Requirements phases, plus Rework Effort during the delivery phases.
      – Rework: percentage of reworked requirements out of the total number of Requirement Statements and Use Cases.
      – Volatility: percentage of changed requirements out of the total number of Requirement Statements and Use Cases.
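The Rework and Volatility definitions can be sketched the same way; the counts below are made-up examples:

```python
def rework_pct(reworked: int, total_reqs: int) -> float:
    """% of requirement statements and use cases reworked due to defects."""
    return 100.0 * reworked / total_reqs

def volatility_pct(changed: int, total_reqs: int) -> float:
    """% of requirement statements and use cases added, changed or deleted
    after approval, due to new client requests."""
    return 100.0 * changed / total_reqs

total = 150  # requirement statements + weighted use cases (example figure)
print(rework_pct(9, total))       # → 6.0
print(volatility_pct(12, total))  # → 8.0
```

Keeping the two percentages separate preserves the distinction the deck draws: rework counts against requirements quality, while volatility is churn the BA team does not control.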
  • 19. Metrics Terminology. Each metric requires a clear definition with explicit "counting" rules to ensure the resulting data is comparable across teams and across time.
      – Requirements Effort: the total effort expended to elicit, document, analyse and review the High Level and Detailed Requirement Statements and Use Cases UNTIL the customer's formal acceptance of the Detailed Requirements. This includes the effort expended by all resources involved, excluding effort from resources associated with the client.
      – Rework Effort: the total effort expended to modify the Requirement Statements and Use Cases due to Change Requests associated with requirement defects.
      – Volatility Effort: the total effort expended to modify the Requirement Statements and Use Cases due to Change Requests associated with client requests for new/changed functionality.
      – Business Analysis Effort: the total Requirements Effort and Rework Effort to produce and maintain the approved Requirement Statements and Use Cases over the duration of a given project lifecycle (e.g., a release).
      – Staff Month: the standard number of hours of effort available per individual per month.
      – Requirement Statements: the count of all textual Detailed Requirement statements (non-use cases).
      – Use Cases: the weighted count of all Detailed Use Cases (non-textual Requirement Statements).
      – Reworked Requirements: the count of Requirement Statements and Use Cases added, changed or deleted due to requirement defects.
      – Changed Requirements: the count of Requirement Statements and Use Cases added, changed or deleted due to new client requests, AFTER Detailed Requirements approval.
  • 20. Issues & Lessons Learned
      1. Metric measurements only make sense in managed-work environments. Metrics are of little to no use in staff-augmentation environments: without consistency in processes and methods, we can't isolate what causes a change in a metric from one release to the next.
      2. Tools need to be defined and configured so that metrics are produced automatically, with little to no additional effort.
      3. Metrics need to be generated by the standard toolset, so they can easily be collected in the corporate metrics repository.
      4. Very few industry and EDS metrics exist at the BA capability level. We just don't have clear baselines; we have to get started and accumulate our own.
  • 21. Sample Reports: Project Metrics Report; Project Summary Report.
  • 22. Summary: Business Analysis Performance Metrics. [Repeat of the summary diagram from slide 17: the Productivity, Rework and Volatility metrics plotted along the requirements timeline.]
  • 23. Recap. This presentation set out to cover: 1) why we aren't all collecting metrics for Business Analysis, 2) what to count and how to count it, and 3) implementation: the challenges ahead. The aim is to share information and gain new ideas.
  • 24. Business Analysis Performance Metrics. [Repeat of the summary diagram from slide 17.]
  • 25. EDS, 5400 Legacy Drive, Plano, TX 75024, +1 972 605 6000. campbell.ferenbach@eds.com or eds.com. EDS and the EDS logo are registered trademarks of Hewlett-Packard Development Company, LP. HP is an equal opportunity employer and values the diversity of its people. ©2008 Hewlett-Packard Development Company, LP.
