Use the Windshield, Not the Mirror: Predictive Metrics that Drive Successful Product Releases

Sharon Niemi, Practice Director of SQA, talks about how the right combination of predictive and reactive metrics can help you build a measurement portfolio that improves product quality and release consistency. You’ll learn how to build a measurement system that incorporates leading and lagging indicators to improve your team’s consistency in delivering quality products on time and within budget.


  1. Software Quality Associates. Use the Windshield, Not the Mirror: Predictive Metrics that Drive Successful Product Releases. Presented by: Sharon Niemi, Practice Director, Lifecycle Optimization.
  2. Agenda
     - My "measurement aha! moment"
     - Measures Today; The Missing Links
     - Why Measure; The ROI
     - What to Measure; The Portfolio of Measures
     - How to Measure; The Measurement System
     - Pulling it Together; A Case Study
     - Seapine Tools Demonstration
  3. Measures Today. Our benchmark studies and other sources of data have revealed:
     - Fewer than 50% of projects are delivered successfully
     - 40% of a project team's effort is wasted on unproductive rework
     - 70% of defects uncovered in production are requirements related
     - 90% of our clients most often measure and use schedule, budget, and defects (found in Systems Test)
     So what's missing? A Portfolio of Measures and a Measurement System.
  4. The Basics – Why Measure? "You can't manage what you don't measure." Use measures to effectively and proactively manage. Key elements: measure, metric, measurement technique, baseline, benchmark, and action. Measures are a means ~ not an end!
  5. The Portfolio of Measures: Outcome Measures – the rear-view mirror. Reactive; they answer "How well did you execute?" (performance): time, cost, quality, and customer satisfaction.
  6. The Portfolio of Measures: Predictive Measures – The Missing Set. These are the windshield: resources (capability and capacity), tools, and internal process activities (methodology agnostic), gauged for efficiency and effectiveness. Examples include process compliance, requirements stability, change request backlog, velocity, and trends.
  7. Creating the Line of Sight. For the Portfolio of Measures to provide value, it needs to be holistic: predictive (proactive) measures of resources, tools, and internal process activities (process compliance, requirements stability, change request backlog, velocity, trends) are the causes of the outcomes (reactive), with learning and feedback closing the loop. The question becomes: how much do you want to improve?
  8. The Measurement System: How to Turn the Portfolio of Measures into Action! A measurement-based technique is applied to processes, tools, and capabilities to supply fact-based data for decisions and to improve. Candidate techniques: Balanced Scorecard, Goal–Question–Metric, Six Sigma's DMAIC, ISO / CMMI / ITIL / GxP. The Portfolio of Measures must be owned, defined, visible, used, and improved.
  9. The Four Main Points. Remember:
     - Develop a Portfolio of Measures ~ balanced and integrated.
     - Tie measures to key business drivers and goals ~ make them meaningful and relevant.
     - Implement a Measurement System that drives action ~ openly communicate progress, gaps, and action plans.
     - Continue to update the Portfolio of Measures as goals are attained and new goals are identified!
  10. Putting it into Practice - A Case Study. Copyright © 2013 SQA. All Rights Reserved.
  11. Step One – Identify Goals and Perceived Issues
      - Three organizational goals were established: Avoid Client Impact, Consistency in Delivery, and Strive to be Best in Class.
      - Perceived issues: testing was seen as the roadblock to on-time delivery; a software development process was defined, iterative in nature; a test management tool had recently been implemented, and it was unclear whether it was fully utilized.
  12. Step Two – Develop the Questions: Where do we begin to look?
      Predictor questions:
      - Tools: Are the tools being utilized appropriately? How integrated are they?
      - Resources: How well trained are the testers? What are their workloads?
      - Process: We hear that the SDLC is defined, but is it followed and effective?
      - Process: How stable are the requirements? When do they baseline or "freeze" them?
      - Trends: Is there any trending data that we can use?
      Outcome questions:
      - Time: How much time is allocated for testing, and how much effort does it actually take?
      - Quality: What is the state of the builds being deployed to Test, and what defects are being uncovered throughout the development process?
      - Cost: What is the cost of migrating defects?
      - Customer Satisfaction: Are customers finding defects, and if so, where?
  13. Step Three – Build the Portfolio of Measures: What measures are available?
      Outcome Measures:
      | Category     | Description                                       | Data Source   | Data Elements                                           |
      |--------------|---------------------------------------------------|---------------|---------------------------------------------------------|
      | Satisfaction | Defects/errors reported by customer               | Help Desk     | Excel spreadsheets                                      |
      | Time         | Planned testing effort vs. actual testing effort  | Test Analysts | LOE actual vs. planned by project / release             |
      | Quality      | Tests planned vs. actual                          | Test Analysts | # of planned test cases vs. # of actual test cases run  |
      | Quality      | Defects identified (full lifecycle)               | QA Analysts   | # of defects by type and priority                       |
      | Cost         | Cost of rework                                    | QA Director   | LOE / fully loaded cost / # of defects found            |
      Predictive Measures:
      | Category          | Description                         | Data Source               | Data Elements                                 |
      |-------------------|-------------------------------------|---------------------------|-----------------------------------------------|
      | Process Adherence | SDLC                                | PMO                       | Excel spreadsheets                            |
      | Process Adherence | Change request backlog              | PMO                       | Excel spreadsheets                            |
      | Process Adherence | Requirements and testing practices  | BA and Test Managers      | Excel spreadsheets                            |
      | Skill Levels      | Skills matrix by job description    | HR Performance Management | Excel spreadsheets                            |
      | Training          | Training planned and completed      | Test Manager              | Training plans and performance reviews        |
      | Tools             | Utilization and guidelines          | Test Manager              | Review of data and adherence to guidelines    |
      | Trending Metrics  | Test cost per defect                | QA Director               | LOE / fully loaded cost / # of defects found  |
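One lightweight way to keep such a portfolio usable (owned, defined, and visible, as slide 8 calls for) is to hold it as structured records rather than scattered spreadsheets. A minimal sketch, with field names of my own choosing and a few rows taken from the tables above:

```python
# Portfolio of measures kept as a small catalog of records.
# Field names are illustrative; rows are drawn from the case-study tables.
from dataclasses import dataclass

@dataclass
class Measure:
    kind: str          # "predictive" or "outcome"
    category: str
    description: str
    data_source: str
    data_elements: str

portfolio = [
    Measure("outcome", "Satisfaction", "Defects/errors reported by customer",
            "Help Desk", "Excel spreadsheets"),
    Measure("outcome", "Cost", "Cost of rework",
            "QA Director", "LOE / fully loaded cost / # of defects found"),
    Measure("predictive", "Process Adherence", "Change request backlog",
            "PMO", "Excel spreadsheets"),
    Measure("predictive", "Trending Metrics", "Test cost per defect",
            "QA Director", "LOE / fully loaded cost / # of defects found"),
]

# Split the catalog into the two views the deck distinguishes.
predictive = [m.description for m in portfolio if m.kind == "predictive"]
print("Predictive measures:", ", ".join(predictive))
```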
  14. Step Four – Gather and Analyze the Measures: Predictive Measures
      - Tools: available, suitable for intended use, integrated, widespread adoption (test organization), applicable
      - Resources: trained appropriately, used consistently, data integrity, capability (resource / skills)
  15. Step Four – Gather and Analyze the Measures: Predictive Measures, continued
      - Requirements stability (a calculation sketch follows this slide):
        [Chart: # of requirements, planned vs. actual, for the Requirements 1.22.13 release across Design, Code, Test – Week 1, and Test – Week 2]
        - Requirements were never frozen
        - Requirements continued to be unstable / changing; a 36% increase to plan during week 2 of Systems Testing
        - Requirements reviews do not include a representative from Test
        - Added effort was required to address rework due to the impact of changes
      - Process: defined and documented, appropriate to the culture, used consistently, associated metrics / quantifiable, roadblocks, duplicate work, gaps, outcomes measured (time / cost / quality)
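The 36% figure above is simply the growth of the requirement count relative to the plan. A minimal sketch of that calculation; the baseline and per-phase counts below are illustrative placeholders, since the exact values behind the slide's chart are not fully recoverable:

```python
# Requirements stability: relative growth of the requirement count per phase.
# The counts below are placeholders, not the exact values behind the chart.

planned_baseline = 68  # requirements planned at the start of testing (assumed)

actual_by_phase = {
    "Design": 68,
    "Code": 70,
    "Test - Week 1": 82,
    "Test - Week 2": 93,
}

for phase, actual in actual_by_phase.items():
    change_pct = (actual - planned_baseline) / planned_baseline * 100
    print(f"{phase}: {actual} requirements ({change_pct:+.0f}% vs. plan)")

# With these placeholder numbers, Test - Week 2 comes out around +37%,
# in the neighborhood of the 36% increase called out on the slide.
```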
  16. Step Four – Gather and Analyze the Measures: Outcome Measures – Effort
      - Requirements were unstable / changing; sixteen new builds were passed to the Test Environment in four weeks
      - Test Analysts averaged 8 or more hours of overtime in the weeks ending 11/22, 11/29, and 12/13
      - 3 Test Analysts were out sick the week ending 12/6
      - Due to the number of defects being found, additional test cases were selected for execution
      - Test Analysts do not possess the right level of skills to complete the job
  17. Step Four – Gather and Analyze the Measures: Outcome Measures – Customer Satisfaction
      [Chart: Escalated Calls by Category (# of escalated calls for 11/22, 11/29, 12/6, and 12/13/2012). Category share: Options 24.7%, Trading 13.2%, Positions 7.6%, Accounts 12.2%, Quote 15.1%, PDF 7.9%, Balances 7.6%, Quicken 11.8%]
      - Over a four-week period, 53% of the calls were due to Options, Quote, and Trading issues
      - Only 10% of the test cases run are against Options, Quote, and Trading
      - Options, Quote, and Trading are not fully covered in the regression suites
      A coverage-comparison sketch follows this slide.
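To make the coverage gap concrete, here is a small comparison of call share against test-case coverage by category. The call shares come from the slide; the test-case counts are assumed, sized only so that Options, Quote, and Trading together receive about 10% of executed cases, matching the slide:

```python
# Compare escalated-call share with regression test-case coverage by category.
# Call shares are from the slide; test-case counts are assumed placeholders.

call_share = {           # share of escalated calls over the four-week period
    "Options": 24.7, "Quote": 15.1, "Trading": 13.2, "Accounts": 12.2,
    "Quicken": 11.8, "PDF": 7.9, "Positions": 7.6, "Balances": 7.6,
}

test_cases_run = {       # assumed distribution of executed test cases
    "Options": 12, "Quote": 10, "Trading": 8, "Accounts": 60,
    "Quicken": 50, "PDF": 40, "Positions": 70, "Balances": 50,
}

total_cases = sum(test_cases_run.values())
for category in sorted(call_share, key=call_share.get, reverse=True):
    coverage = test_cases_run[category] / total_cases * 100
    gap = call_share[category] - coverage
    flag = "  <-- under-covered" if gap > 10 else ""
    print(f"{category:9s} calls {call_share[category]:5.1f}%  "
          f"tests {coverage:5.1f}%{flag}")
```

With these assumed counts, Options, Quote, and Trading are the three categories flagged, which is exactly the mismatch the slide calls out.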
  18. Step Four – Gather and Analyze the Measures: Outcome Measures – Cost
      Total production defects = 251.
      | Phase           | Unit Cost (Industry Trend) | Calculated Cost  | # of Defects Reported | Min $    | Max $      |
      |-----------------|----------------------------|------------------|-----------------------|----------|------------|
      | Requirements    | 1                          | $39              |                       | $3,120   |            |
      | Design          | 3 to 5                     | $117 / $195      |                       | $9,360   | $15,600    |
      | Code            | 10                         | $390             |                       | $31,200  |            |
      | Systems Test    | 15 to 40                   | $585 / $1,560    |                       | $46,800  | $124,800   |
      | User Acceptance | 30 to 70                   | $1,170 / $2,730  |                       | $293,670 | $685,230   |
      | Production      | 40 to 1000                 | $1,560 / $39,000 | 80                    | $124,800 | $3,120,000 |
      Notes: Phase is the software development lifecycle phase. Unit Cost per defect is based on industry trend. Calculated Cost per defect is based on $585 per defect found in the Systems Test phase (cost of testers / average # of defects found = average cost per defect in that phase). # of Defects Reported is based on 80 requirements-related defects reported in the Production environment over a three-month period. Min / Max $ is the cost to fix if found in that phase.
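The dollar figures in the table follow a simple escaped-defect cost model: a unit cost of $39 per point of the industry multiplier (derived from $585 per defect at the Systems Test multiplier of 15), scaled by the phase multiplier and by the number of escaped defects. A minimal sketch of that arithmetic, using the multipliers and the 80 requirements-related production defects from the slide:

```python
# Escaped-defect cost model from the slide: cost per defect in a phase is a
# base unit cost ($585 / 15 = $39) times the phase's industry multiplier,
# and total cost is that unit cost times the number of escaped defects.

BASE_UNIT = 585 / 15          # $39 per multiplier point, from Systems Test
escaped_defects = 80          # requirements-related defects found in production

phase_multipliers = {         # industry-trend multipliers from the slide
    "Requirements": (1, 1),
    "Design": (3, 5),
    "Code": (10, 10),
    "Systems Test": (15, 40),
    "User Acceptance": (30, 70),
    "Production": (40, 1000),
}

for phase, (low, high) in phase_multipliers.items():
    min_cost = escaped_defects * BASE_UNIT * low
    max_cost = escaped_defects * BASE_UNIT * high
    print(f"{phase:15s} ${min_cost:>12,.0f}  to  ${max_cost:>12,.0f}")

# Production: 80 * $1,560 = $124,800 up to 80 * $39,000 = $3,120,000,
# matching the slide's min/max figures for defects found in production.
```

With 80 defects this reproduces every row in the table except User Acceptance, whose slide figures ($293,670 / $685,230) instead correspond to the 251 total production defects.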
  19. Step Five – Take Action! Alignment of measures is key to success. Findings and actions were mapped against the three goals (Avoid Client Impact, Drive Consistency, Strive to be Best in Class / World Class), split into the reactive (rear-view mirror) and proactive (windshield) views, with the progression Protect Production, Control, Assure, Manage:
      - Reactive findings: satisfaction, cost, and quality issues showed up as defect propagation and customer-found defects; time issues showed up as effort planned vs. actual and test cases planned vs. actual due to the changing landscape. Actions: improve regression testing; establish full-lifecycle defect management.
      - Proactive findings: the process was abandoned when schedule pressures arose; tool usage required expansion; resources weren't skilled and completely capable. Actions: application training, process adherence, and stable requirements.
      - Trends: data is available (CpD); measurement needs an integrated Portfolio of Measures, baselines, and benchmarks. Action: expand measurement to BA / development activities.
      - Going forward: fully establish the Measurement System, with governance and oversight, making improvement through measurement a way of life.
  20. Thank You! For questions or additional information, please feel free to contact: Sharon M. Niemi, (508) 254-2561, sniemi@sqassociates.com, or Jeff Amfahr, (513) 701-1593, amfahrj@seapine.com; or visit our websites: www.sqassociates.com and www.seapine.com. Copyright © 2013 SQA. All Rights Reserved.
