Controlling Project Performance
by Using a Defect Model
Ben Linders
Ericsson Telecommunicatie B.V., Rijen, The Netherlands
Controlling Project Performance by using a Project Defect Model, Ben Linders, QA&Test Conference 2007

Wouldn’t it be nice to have more insight into the quality of a product while it is being developed, and not only afterwards? To be able to estimate how many defects are inserted into the product in a certain phase, and how effective a (test) phase is in capturing those defects? The simple but effective model described in this paper makes this possible. The paper describes the model used in projects at Ericsson in the Netherlands. The model supports controlling the project (putting quality next to planning and budget data), evaluating risks, and making decisions regarding product release and support.

To get more insight into the quality of the product during development, the processes must be measured from two views: introduction of defects, and detection of defects. Introduction happens during the requirements, architecture, design, and coding phases; defects are introduced either into documents or into the actual product. Detection happens in the test phases, and in the previously mentioned phases through inspections and reviews. By combining these two measurements, a project can determine whether there is a quality risk and of what kind: too many defects in the product, or insufficient testing done.
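The two-view idea can be sketched as a small calculation: defects inserted per phase accumulate, and each inspection or test phase removes a fraction of the defects present (its detection rate). All figures below are illustrative examples, not Ericsson data:

```python
# Sketch of the two-view defect model: insertion per phase, then a chain of
# detection activities each removing a fraction of the defects present.
# The numbers are hypothetical, chosen only to illustrate the mechanics.

inserted = {"requirements": 10, "design": 25, "coding": 40}            # defects made
detection_rate = {"inspection": 0.5, "function_test": 0.6, "system_test": 0.5}

present = sum(inserted.values())   # defects in the product before any detection
for phase, rate in detection_rate.items():
    found = round(present * rate)
    present -= found
    print(f"{phase}: found {found}, {present} remaining")

# A phase finding far fewer defects than its planned rate predicts, or too
# many defects remaining, is the early quality-risk signal the model gives.
```

Tracking actual defects found per phase against this plan is what turns the model into a project control instrument.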

To be able to use the model in a broad set of projects, a template was made that includes, in a configurable way, the parts that are applicable to most projects. A user guide with industry reference data, training material, and an overview presentation were also produced.

The organization learned a lot from the model. Initially the focus was on earlier defect detection, e.g. in function test instead of system test, and in inspections instead of test. As more data became available, more insight was gained into design and coding activities, providing a means for defect prevention. Now that data is available from a larger set of projects, processes can be compared, enabling best practices from one project to be spread to future projects. Frequent estimation & feedback sessions with the design & test teams show that it is initially difficult to provide reliable estimates, but the defect data acquired during the project enables early action and better defect estimates, resulting in early release quality predictions.

The presentation will focus upon:
• Goals: What is the purpose of the model, why was it developed, and what do we want to achieve?
• How: The definition, implementation, and application of the model are highlighted.
• Tools: The tool that was developed to implement the model, how it works, and its strengths.
• Results: How do the model and tool help projects? Does it live up to its purpose?
• Success factors: What are the key issues that we deal with successfully? Why do we focus on them, and how? This includes searching for experiences with defect modeling, and applying available techniques like ODC and test matrices.
• Future: How will this model be used in future projects, and what could increase its benefits?

The presentation will show many examples of actual use of the model and tool and the benefits they have brought to the project and the organization. Furthermore, it will show possibilities to manage process & product quality and to support decisions based on data collected in the project together with industry data, i.e. without having to build up historical data in previous projects.

Additionally, the presentation will show what we have learned from using this model, and how we have been able to turn available theories on defect prevention into a working model and tool. Of course there will be room for questions during the presentation.


Controlling Project Performance by using a Project Defect Model, Ben Linders, QA&Test Conference 2007

  1. Controlling Project Performance by Using a Defect Model. Ben Linders, Ericsson Telecommunicatie B.V., Rijen, The Netherlands. Affiliate, Software Engineering Institute, Pittsburgh, PA.
  2. © Ericsson Telecommunicatie B.V., Rijen, The Netherlands, 2007-08-31. Overview: Business Needs; Project Defect Model; Experiences; Conclusions. Product quality and process effectiveness.
  3. Ericsson, The Netherlands. Market Unit Northern Europe & R&D Center. R&D: Value Added Services: strategic product management; marketing & technical sales support; development & maintenance; customization; supply & support. Approx. 1300 employees, of which approx. 350 in R&D.
  4. Business Need for Quality: multimedia functionality; stability & performance; customizations, flexibility; outsourcing.
  5. Target. Business: increased R&D efficiency. R&D Scorecard: lead-time, cost & quality. Quality: lower Fault Slip Through (FST). FST = number of defects detected in integration & customer test that should have been detected earlier. "Should" implies that the defect is more cost-effective to find earlier; the test strategy defines what is cost-effective.
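The FST definition on this slide can be expressed directly in code. The defect records and the "cheapest phase" classification below are hypothetical examples; in practice the test strategy supplies the classification:

```python
# Fault Slip Through: count defects found in integration/customer test that
# the test strategy says would have been cheaper to find in an earlier phase.
# The defect list and phase mapping are hypothetical, for illustration only.

phase_order = ["inspection", "function_test", "system_test",
               "integration_test", "customer_test"]

# Each record: (phase where found, phase where the test strategy says it
# would have been most cost-effective to find it)
defects = [
    ("integration_test", "function_test"),     # slipped through
    ("integration_test", "integration_test"),  # found where intended
    ("customer_test", "inspection"),           # slipped through
    ("function_test", "function_test"),        # found where intended
]

late_phases = {"integration_test", "customer_test"}
fst = sum(1 for found, cheapest in defects
          if found in late_phases
          and phase_order.index(cheapest) < phase_order.index(found))
print(f"Fault Slip Through: {fst} of {len(defects)} defects")
```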
  6. Measurement Values: use available data over collecting more; analyze over measuring; give feedback over micro-managing; take actions over reporting.
  7. Required: control of quality (clear requirements; quality planned & tracked; fact-based decisions); known release quality; deliver on time; lower maintenance.
  8. Project Defect Model. Why? To control the quality of the product during development, and to improve the development/inspection/test processes. Business benefit: better planning & tracking; early risk signals; saved time and costs; happy customers!
  9. Measuring quality. Insertion: where are defects made, and how can they be prevented? Detection: where are defects found, and can they be removed earlier and more economically? Quality: how many defects are left in the product at release?
  10. Quality Management. Plan: documents/code (number of defects made); inspection & test effectiveness (% detection rate); quality consequence of the project approach. Track: actual number of defects found; estimate of remaining defects; quality status to steer daily work; project decisions, early escalation. Steer: toll gates, quality doors, product release; product quality figures, quantitative decisions.
  11. Reporting: Project Status Deviation Report regarding Quality. Status: analysis of the current situation (targets, facts, reasons, consequences). Corrective actions (mandatory for targets with minor or major deviations): what, when (due date), who. (Chart: # FST to test, # GA defects, and detection rate (%), with TG2 baseline, min/max, actuals, and estimates.)
  12. History. 2001: defined, pilot project started. 2002: evaluated, 2 new projects. 2003: industrialized, used in all major projects. 2004: integrated in the Project Steering Model. 2005: corporate process, pilot Cost of Quality. 2006: corporate Good Practice. 2007: R&D efficiency, reduce Fault Slip Through, Agile.
  13. Functional Test. Project: incremental; function test team; weekly analysis. Functional testing found more defects than estimated. Root cause analysis: missed inspections, design rules. (Charts: defects detected in function test and in inspections, per increment 2-8 and TRF.)
  14. Improve Inspections. Re-introduce design rules; coach inspections. Result: more defects found in inspection, additional defects found in test. (Charts: detection rate of inspection and of function test per increment, actual total vs. target.) Improved inspection and function test.
  15. Release defect prediction. Number of defects predicted at release (General Availability) vs. actual defects tracked in the first 6 months of operation. Accuracy (defects predicted at GA / actual defects, in %): mostly within the 150% range; only 1 product was more than 100% off; only 1 product had more defects than predicted. Used for maintenance dimensioning and reducing the Cost of Poor Quality. (Table: per product release, expected GA defects, actual GA defects, and estimate accuracy, e.g. 21 expected vs. 20 actual = 105%.)
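The accuracy figure defined on this slide (defects predicted at GA divided by actual defects, as a percentage) can be checked against a few rows recoverable from the slide's table:

```python
# GA estimate accuracy = defects predicted at GA / actual defects in the
# first 6 months of operation, as a percentage. Rows taken from the slide.
releases = [("R1", 21, 20), ("R2", 32, 18), ("R7", 2, 2), ("R2.2", 84, 52)]

accuracy = {name: round(100 * predicted / actual)
            for name, predicted, actual in releases}
for name, pct in accuracy.items():
    print(f"{name}: {pct}%")   # R1: 105%, R2: 178%, R7: 100%, R2.2: 162%
```

A value above 100% means the prediction was pessimistic (fewer defects occurred than predicted), which matters when dimensioning maintenance capacity.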
  16. Agile Approach. Planning game: analyze quality. Demo: deliver. Network test: verify. Team meeting: feedback. Balance quality, time, and costs; early risk signals; optimized process.
  17. Agile experiences. Planning game: investigate solutions; define test strategy; agree with the product manager; estimate remaining defects; reduce quality risks. Team feedback: root causes (test coverage, configuration problems); process updates (inspection, test strategy, delivery test).
  18. Key Success Factors: management commitment; everybody involved; defect classification; frequent feedback.
  19. Management Targets (target: target owner). GA defects: Strategic Product Manager. Defect detection rate: Project Office Manager. Fault slip through: Design Manager.
  20. Bridging the gap. (Diagram: target setting and target commitment during start and pre-study; defect modelling, data collection (design, test), estimation, analysis, and reporting during execution and finish.)
  21. Defect Classification. Fault Slip Through: could the defect have been found earlier? Orthogonal Defect Classification: triggers. Test matrices: focus. Discipline maps: process flow. Agree & deploy consistently.
  22. Feedback. Frequent, short, at the workplace, with all data available, together with design/test leaders: show data, ask questions, form conclusions, take needed actions. Feedback: collected data delivered to the people that have done the work, in order to support understanding of the situation and help them take needed actions.
  23. Benefits. Qualitative: earlier risk signals (deliver on time); incremental development (design-test collaboration); better decisions (release quality); process adherence (increased efficiency); fewer defects after release (maintenance reduction); fewer disturbances (motivated employees). Quantitative: higher quality; reduced lead time; lower costs; ROI 5:1.
  24. Learnings. Estimation & analysis with design & test leaders: valuable quality feedback. All defect information in one Excel sheet: detailed insight, easy root cause analysis. Feedback sessions with project members: essential for analysis, conclusions, and actions. Quality data next to planning and budget data. Deployment and optimization of processes & methods. Risks reduced: delivery date, budget & quality!
  25. Quality Prediction. Current model, estimation: extrapolate past performance; based on inserted/detected defects; plan & track. Wanted, prediction: causes of defects; what-if scenarios; decision making. "All models are wrong, some models are useful." (Deming)
  26. SEI Affiliate Research. Quality Factor Model: expert opinion, supported with data; quick quality scan; rough prediction; improvement areas. Defect Prediction Model: data, tuned with expert opinion; detailed prediction; improvement business case. (Diagram: process inputs and outputs, influencing factors, and measurements: design and test processes (competence, skills, tools, environment, test capacity); defects inserted in documentation/code; defects detected in inspection/test; resident defects in the design base and the delivered product; detection rate, defect density, fault slip through, defect level, defect classification; (un)happy customers.)
  27. Pilot Agile: Prevention. Determine defect insertion & detection costs; predict savings due to fewer defects inserted. (Tables: per phase, quality factor, detected defects, defects left, and cost. Baseline: development cost 1856, maintenance 4000, total 5856. With improvement: development cost 1792 (3% saving), maintenance 800, total 2592 (56% saving).)
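The savings percentages in this pilot can be reproduced from the cost totals shown on the slide:

```python
# Cost-of-quality savings from the Agile prevention pilot, using the totals
# from the slide: fewer defects inserted lowers development and, above all,
# maintenance cost.
baseline_dev, baseline_maint = 1856, 4000
improved_dev, improved_maint = 1792, 800

dev_saving = 1 - improved_dev / baseline_dev
total_saving = 1 - (improved_dev + improved_maint) / (baseline_dev + baseline_maint)
print(f"Development saving: {dev_saving:.0%}")   # 3%
print(f"Total saving: {total_saving:.0%}")       # 56%
```

Most of the saving comes from avoided maintenance, which is why the pilot focuses on prevention rather than just earlier detection.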
  28. Conclusions. Quality has business value. You can measure & manage quality. Estimate, analyze, and give feedback: prevention; early detection; risk management. Why not start today? Inspections & test; release & maintenance; Agile.
  29. Further reading. Papers: "Controlling Product Quality During Development with a Defect Model", in Proceedings of the ESEPG 2003 & ESEPG 2004 conferences; "Make what's counted count", in Better Software magazine, March 2004; "Measuring Defects to Control Product Quality", in Measure! Knowledge! Action! The NESMA anniversary book, Oct 2004, ISBN 90-76258-18-X; "A Proactive Attitude Towards Quality: The Project Defect Model", in Software Quality Professional, Dec 2004 (with Hans Sassenburg); "Controlling Project Performance Using the Project Defect Model", in Proceedings of the Practical Software Quality & Testing 2005 conference. References: Managing the Software Process, Watts Humphrey; Metrics and Models in Software Quality Engineering, Stephen H. Kan. Contact: Ben Linders, Ericsson Telecommunicatie B.V., Rijen, The Netherlands, ben.linders@ericsson.com, +31 161 24 9885.