Controlling Project Performance by using a Project Defect Model, Ben Linders, QA&Test Conference 2007
Wouldn’t it be nice to have more insight into the quality of a product while it is being developed, rather than afterwards? To be able to estimate how many defects are inserted into the product in a certain phase, and how effective a (test) phase is at capturing those defects? The simple but effective model described in this paper makes that possible. The paper describes the model used in projects at Ericsson in the Netherlands. The model supports controlling the project (putting quality data next to planning and budget data), evaluating risks, and making decisions about product release and support.

To get more insight into the quality of the product during development, the processes must be measured from two views: introduction of defects and detection of defects. Defects are introduced during the requirements, architecture, design, and coding phases, either into documents or into the actual product. Detection happens in the test phases, and in the earlier phases through inspections and reviews. Using these two measurements, a project can determine whether there is a quality risk, and of what kind: too many defects in the product, or insufficient testing done.
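The two views combine into a simple remaining-defects estimate. A minimal sketch (all numbers below are hypothetical, not Ericsson data): take the estimated defects introduced per phase, then apply each detection phase's estimated effectiveness in turn.

```python
# Hypothetical illustration of the two-view measurement: defects
# introduced per development phase, and the estimated detection rate
# (effectiveness) of each inspection/test phase.
inserted = {"requirements": 20, "design": 35, "coding": 45}
detection_rate = {"inspection": 0.5, "function_test": 0.6, "system_test": 0.7}

remaining = sum(inserted.values())  # total defects estimated to be in the product
for phase, rate in detection_rate.items():
    found = round(remaining * rate)
    print(f"{phase}: found {found}, remaining {remaining - found}")
    remaining -= found
```

Tracking actual defects found per phase against such an estimate is what lets a project tell "too many defects inserted" apart from "too little detected".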

To make the model usable in a broad set of projects, a template was created, with parts that apply to most projects made configurable. A user guide with industry reference data, training material, and an overview presentation have also been produced.

The organization learned a lot from the model. Initially the focus was on earlier defect detection, e.g. in function test instead of system test, and in inspection instead of test. As more data became available, more insight was gained into the design and coding activities, providing the means for defect prevention. Now that data is available from a larger set of projects, processes can be compared, enabling best practices from one project to be spread to future projects. Frequent estimation and feedback sessions with the design and test teams show that providing reliable estimates is initially difficult, but defect data acquired during the project enables early action and better defect estimates, resulting in early release-quality predictions.

The presentation will focus on:
• Goals: What is the purpose of the model, why was it developed, and what do we want to achieve?
• How: The definition of the model, and how it was implemented and applied.
• Tools: The tool that was developed to implement the model, how it works, and its strengths.
• Results: How do the model and tool help projects? Do they live up to their purpose?
• Success factors: What are the key issues we deal with successfully, and why and how do we focus on them? This includes searching for experiences with defect modeling, and applying available techniques such as ODC and test matrices.
• Future: How will this model be used in future projects, and what could increase its benefits?

The presentation will show many examples of actual use of the model and tool, and the benefits they have brought to projects and the organization. Furthermore, it will show possibilities to manage process and product quality, and to support decisions, based on data collected in the project combined with industry data, i.e. without having to build up historical data in previous projects.
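The point about industry data can be illustrated with a toy calculation (every figure below is hypothetical and only illustrates the mechanism): a first defect estimate comes from an assumed industry insertion rate, and is recalibrated as soon as the project's own inspection data arrives.

```python
# First estimate from industry reference data, before any project
# history exists. All figures are hypothetical placeholders.
kloc = 50                        # assumed product size, thousand lines of code
industry_defects_per_kloc = 2.0  # assumed industry defect-insertion rate
first_estimate = kloc * industry_defects_per_kloc  # expected defects inserted

# Recalibrate once the project's own inspections report actual findings,
# using an assumed inspection detection rate from the reference data.
found_in_inspection = 60
assumed_inspection_rate = 0.5
recalibrated_total = found_in_inspection / assumed_inspection_rate

print(first_estimate, recalibrated_total)
```

The reference data gets a project started; the project's own measurements then take over as they accumulate.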

Additionally, the presentation will show what we have learned from using this model, and how we have been able to turn available theories on defect prevention into a working model and tool. Of course there will be room for questions during the presentation.

Transcript

  • 1. Controlling Project Performance by Using a Defect Model. Ben Linders, Ericsson Telecommunicatie B.V., Rijen, The Netherlands; Affiliate, Software Engineering Institute, Pittsburgh, PA
  • 2. Overview: Business Needs; Project Defect Model; Experiences; Conclusions. Product quality and process effectiveness. © Ericsson Telecommunicatie B.V., Rijen, The Netherlands 2007-08-31
  • 3. Ericsson, The Netherlands Market Unit Northern Europe & R&D Center R&D: Value Added Services – Strategic product management – Marketing & technical sales support – Development & maintenance – Customization – Supply & support +/- 1300 employees, of which +/- 350 in R&D
  • 4. Business Need for Quality Multimedia functionality Stability & Performance Customizations, flexibility Outsourcing
  • 5. Target Business: Increased R&D Efficiency R&D Scorecard Lead-Time, Cost & Quality Quality: Lower Fault Slip Through (FST) FST = Number of defects detected in integration & customer test that should have been detected earlier “Should” implies that the defect is more cost effective to find earlier. The test strategy defines what is cost effective
  • 6. Measurement Values Use Available Data over Collecting More Analyze over Measuring Give Feedback over Micro Managing Take Actions over Reporting
  • 7. Required Control of Quality: Clear requirements Quality planned & tracked Fact based decisions Known release quality Deliver on time Lower maintenance
  • 8. Project Defect Model Why? – Control quality of the product during development – Improve development/inspection/test processes Business Benefit: Better planning & tracking Early risk signals Save time and costs Happy customers!
  • 9. Measuring Quality Insertion: Where are defects made? How to prevent? Detection: Where are defects found? Early/economic removal? Quality: How many defects are left in the product at release?
  • 10. Quality Management Plan – Documents/code (nr. defects made) – Inspection & test effectiveness (% detection rate) Quality consequence of project approach Track – Actual nr. defects found – Estimate remaining defects Quality status, steer daily work Project decisions, early escalation Steer – Toll Gates, Quality Doors, Product Release Product quality figures, quantitative decisions
  • 11. Reporting Status: Project Status Deviation Report regarding Quality. [Chart: # FST to Test, # GA Defects, and Detection Rate % over time; actual values against the TG2 baseline and min/max estimates.] Analysis of current situation: Targets, Fact, Reason, Consequence. Corrective actions (mandatory for targets with Minor or Major deviations): What, When (due date), Who.
  • 12. History 2001 Defined, pilot project started 2002 Evaluated, 2 new projects 2003 Industrialized, used in all major projects 2004 Integrated in Project Steering Model 2005 Corporate process, Pilot Cost of Quality 2006 Corporate Good Practice 2007 R&D Efficiency, reduce Fault Slip Through, Agile
  • 13. Functional Test. Project: incremental development, Function Test team, weekly analysis. [Chart: defects detected in Function Test per increment, Increment 2 through Increment 8 plus TRF.] Functional testing found more defects than estimated. [Chart: defects detected in inspections per increment.] Root Cause Analysis: missed inspections, design rules.
  • 14. Improve Inspections: re-introduce design rules, coach inspections. [Chart: inspection detection rate per increment (Increment 2 through Increment 8, TRF), actual vs. target; more defects found in inspection.] [Chart: Function Test detection rate per increment, actual vs. target; additional defects found in test. Improved inspection and Function Test.]
  • 15. Release Defect Prediction. Definition: defects predicted at GA (General Availability) / actual defects tracked in the first 6 months of operation (%).
    Product, Release, Estimate at GA, Actual, Accuracy:
    A R1: 21, 20, 105%; A R2: 32, 18, 178%; B R7: 2, 2, 100%; C R1: 5, 5, 100%; D R1: 6, 1, 600%; E R2.1: 18, 15, 120%; E R3: 13, 17, 76%; F R2.2: 84, 52, 162%; F R3.0b: 104, 71, 146%; F R3.0c: 60, 60, 100%; F R3.0d: 9, 9, 100%; G R1: 66, 41, 161%; G R2a: 25, 25, 100%; H R2b: 0, 0, 100%.
    Accuracy: mostly within the 150% range; only 1 product more than 100% off; only 1 product had more actual defects than predicted. Used for maintenance dimensioning and to reduce Cost of Poor Quality.
  • 16. Agile Approach Planning game: Analyze Quality Demo: Deliver Network test: Verify Team meeting: Feedback Balance Quality - Time - Costs Early risk signals Optimized process
  • 17. Agile Experiences Planning game: Investigate solutions Define test strategy Agree with Product Manager Estimate remaining defects Reduce quality risks Team feedback: Root causes: test coverage, configuration problems Process update: inspection, test strategy, delivery test
  • 18. Key Success Factors Management commitment Everybody involved Defect classification Frequent feedback
  • 19. Management Targets. Target and Target Owner: GA Defects: Strategic Product Manager; Defect Detection Rate: Project Office Manager; Fault Slip Through: Design Manager.
  • 20. Bridging the Gap. [Diagram: project phases Start, Pre-study, Execution, Finish; target setting and target commitment feed defect modelling and estimation (design and test), followed by data collection, analysis, and reporting.]
  • 21. Defect Classification Fault Slip Through: Could it have been found earlier? Orthogonal Defect Classification Triggers Test Matrices Focus Discipline maps Process Flow Agree & deploy consistently
  • 22. Feedback. Frequent and short, at the workplace, with all data available, together with design/test leaders. Feedback: collected data is delivered to the people that have done the work, in order to support understanding of the situation and help them take needed actions. Show data, ask questions, form conclusions, take needed actions.
  • 23. Benefits Qualitative Earlier risk signals: Deliver on time Incremental development: Collaboration design-test Better decisions: Release quality Process adherence: Increased efficiency Fewer defects after release: Maintenance reduction Fewer disturbances: Motivated employees Quantitative Higher quality ROI 5:1 Reduced lead time Lower costs
  • 24. Learnings Estimation & analysis with design & test leaders: Valuable quality feedback All defect information in one Excel sheet: Detailed insight, easy root cause analysis Feedback sessions with project members: Essential for analysis, conclusions, and actions Quality data next to planning and budget Deployment and optimization of processes & methods Risks reduced: delivery date, budget & quality!
  • 25. Quality Prediction Current model: Estimation – Extrapolate past performance – Based on inserted/detected defects – Plan & track Wanted: Prediction – Causes of defects – What-if scenarios – Decision taking "All models are wrong, some models are useful" (Deming)
  • 26. SEI Affiliate Research. [Diagram: defect prediction model. Design side: design base, design process, competence and skills, tools and environment drive Defects Inserted (documentation, code) and Defect Density. Test side: test process, test capacity (inspection, test), competence and skills, tools and environment drive Defects Detected, Detection Rate, Fault Slip Through, and Defect Classification. Output: resident defects in the delivered product, (un)happy customers. Legend: inputs and outputs, influencing factors, defect level, measurement.] Quick Quality Scan: expert opinion, with data; rough prediction; improvement areas. Defect Prediction Model: data, tuned with expert opinion; detailed prediction; improvement business case.
  • 27. Pilot Agile: Prevention. Determine defect insertion & detection costs (Phase, Quality Factor, Detected defects, Defects left, Cost): Req 4.5; Arch 5.1; Impl 5.1; Total development: 49 defects. Inspection 5.3: 12 detected, 36 left, cost 72; Early Test 5.0: 12 detected, 25 left, cost 132; Late Test 6.2: 11 detected, 14 left, cost 1136; Customer Test 5.0: 5 detected, 10 left, cost 516. Total development cost 1856; Maintenance 4000; Total 5856. Predict savings due to fewer defects inserted (Improvement 50): Req 4.9; Arch 5.1; Impl 5.1; Total development: 49 defects. Inspection 5.3: 12 detected, 35 left, cost 72; Early Test 5.0: 11 detected, 24 left, cost 121; Late Test 6.2: 10 detected, 14 left, cost 1033; Customer Test 5.0: 5 detected, 2 left, cost 516. Total development cost 1792 (3% savings); Maintenance 800; Total 2592 (56% savings).
  • 28. Conclusions Quality has business value You can measure & manage quality Estimate, analyze, and feedback: – Prevention – Early detection – Risk management Why not start today? – Inspections & test – Release & maintenance – Agile
  • 29. Further Reading. References: Managing the Software Process, Watts Humphrey; Metrics and Models in Software Quality Engineering, Stephen H. Kan. Papers: Controlling Product Quality During Development with a Defect Model, in Proceedings of the ESEPG 2003 and ESEPG 2004 conferences; Make What's Counted Count, in Better Software magazine, March 2004; Measuring Defects to Control Product Quality, in Measure! Knowledge! Action! The NESMA anniversary book, Oct. 2004, ISBN 90-76258-18-X; A Proactive Attitude Towards Quality: The Project Defect Model, in Software Quality Professional, Dec. 2004 (with Hans Sassenburg); Controlling Project Performance Using the Project Defect Model, in Proceedings of the Practical Software Quality & Testing 2005 conference. Ben Linders, Ericsson Telecommunicatie B.V., Rijen, The Netherlands, ben.linders@ericsson.com, +31 161 24 9885
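The accuracy metric defined on slide 15 is simply the prediction divided by the actual outcome. A minimal sketch, using the figures the slide quotes for product A, release R1:

```python
# Release-prediction accuracy as defined on slide 15: defects predicted
# at GA divided by defects actually tracked in the first 6 months of
# operation. The two figures below are taken from the slide (product A, R1).
predicted_at_ga = 21    # estimate at General Availability
actual_six_months = 20  # defects found in first 6 months of operation

accuracy = predicted_at_ga / actual_six_months * 100
print(f"accuracy: {accuracy:.0f}%")
```

A value above 100% means the model over-predicted (conservative, safe for maintenance dimensioning); below 100% means more defects surfaced than predicted.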