Controlling Project during Development with a Defect Model, Ben Linders, European SEPG 2003

To get more insight into the quality of the product during development, processes need to be measured from two views: defect introduction and defect detection. Defects are introduced during the requirements, architecture, design, and coding phases, either into documents or into the actual product code. Detection happens in the test phases, and in the earlier phases by means of inspections and reviews. By combining these two measurements, a project can determine whether there is a quality risk, and of which kind: too many defects in the product, or insufficient testing done.
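The two-view measurement described above can be sketched as a small calculation: compare the defects estimated to be inserted per phase with the defects actually detected, and treat the residual as the quality risk. This is a minimal sketch; the phase names and counts are illustrative, not taken from the presentation.

```python
# Hypothetical sketch of the two-view measurement: estimated insertion
# versus actual detection, with the residual flagged as a quality risk.
# All names and numbers below are illustrative assumptions.

def residual_defects(inserted_estimate: int, detected: int) -> int:
    """Defects estimated to remain after detection (never negative)."""
    return max(inserted_estimate - detected, 0)

# (estimated inserted, detected so far) per phase -- illustrative figures
phases = {
    "specification": (4, 4),
    "design": (24, 20),
    "implementation": (70, 18),
}

for phase, (inserted, detected) in phases.items():
    left = residual_defects(inserted, detected)
    print(f"{phase}: {left} defect(s) estimated to remain")
```

A large residual signals too many defects still in the product; a residual near zero with little testing done signals the other risk, insufficient detection effort.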

The presentation will focus upon:
- Goals: What was the purpose of the model, why was it developed, what did we want to reach?
- How: Both the definition of the model and its implementation and application will be highlighted.
- Tools: The tool that was developed to implement the model, how it works, strengths.
- Results: How did the model and tool help the project? Did it live up to its purpose?
- Success factors: What were the key issues that we have dealt successfully with? Why did we focus on them, and how?
- Future: How is this model used in future projects, what could further increase its benefits?

The presentation will show the benefits that the model/tool has brought to the project and organisation. Chief among them was the ability to manage process & product quality and to support decisions based on data collected in the project plus industry data, i.e. without having to build up historical data in previous projects.

The defect model uses techniques such as Orthogonal Defect Classification and test matrices for analysis of the defect data. Feedback of the data to designers, testers, and project management was key to validating the data and to getting good analysis results for corrective and preventive actions.


  1. Title slide: Controlling project performance using a defect model. ESEPG 2003 Conference, London, June 17: Measurement Symposium. Ben Linders, Operational Development & Quality, Ericsson R&D, The Netherlands. Ben.Linders@etm.ericsson.se, +31 161 24 9885. (May 8, 2003)
  2. Overview
     - Why a defect model?
     - How does it work?
     - Experiences from the pilot
     - Conclusions
     Measurements for product quality and process effectiveness
  3. Ericsson, The Netherlands
     - Main R&D Design Center
     - Full product responsibility for Intelligent Networks: Strategic Product Management; Provisioning & total project management; Development & maintenance; Supply & support
     - 1800 employees, of which 400 in R&D
     Projects: Quality next to Lead-time and Costs
  4. Purpose of the Project Defect Model
     Why?
     - To control quality of the developed product during a project
     - To improve development/inspection/test processes
     Business benefit: better planning & tracking, early risk signals, saved time and costs, happy customers!
  5. Why not Fault Density?
     Drawbacks:
     - Difficult to plan with
     - Only measurable after a phase/project has finished
     - Provides no insight into the causes
     The Fault Density dilemma:
     - High: bad product, or effective testing?
     - Low: good product, or insufficient testing?
     Fault Density is insufficient for agile projects
  6. Defect Flow
     - Prevent defect insertion
     - Detect & remove defects where most economical
     - Track design/test progress
  7. Process View (diagram: process inputs and outputs, influencing factors, measurements)
     - Design Process (influenced by competence, skills, tools, environment) inserts defects into documentation and code.
     - Test Process (influenced by competence, skills, test capacity, tools, environment) detects defects through inspection and test.
     - Resident defects remain in the design base and in the delivered product, leading to (un)happy customers.
     - Measurements: Detection Rate, Defect Density, Fault Slip Through, Defect Level, Defect Classification.
  8. Planning & Tracking of Quality
     - Plan quality up front: documents/code (# defects made); inspection & test effectiveness (% detection rate). Quality consequence of project decisions.
     - Track quality during the project: actual # defects found (inspection/test); estimate remaining defects, to be found / delivered. Quality view of design/test progress; quicker escalation of quality risks.
     Timely insight in quality issues!
  9. Measurements: Defect Insertion
     - Input data: expected # of defects inserted & expected size
     - Gathered data: actual defects & size
     - Verify: # of defects not found; division of defects inserted over the phases
     Target defect density: max 1 major/page per document!

     Phase              Exp.#def  Exp.size  Exp.DD  Act.size  Found#def  Act.DD  Not found yet  %Found  %Exp of total
     Specification          4        10      0.4        10        4       0.40        0          100%        4%
     High Level Design     12       107      0.112     107       10       0.09        2           83%       12%
     Detailed Design       12        47      0.255      47       10       0.21        2           83%       12%
     Implementation        70     15000      4.667   13000       18       0.00       52           26%       71%
     Total                 98                                    42                  56           43%      100%
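The insertion metrics in the table above can be reproduced with a few lines: defect density is defects found divided by size, "not found yet" is the expected count minus the found count, and %Found is their ratio. A minimal sketch; the figures mirror the slide's example rows, and the row layout is an assumption.

```python
# Sketch of the defect-insertion measurements: defect density and the
# share of expected defects not yet found. Figures mirror the slide's
# example table; the column handling is an assumption.

def defect_density(defects_found: int, size: float) -> float:
    """Defects per unit of size (page or line); 0 for empty artifacts."""
    return defects_found / size if size else 0.0

rows = [
    # phase, expected defects, actual size, found defects
    ("Specification",      4,    10,  4),
    ("High Level Design", 12,   107, 10),
    ("Detailed Design",   12,    47, 10),
    ("Implementation",    70, 13000, 18),
]

for phase, expected, size, found in rows:
    dd = defect_density(found, size)
    not_found = expected - found
    pct_found = 100 * found / expected
    print(f"{phase}: DD={dd:.2f}, not found yet={not_found}, %found={pct_found:.0f}%")
```

Run against the slide's numbers this reproduces the table's Act.DD, "not found yet", and %Found columns.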
  10. Measurements: Defect Detection
      - Input data: # of defects expected to detect & detection rate goal
      - Gathered data: actual defects & detection rate
      - Verify: detected defects, test progress
      Target detection rate: 70% document, 60% code, 50% test!

                          ---- Expected in phase ----   --- Actual total ---
      Phase              Avail.def  Det#  Goal%  Det%  Left   Det#  Det%  Cum%
      Specification           4       2    70%    50%    2      2    50%   50%
      High Level Design      14      11    70%    79%    3     11    79%   69%
      Detailed Design        15       6    70%    40%    9      6    40%   61%
      Implementation         79      40    60%    51%   39      6     8%   23%
      Unit test              39       8    20%    21%   31      6    15%   30%
      Function test          31      15    50%    48%   16      3    10%   33%
      System Test            16       9    50%    56%    7      3    19%   36%
      Network Test            7       2    40%    29%    5      2    29%   14%
      Installation            5       1    15%    20%    4      2    40%   12%
      First Customer          4       1    10%    25%    3      1    25%   10%
      Average/Total:         95                   42%          42          31%
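The detection plan behind this slide follows a cascade: each phase is planned to catch a fraction (its goal detection rate) of the defects still available, and the remainder slips through to the next phase. The sketch below illustrates that cascade; the rates and the starting defect count are illustrative, not the slide's exact plan.

```python
# Sketch of the detection-rate cascade: each phase detects a planned
# fraction of the defects still available; the rest slips to the next
# phase. Rates and the starting count are illustrative assumptions.

def detection_plan(available: float, goals: list[tuple[str, float]]):
    """Return (phase, available, planned detections) per phase."""
    plan = []
    for phase, rate in goals:
        detected = round(available * rate)
        plan.append((phase, round(available), detected))
        available -= detected  # the remainder slips to the next phase
    return plan

goals = [
    ("Inspection", 0.70),
    ("Unit test", 0.20),
    ("Function test", 0.50),
    ("System test", 0.50),
]

for phase, avail, det in detection_plan(100, goals):
    print(f"{phase}: {avail} available, plan to detect {det}")
```

Comparing each phase's actual detection percentage against its goal, as the slide's table does, shows at a glance where inspections or tests underperformed.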
  11. Usage of the Project Defect Model
      Steps:
      - Estimate # defects made, and where they are to be found
      - Collect data per phase (specification, design, implementation, etc.): input from inspections and test; classify the introduction phase of each defect
      - Feedback to design/test and Project Management
      - Analysis of data that signals problems/risks
      Quality Engineer: measure, support, feedback. Project Team: analyze, decide, act!
      Focus on application (using existing theory)
  12. Experiences in Pilot Project
      - Quality tracked during the project:
        - Specification defect slip-through: clarified requirements in feasibility
        - Design defects (inspection): re-enforced design rules
        - Code quality (inspection/test): base product risk, design rules
        - Test efficiency, defect slip-through: better inspection / unit test
        - Release quality per requirement: test focus, risk management
      - Predicted number of defects at First Customer Delivery and Release:
        - Decisions on delivery/release, design follow-up and maintenance planning
        - Actual defects: expected 21, actual 14 (in first 4 of 6 months)
      Pilot project defect detection rate: 95% (best in class)!
  13. Learnings from Pilot Project
      - Classification/analysis of defects with Design & Test Leaders provided very valuable information.
      - Weekly feedback sessions with the Project Management Group have been essential for validating data, analysis, and conclusions, and for taking actions.
      - The model supported release decisions by providing defects-detected information. This was received as very beneficial, and is requested for all projects!
      - Though some model conclusions are not surprising, they would have been overlooked or discovered too late without the model.
  14. Conclusions
      The Project Defect Model helped the project to:
      - Estimate/track defects: improve product release quality, save time/cost
      - Track design/test progress: better planning, risk management, decisions
      Future:
      - Model used in several projects
      - Internal & industry data: better estimates
      - Exchange experiences with similar models?
      Questions?
