Controlling Project during Development with a Defect Model, Ben Linders, European SEPG 2003


To get more insight into the quality of a product during development, processes need to be measured from two views: defect introduction and defect detection. Defects are introduced during the requirements, architecture, design, and coding phases, either into documents or into the actual product code. Detection happens in the test phases, and in the earlier phases through inspections and reviews. Combining these two measurements lets a project determine if there is a quality risk and what kind it is: too many defects in the product, or insufficient testing.

The presentation focuses on:
- Goals: What was the purpose of the model, why was it developed, and what did we want to achieve?
- How: Both the definition of the model and its implementation and application are highlighted.
- Tools: The tool that was developed to implement the model, how it works, and its strengths.
- Results: How did the model and tool help the project? Did it live up to its purpose?
- Success factors: What were the key issues that we dealt with successfully? Why did we focus on them, and how?
- Future: How is the model used in future projects, and what could further increase its benefits?

The presentation shows the benefits that the model and tool have brought to the project and the organisation: mainly the ability to manage process and product quality, and to support decisions, based on data collected in the project combined with industry data, i.e. without first having to build up historical data from previous projects.

The defect model uses techniques such as Orthogonal Defect Classification and test matrices for analysis of the defect data. Feeding the data back to designers, testers, and project management was key to validating the data and obtaining good analysis results for corrective and preventive actions.


Transcript

  • 1. Controlling project performance using a defect model
    ESEPG 2003 Conference, London, June 17: Measurement Symposium
    Ben Linders, Operational Development & Quality, Ericsson R&D, The Netherlands
    Ben.Linders@etm.ericsson.se, +31 161 24 9885
    May 8, 2003
  • 2. Overview
    • Why a defect model?
    • How does it work?
    • Experiences from the pilot
    • Conclusions
    Measurements for product quality and process effectiveness
  • 3. Ericsson, The Netherlands
    • Main R&D Design Center
    • Full product responsibility for Intelligent Networks
      – Strategic Product Management
      – Provisioning & total project management
      – Development & maintenance
      – Supply & support
    • 1800 employees, of which 400 in R&D
    Projects: Quality next to Lead-time and Costs
  • 4. Purpose of the Project Defect Model
    Why?
    – To control quality of the developed product during a project
    – And to improve development/inspection/test processes
    Business benefit:
    – Better planning & tracking
    – Early risk signals
    – Save time and costs
    – Happy customers!
  • 5. Why not Fault Density?
    Drawbacks:
    – Difficult to plan with
    – Only measurable after a phase/project has finished
    – Provides no insight into the causes
    The Fault Density dilemma:
    – High: Bad product, or effective testing?
    – Low: Good product, or insufficient testing?
    Fault Density is insufficient for agile projects
  • 6. Defect Flow
    • Prevent defect insertion
    • Detect & remove defects where most economical
    • Track design/test progress
  • 7. Process View (diagram)
    – Design process: input is the design base with its resident defects; influencing factors are competence, skills, tools, and environment; output is defects inserted (into documentation and code), measured as Defect Density.
    – Test process (inspection, test): influencing factors are competence, skills, test capacity, tools, and environment; output is defects detected, measured as Detection Rate, Fault Slip Through, and Defect Classification.
    – Delivered product: resident defects in the delivered product, measured as Defect Level, lead to (un)happy customers.
    The diagram distinguishes process inputs/outputs, influencing factors, and defect measurements.
  • 8. Planning & Tracking of Quality
    • Plan quality up front
      – Documents/code (# defects made)
      – Inspection & test effectiveness (% detection rate)
      Quality consequence of project decisions
    • Track quality during the project
      – Actual # defects found (inspection/test)
      – Estimate remaining defects: to be found / delivered
      Quality view of design/test progress; quicker escalation of quality risks
    Timely insight into quality issues!
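The "plan quality up front" idea can be sketched as a small calculation: given an estimate of defects inserted per phase and a target detection rate per phase, derive how many defects each phase should find and how many would slip to the customer. Phase names, counts, and rates below are illustrative assumptions, not the project's actual planning data.

```python
def plan_detection(inserted_per_phase, detection_rates):
    """Propagate planned defect insertion through planned detection rates."""
    resident = 0.0
    plan = []
    for (phase, inserted), rate in zip(inserted_per_phase, detection_rates):
        resident += inserted          # defects newly inserted in this phase
        detected = resident * rate    # target for this phase's inspections/tests
        resident -= detected          # the rest slips through to later phases
        plan.append((phase, round(detected, 1)))
    return plan, round(resident, 1)

# Illustrative plan: 24 defects inserted in design, 70 in implementation.
plan, delivered = plan_detection(
    [("Design", 24), ("Implementation", 70), ("Test", 0)],
    [0.7, 0.6, 0.5],
)
print(plan)       # per-phase detection targets
print(delivered)  # defects estimated to reach the customer
```

Tracking then compares actual defects found per phase against these targets, so a quality risk shows up while the project can still act on it.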
  • 9. Measurements: Defect Insertion
    Target defect density: max 1 major defect per page per document!

      Phase              Exp. #def  Exp. size  Exp. DD  Act. size  Found  Act. DD  Not found yet  % found  % of total
      Specification              4         10    0.400         10      4     0.40              0     100%          4%
      High Level Design         12        107    0.112        107     10     0.09              2      83%         12%
      Detailed Design           12         47    0.255         47     10     0.21              2      83%         12%
      Implementation            70      15000    4.667      13000     18     0.00             52      26%         71%
      Total                     98                                    42                      56      43%        100%

    • Input data: expected # of defects inserted & expected size
    • Gathered data: actual defects & size
    • Verify:
      – # of defects not yet found
      – Distribution of inserted defects over the phases
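The bookkeeping behind the insertion table can be sketched as follows. The numbers are taken from the slide; the variable names and the exact column layout are assumptions.

```python
phases = [
    # (phase, expected defects inserted, actual size, defects found so far)
    ("Specification",      4,    10,  4),
    ("High Level Design", 12,   107, 10),
    ("Detailed Design",   12,    47, 10),
    ("Implementation",    70, 13000, 18),
]

for name, expected, size, found in phases:
    density   = found / size        # actual defect density so far
    not_found = expected - found    # defects assumed still resident
    pct_found = found / expected    # share of expected defects already found
    print(f"{name:17s} DD={density:.2f}  left={not_found:2d}  found={pct_found:.0%}")

total_expected = sum(expected for _, expected, _, _ in phases)
total_found    = sum(found for *_, found in phases)
print(total_expected, total_found, total_expected - total_found)
```

A large "not found yet" count, as in the implementation row, is exactly the kind of early risk signal the model is meant to raise.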
  • 10. Measurements: Defect Detection
    Target detection rate: 70% documents, 60% code, 50% test!

      Phase              Avail. def.  Exp. det #  Goal %det  Exp. %det  Left  Act. det #  Act. det %  Cum %
      Specification                4           2        70%        50%     2           2         50%    50%
      High Level Design           14          11        70%        79%     3          11         79%    69%
      Detailed Design             15           6        70%        40%     9           6         40%    61%
      Implementation              79          40        60%        51%    39           6          8%    23%
      Unit test                   39           8        20%        21%    31           6         15%    30%
      Function test               31          15        50%        48%    16           3         10%    33%
      System Test                 16           9        50%        56%     7           3         19%    36%
      Network Test                 7           2        40%        29%     5           2         29%    14%
      Installation                 5           1        15%        20%     4           2         40%    12%
      First Customer               4           1        10%        25%     3           1         25%    10%
      Average/Total                           95                   42%                42                31%

    • Input data: number of defects expected to be detected & detection rate goal
    • Gathered data: actual defects & detection rate
    • Verify: detected defects, test progress
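The per-phase detection rate in the table above is simply defects detected divided by defects available entering that phase; whatever is not detected slips through to later phases. A minimal sketch, using three rows copied from the slide (variable names are assumptions):

```python
rows = [
    # (phase, defects available entering the phase, defects actually detected)
    ("Detailed Design", 15, 6),
    ("Implementation",  79, 6),
    ("Unit test",       39, 6),
]

for phase, available, detected in rows:
    rate = detected / available   # actual detection rate for this phase
    print(f"{phase:16s} detection rate {rate:.0%}")
```

Comparing these actual rates against the phase goals (70%/60%/50%) shows immediately where detection is falling short.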
  • 11. Usage of the Project Defect Model
    Steps:
    – Estimate # defects made, and where they are to be found
    – Collect data per phase (specification, design, implementation, etc.)
      • Input from inspections and test
      • Classify the introduction phase of each defect
    – Feedback to design/test and Project Management
    – Analysis of data that signals problems/risks
    Quality Engineer: measure, support, feedback
    Project Team: analyze, decide, act!
    Focus on application (using existing theory)
  • 12. Experiences in Pilot Project
    • Quality tracked during the project:
      – Specification defect slip-through: clarified requirements in feasibility
      – Design defects (inspection): re-enforced design rules
      – Code quality (inspection/test): base product risk, design rules
      – Test efficiency, defect slip-through: better inspection/unit test
      – Release quality per requirement: test focus, risk management
    • Prediction of the number of defects at First Customer Delivery and Release:
      – Decisions on delivery/release, design follow-up, and maintenance planning
      – Actual defects: expected 21, actual 14 (in the first 4 of 6 months)
    Pilot project defect detection rate: 95% (best in class)!
  • 13. Learnings from Pilot Project
    • Classification and analysis of defects with design & test leaders provided very valuable information.
    • Weekly feedback sessions with the Project Management Group have been essential for validating data, analysis, and conclusions, and for taking actions.
    • The model supported release decisions by providing defects-detected information. This was received as very beneficial, and is requested for all projects!
    • Though some model conclusions are not surprising, they would have been overlooked or discovered too late without the model.
  • 14. Conclusions
    The Project Defect Model helped the project to:
    – Estimate/track defects: improve product release quality, save time/cost
    – Track design/test progress: better planning, risk management, decisions
    Future:
    – Model used in several projects
    – Internal & industry data: better estimates
    – Exchange experiences with similar models?
    Questions?