The document describes a project defect model used at Ericsson to control quality and performance during development. The model tracks where defects are inserted (specification, design, implementation) and where they are detected (inspections and the test phases), which gives early signals about quality risks. On a pilot project it helped clarify requirements, re-enforce design rules, and focus testing, and it predicted the number of defects at release accurately enough to support planning. The model has since been adopted for several projects to improve quality tracking, risk management, and decision making.
Controlling Project during Development with a Defect Model, Ben Linders, European SEPG 2003
1. Controlling project performance using a defect model
ESEPG 2003 Conference, London
June 17: Measurement Symposium
Ben Linders
Operational Development & Quality
Ericsson R&D, The Netherlands
Ben.Linders@etm.ericsson.se, +31 161 24 9885
2. Overview
• Why a defect model?
• How does it work?
• Experiences from the pilot
• Conclusions
Measurements for product quality and process effectiveness
3. Ericsson, The Netherlands
• Main R&D Design Center
• Full product responsibility for Intelligent Networks
– Strategic Product Management
– Provisioning & total project management
– Development & maintenance
– Supply & support
• 1800 employees, of which 400 in R&D
Projects: Quality next to Lead-time and Costs
4. Purpose Project Defect Model
Why?
– to control quality of the developed product during a project
– and improve development/inspection/test processes
Business Benefit:
Better planning & tracking
Early risk signals
Save time and costs
Happy customers!
5. Why not Fault Density?
Drawbacks
– Difficult to plan with
– Only measurable after phase/project finished
– Provides no insight into the causes
Dilemma Fault Density:
High: Bad product, or effective testing?
Low: Good product, or insufficient testing?
Fault Density alone is insufficient for controlling quality during development
6. Defect Flow
• Prevent defect insertion
• Detect & remove defects where most economical
• Track design/test progress
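The second bullet is the economic core of the model: the later a defect is found, the more it costs to repair. A minimal sketch in Python; the relative cost figures are purely illustrative assumptions, not data from the slides:

    # Relative repair cost per detection phase; illustrative assumptions only.
    repair_cost = {
        "Inspection":    1,    # found at the desk, cheap to fix
        "Unit test":     5,
        "Function test": 10,
        "System test":   25,
        "Customer":      100,  # escaped to the field, most expensive
    }

    def total_repair_cost(defects_found):
        """Sum the relative repair cost for a {phase: count} defect profile."""
        return sum(repair_cost[phase] * count
                   for phase, count in defects_found.items())

    early = {"Inspection": 8, "Unit test": 2}
    late  = {"Function test": 5, "System test": 3, "Customer": 2}
    print(total_repair_cost(early), "vs", total_repair_cost(late))  # 18 vs 325

The same ten defects cost an order of magnitude more when they slip past inspection and unit test, which is why the model steers detection to the earliest economical phase.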
7. Process View
[Process diagram] Design and test processes with their inputs and outputs: defects inserted (documentation, code), defects detected (inspection, test), resident defects in the design base, resident defects in the delivered product, and (un)happy customers. Influencing factors: competence and skills, tools and environment, and (for test) test capacity. Measurements: detection rate, defect density, fault slip-through, defect level, defect classification.
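The slide names the measurements but does not define them; the sketch below gives common definitions, assuming straightforward counts (Python, illustrative only):

    def defect_density(defects_found, size):
        # Defects per unit of size: per page for documents, per KLOC for code.
        return defects_found / size

    def detection_rate(defects_found, defects_available):
        # Share of the defects present in a phase that its inspection/test removes.
        return defects_found / defects_available

    def fault_slip_through(defects_slipped, defects_inserted):
        # Share of a phase's inserted defects that escape to later phases.
        return defects_slipped / defects_inserted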
8. Planning & Tracking of Quality
• Plan Quality Up Front
– Documents/code (# defects made)
– Inspection & Test effectiveness (% detection rate)
Quality consequence of project decisions
• Track Quality during project
– Actual # defects found (inspection/test)
– Estimate remaining defects: to be found / delivered
Quality view of design/test progress
Quicker escalation of quality risks
Timely insight in Quality Issues!
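One way to read "plan quality up front" as a computation: expected insertions per phase plus detection-rate goals yield an expected defect profile for the whole project. A sketch, using the phase names, insertion counts, and goal percentages from the example tables on the next two slides:

    # Each phase: (name, expected defects inserted, detection rate goal).
    phases = [
        ("Specification",      4, 0.70),
        ("High Level Design", 12, 0.70),
        ("Detailed Design",   12, 0.70),
        ("Implementation",    70, 0.60),
        ("Unit test",          0, 0.20),
        ("Function test",      0, 0.50),
        ("System Test",        0, 0.50),
        ("Network Test",       0, 0.40),
        ("Installation",       0, 0.15),
        ("First Customer",     0, 0.10),
    ]

    resident = 0.0
    for name, inserted, goal in phases:
        resident += inserted       # defects made in this phase
        found = resident * goal    # expected yield of inspection/test
        resident -= found          # defects slipping to the next phase
        print(f"{name:18s} expect {found:5.1f} found, {resident:5.1f} remaining")

During the project, the actual counts replace the expectations phase by phase, and any gap between the two is an early quality signal.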
9. Measurements: Defect Insertion
• Input data: Expected # of defects inserted & expected size
• Gathered data: Actual defects & size
• Verify:
– # of defects not found
– Distribution of defects inserted over the phases
Defect insertion
Target Defect Density: max 1 major defect/page per document!
Phase             | Exp. #def | Exp. size | Exp. DD | Act. size | Found #def | Act. DD | Not found yet | % Found | % Exp. of total
Specification     |     4     |      10   |  0.4    |     10    |      4     |  0.40   |       0       |  100%   |        4%
High Level Design |    12     |     107   |  0.112  |    107    |     10     |  0.09   |       2       |   83%   |       12%
Detailed Design   |    12     |      47   |  0.255  |     47    |     10     |  0.21   |       2       |   83%   |       12%
Implementation    |    70     |   15000   |  4.667  |  13000    |     18     |  0.00   |      52       |   26%   |       71%
Total             |    98     |           |         |           |     42     |         |      56       |   43%   |      100%

(Sizes are pages for documents and lines of code for implementation; the implementation Expected DD is quoted per KLOC.)
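The derived columns follow directly from the raw counts. A sketch of the arithmetic (note the mixed units pointed out above):

    # (phase, expected defects, expected size, actual size, defects found)
    rows = [
        ("Specification",      4,    10,    10,  4),
        ("High Level Design", 12,   107,   107, 10),
        ("Detailed Design",   12,    47,    47, 10),
        ("Implementation",    70, 15000, 13000, 18),
    ]

    total_expected = sum(exp for _, exp, _, _, _ in rows)
    for phase, exp_def, exp_size, act_size, found in rows:
        act_dd    = found / act_size        # measured defect density
        not_found = exp_def - found         # estimated defects still resident
        pct_found = found / exp_def         # how much of the estimate surfaced
        pct_total = exp_def / total_expected
        print(f"{phase:18s} DD {act_dd:.2f}  not found {not_found:3d}"
              f"  found {pct_found:.0%}  of total {pct_total:.0%}")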
10. Measurements: Defect Detection
• Input data: Number of defects expected to be detected & detection rate goal
• Gathered data: Actual defects & detection rate
• Verify: Detected defects, test progress
Defect detection
Target detection rates: 70% for documents, 60% for code, 50% for test!
                  |            Expected in phase             |     Actual total
Phase             | Avail. def | Det # | Goal % | Det % | Left | Det # | Det % | Cum %
Specification     |      4     |   2   |  70%   |  50%  |   2  |   2   |  50%  |  50%
High Level Design |     14     |  11   |  70%   |  79%  |   3  |  11   |  79%  |  69%
Detailed Design   |     15     |   6   |  70%   |  40%  |   9  |   6   |  40%  |  61%
Implementation    |     79     |  40   |  60%   |  51%  |  39  |   6   |   8%  |  23%
Unit test         |     39     |   8   |  20%   |  21%  |  31  |   6   |  15%  |  30%
Function test     |     31     |  15   |  50%   |  48%  |  16  |   3   |  10%  |  33%
System Test       |     16     |   9   |  50%   |  56%  |   7  |   3   |  19%  |  36%
Network Test      |      7     |   2   |  40%   |  29%  |   5  |   2   |  29%  |  14%
Installation      |      5     |   1   |  15%   |  20%  |   4  |   2   |  40%  |  12%
First Customer    |      4     |   1   |  10%   |  25%  |   3  |   1   |  25%  |  10%
Average/Total     |            |  95   |        |  42%  |      |  42   |  31%  |
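The "verify" step can be automated: compare each phase's measured detection rate against its goal and flag shortfalls as slip-through risks. A sketch using the goal and actual percentages from the table above:

    goals   = {"Specification": 0.70, "High Level Design": 0.70,
               "Detailed Design": 0.70, "Implementation": 0.60,
               "Unit test": 0.20, "Function test": 0.50}
    actuals = {"Specification": 0.50, "High Level Design": 0.79,
               "Detailed Design": 0.40, "Implementation": 0.51,
               "Unit test": 0.21, "Function test": 0.48}

    for phase, goal in goals.items():
        actual = actuals[phase]
        flag = "ok" if actual >= goal else "RISK: defects slip to later phases"
        print(f"{phase:18s} goal {goal:4.0%}  actual {actual:4.0%}  {flag}")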
11. Usage of Project Defect Model
Steps:
– Estimate # defects made, and where to be found
– Collect data per phase (specification, design, implementation, etc)
• Input from inspections and test
• Classify each defect by the phase that introduced it (as sketched below)
– Feedback to design/test and Project Management
– Analyze data that signals problems/risks
Quality Engineer: Measure, support, feedback
Project Team: Analyze, decide, act!
Focus on application (using existing theory)
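The classification step produces a phase-introduced versus phase-detected matrix, from which fault slip-through falls out directly. A minimal sketch with hypothetical defect records:

    from collections import Counter

    # Hypothetical defect records: (phase introduced, phase detected).
    defects = [
        ("Specification",     "Specification"),
        ("Specification",     "Function test"),
        ("High Level Design", "Detailed Design"),
        ("Implementation",    "Unit test"),
        ("Implementation",    "Function test"),
    ]

    matrix = Counter(defects)
    slipped = sum(n for (intro, det), n in matrix.items() if intro != det)
    print(f"fault slip-through: {slipped} of {len(defects)} defects "
          "found later than the phase that introduced them")
    for (intro, det), n in sorted(matrix.items()):
        print(f"  {n} introduced in {intro}, detected in {det}")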
12. Experiences in Pilot Project
• Quality tracked during the project:
– Specification defects slip through: Clarified requirements in feasibility
– Design defects (inspection): Re-enforced design rules
– Code quality (inspection/test): Base Product risk, design rules
– Test efficiency, defect slip through: Better inspection/Unit Test
– Release Quality per requirement: Test focus, risk management
• Predicted number of defects at First Customer Delivery and Release:
– Decisions on delivery/release, design follow up and maintenance planning
– Actual defects: Expected 21, actual 14 (in first 4 of 6 months)
Pilot Project Defect Detection rate: 95% (best in class)!
13. Learnings from the Pilot Project
• Classification and analysis of defects with the Design and Test Leaders provided very valuable information.
• Weekly feedback sessions with the Project Management Group have been essential for validating data, analyses, and conclusions, and for taking action.
• The model supported release decisions by providing defects-detected information.
This was seen as very beneficial, and is now requested for all projects!
• Though some model conclusions are not surprising, they would have
been overlooked or discovered too late without the model.
14. Conclusions
Project Defect Model helped the project to:
– Estimate/track defects: Improve product release quality, save time/cost
– Design/test progress: Better planning, risk management, decisions
Future
– Model used in several projects
– Internal & Industry data: Better estimates
– Exchange experiences with similar models?
Questions?