Release Readiness Measurement:
A Comparison of Best Practices

8th Malaysian Software Engineering Conference (MySEC 2014)
Resort World Langkawi, Malaysia
24-09-2014

Nico Koprowski, M. Firdaus Harun, Horst Lichter
(Universiti Teknologi Malaysia)
• Release Readiness
• Measurement for Release Readiness
 Defect Tracking
 Software Readiness Index
 ShipIt
• Discussion
• Comparison
• Summary
Introduction
2
Software Readiness: a trade-off between schedule and cost
When to release the software product?
Release Readiness
3
Graph: Release Readiness over Time – when is the time of release? 100% finished.
Is a software product ready to be released?
Defect Tracking
4
Q: When is a software product ready to be released?
Defect Tracking: When the number of remaining defects is sufficiently low!
Reliability:
The probability of executing a software system without failure for a
specified time period.
Defect Density (DD)
5
History:
  V1.0: LOC 100K, Defects 700, DD 7/KLOC
  V2.0: LOC 50K,  Defects 475, DD 9.5/KLOC
  V3.0: LOC 100K, Defects 600, DD 6/KLOC
Present:
  LOC 140K, Defects / DD: ?
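The slide's numbers can be plugged into a short sketch. Note that averaging the historical densities is an illustrative assumption on my part; the slides only show the per-release densities and leave the prediction scheme open:

```python
# Estimate expected defects for a new release from historical defect density.
# Release data is taken from the slide (V1.0-V3.0); the simple averaging of
# past densities is an illustrative assumption, not part of the original method.

history = [
    {"version": "V1.0", "kloc": 100, "defects": 700},   # DD = 7.0/KLOC
    {"version": "V2.0", "kloc": 50,  "defects": 475},   # DD = 9.5/KLOC
    {"version": "V3.0", "kloc": 100, "defects": 600},   # DD = 6.0/KLOC
]

def defect_density(kloc, defects):
    """Defect density in defects per KLOC."""
    return defects / kloc

avg_dd = sum(defect_density(r["kloc"], r["defects"]) for r in history) / len(history)

new_kloc = 140  # size of the current release ("Present" on the slide)
expected_defects = avg_dd * new_kloc
print(f"average DD: {avg_dd:.1f}/KLOC, expected defects: {expected_defects:.0f}")
```

As the speaker notes point out, the more historical data feeds the average, the more confidence one can place in the pre-release defect density target.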
Defect Pooling
6
Venn diagram: Team A's and Team B's findings overlap in common findings; each team also has unique findings.
The more unique findings, the more defects remain.
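The overlap-based reasoning behind defect pooling is the classic two-sample capture-recapture (Lincoln-Petersen) estimate. A minimal sketch, with hypothetical finding counts:

```python
# Capture-recapture estimate behind defect pooling: two teams test the same
# product independently, and the overlap between their findings estimates the
# total defect count. The finding counts below are hypothetical.

def pooled_estimate(found_a, found_b, common):
    """Lincoln-Petersen estimator: total defects ~= (A * B) / common."""
    if common == 0:
        raise ValueError("no common findings: the estimate is unbounded")
    return found_a * found_b / common

found_a, found_b, common = 50, 40, 20          # team A, team B, found by both
total = pooled_estimate(found_a, found_b, common)
found_overall = found_a + found_b - common     # distinct defects found so far
remaining = total - found_overall
print(f"estimated total: {total:.0f}, still undetected: {remaining:.0f}")
```

A small overlap relative to each team's unique findings drives the total estimate up, which matches the slide's rule of thumb: the more unique findings, the more defects remain.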
Defect Seeding
7
Diagram: one team seeds defects into the software product; the other team then tries to detect them.
Idea: the rate of remaining real defects is proportional to the rate of seeded defects detected.
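The proportionality idea translates directly into an estimator. A sketch with hypothetical defect counts:

```python
# Defect seeding: plant a known number of artificial defects, then assume the
# detection rate for seeded defects matches the detection rate for real
# (indigenous) defects. All counts below are hypothetical.

def seeding_estimate(seeded_total, seeded_found, real_found):
    """Estimate the total number of real defects from the seeded detection rate."""
    detection_rate = seeded_found / seeded_total
    return real_found / detection_rate

total_real = seeding_estimate(seeded_total=25, seeded_found=20, real_found=160)
remaining = total_real - 160   # estimated real defects not yet detected
print(f"estimated real defects: {total_real:.0f}, remaining: {remaining:.0f}")
```

Here 20 of 25 seeded defects were found (80%), so the 160 real defects found are assumed to be 80% of roughly 200 in total.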
Architectural Defect Tracking (ADT)
8
Presentation Tier
Business Tier
Data Access Tier
N-tier architecture, with parameters per tier:

User Interfaces (Presentation Tier):
• # UIs
• # UI messages

Classes (Business Tier):
• # Parents
• # Children
• Depth of Inheritance
• Coupling
• …

SQL (Data Access Tier):
• # Selects
• # Insert/Updates
• # Deletes
• # Sub-queries
• …
Software Readiness Index (SRI)
9
Q: When is a software product ready to be released?
SRI: When it obtains the desired amount of quality!
Quality attributes: Reliability, Functionality, Efficiency, Usability, …
SRI Criteria
10
Taken from: A. Asthana and J. Oliveri, "Quantifying Software Reliability and Readiness".
Thresholds
11
Each criterion's measurement (0%–100%) falls into one of three bands:
Green  – Good
Yellow – Ok, Sufficient
Red    – Bad, Not tolerable
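The traffic-light rating can be sketched as a simple mapping. The band boundaries below are illustrative assumptions; the slides leave the concrete thresholds to be decided per criterion:

```python
# Map a criterion's measurement (0-100%) onto the green/yellow/red bands
# from the thresholds slide. The boundary values are hypothetical; SRI lets
# each criterion define its own thresholds.

def classify(measurement, yellow_at=60.0, green_at=85.0):
    """Return the traffic-light rating for one SRI criterion."""
    if measurement >= green_at:
        return "green"    # good
    if measurement >= yellow_at:
        return "yellow"   # ok, sufficient
    return "red"          # bad, not tolerable

for value in (92.0, 70.0, 40.0):
    print(f"{value:5.1f}% -> {classify(value)}")
```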
ShipIt
12
Q: When is a software product ready to be released?
ShipIt: When the overall development progress is sufficiently advanced!
Requirements Design Testing
Time
Progress:
The ratio of the already spent effort to the overall planned effort
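The progress definition above is a straight ratio and can be aggregated over phases. A minimal sketch; the phase names follow the slide's timeline, while the effort figures (person-days) are hypothetical:

```python
# ShipIt progress as defined on the slide: the ratio of effort already spent
# to the overall planned effort, here aggregated over development phases.
# Phase names follow the slide; the person-day figures are hypothetical.

phases = {
    "requirements": {"spent": 30, "planned": 30},
    "design":       {"spent": 40, "planned": 50},
    "testing":      {"spent": 10, "planned": 40},
}

spent = sum(p["spent"] for p in phases.values())       # 80 person-days
planned = sum(p["planned"] for p in phases.values())   # 120 person-days
progress = spent / planned
print(f"overall progress: {progress:.0%}")
```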
ShipIt Criteria
13
Requirements
Coding
Testing
Quality
Documentation
Supervision
Support
Gathered, Analysed and Designed
Modules, Objects coded; Build Times
Test Coverage; Open issues
Zero Failure Test hours; COCOMO
Requirements, Design, Code; Test Plan, User Guide
Installation and Training
Beta Test Bugs
Discussion: Defect Tracking
14
Scope
Simplicity
Availability
Universality
Concreteness
Defect Density
Defect Pooling/Seeding
Architectural Defect Tracking
• Concrete methods
• No support for decision making
ADT only for n-tier architectures and OOE
Simple methods and criteria
Only Reliability
When Testing
Discussion: SRI
15
Scope
Simplicity
Availability
Universality
Concreteness
Criteria with thresholds
No restrictions
When Coding
• Many criteria
• But: light-weight version available
Quality including Reliability
Discussion: ShipIt
16
Scope
Simplicity
Availability
Universality
Concreteness
• Some criteria hard to measure
• Fuzzy about quality
From beginning of the project
Many criteria and subcriteria
Whole Progress including Quality and Reliability
Only suitable for waterfall model
Comparison
17
Defect Tracking:
• Least release criteria
• Used in final steps
• ADT where applicable

SRI:
• Universally applicable
• Most comprehensive and concrete
• Thresholds

ShipIt:
• Broadest scope
• Good for progress communication
• Less concrete

Dimensions: Availability, Universality, Concreteness, Simplicity, Scope
Summary
18
Question: Is the software product ready to be released?
Suggestion: Quantify release readiness via quality metrics (applicable to all software designs) and project progress (for any software development process), i.e. a holistic approach.
1. Reliability measurement with Defect Tracking: fewest release criteria
2. Quality measurement with SRI: most comprehensive approach
3. Progress measurement with ShipIt: most complex approach
The End
19
Thanks for your attention!


Editor's Notes

  • #3 Software Readiness: Software companies aggressively look for ways to deliver a finished software product at the right time, weighing schedule against cost. Delivering very high-quality software may take a long time, overrun the schedule, or increase the cost of hiring developers; conversely, delivering on time may mean dropping important features, which leads to low-quality software.
  • #4 This property is called 'Release Readiness'. The graph shows that when releasing a software product we cannot complete all the features promised to the customer, while at the same time we must respect the delivery date. Therefore, a number of approaches exist to help the release manager or lead developer decide when to release software to the customer, considering quality and cost.
  • #6 DD – number of defects per KLOC. Point: the more historical data you have, the more confident you can be in your pre-release defect density targets.
  • #7 Points: the two teams are chosen arbitrarily but must be distinct, operate independently, and test the full scope of the product.
  • #9 ADT – estimates and predicts defects in each layer using neural-network prediction models. Two 2-layered neural-network predictive models predict the defect class in each tier: a Kohonen network (Ready / Not Ready based on defect class) and a general regression neural network (LOC changed versus the time required to change).
  • #11 Points: SRI contains 5 vectors, each comprising different variables. Each vector is computed as a weighted sum of its constituent variables, and the magnitude of each vector is normalized to 1.0. The weights can be chosen according to the goal, e.g. assessing release readiness or measuring risk in a software development project. The combined vector values (additive, multiplicative, or a hybrid of both) indicate whether the software is Ready to Go, Go with Conditions, or No Decision; the decision depends on the threshold values chosen for each vector (next slide).
  • #14 Points: 7 components with their sub-components; adopts the waterfall model; accumulates the values from all components.
  • #20 Broader scope – the measurement covers reliability, quality and progress.