 

BT8
Concurrent Session 
11/14/2013 2:15 PM 
 
 
 
 
 

"How to (Effectively) Measure
Quality across Software
Deliverables"
 
 
 
 

Presented by:
David Herron
DCG
 
 
 
 
 

Brought to you by: 
 

 
 
340 Corporate Way, Suite 300, Orange Park, FL 32073 
888‐268‐8770 ∙ 904‐278‐0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
David Herron
DCG

One of the original founders of the David Consulting Group, David
Herron is now a business development manager and VP of knowledge
solution services with DCG. With more than thirty-five years of experience
in functional measurement and software process improvement, David has
provided consulting and coaching services to a variety of IT organizations
throughout the US and Canada. He is an acknowledged authority in
performance measurement, process improvement, and organizational
change management; an advisor on functional measures, software process
improvement, project estimating, and agile; and a lecturer and coauthor of
several books on IT performance measurement. Contact him
at dherron@davidconsultinggroup.com.
 
 
Better Software Conference 2013
How to (Effectively) Measure Quality
Across Software Deliverables
Presenter:
David Herron
dherron@davidconsultinggroup.com
Defining Software Quality
• How do you define software quality in your
organization?

Software Quality Defined
• Absence of defects
• Conformance to requirements
• Meets certification standards
• Maintainable
• Scalable
• Reliable
• Usable
• Secure
Tracking Software Quality
• Mr. I. M. A. Pib is upset. He is the VP of the Store Systems Division.
• He has just seen the first-quarter dashboard of results, and his #1 priority project, Store Inventory, has the greatest number of defects.
• Here is what was reported to Mr. Pib:

  Project          Delivery   Cost (000's)   Quality (Defects)
  PO Special       On Time    $500           12
  Vendor Mods      Late       $760           18
  Pricing Adj.     Early      $80            5
  Store Inventory  On Time    $990           22

• You are the development manager. How might you respond to Mr. Pib? Do we have all the information we need to properly evaluate these outcomes?
Tracking Software Quality
• Size (value) can serve as a normalizing metric.
• A cost per unit of work (Rate) can now be calculated.
• Defect Density* for Mr. I. M. A. Pib's project is in fact the lowest of all his projects.

  Project          Delivery   Cost (000's)   Defects   Size (FP)   Cost per FP (Rate)   Defect Density
  PO Special       On Time    $500           12        250         $2,000.00            0.048
  Vendor Mods      Late       $760           18        765         $993.46              0.024
  Pricing Adj.     Early      $80            5         100         $800.00              0.050
  Store Inventory  On Time    $990           22        1498        $660.88              0.015

* Defect Density is calculated as defects / size.
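To make the normalization concrete, here is a minimal Python sketch (the variable names and output format are illustrative, not from the deck) that reproduces the Rate and Defect Density columns above:

```python
# Normalize raw defect counts and cost by functional size (function points).
# Project data taken from the table above.
projects = [
    # (name, cost in dollars, defects, size in function points)
    ("PO Special",      500_000, 12,  250),
    ("Vendor Mods",     760_000, 18,  765),
    ("Pricing Adj.",     80_000,  5,  100),
    ("Store Inventory", 990_000, 22, 1498),
]

for name, cost, defects, size in projects:
    rate = cost / size          # cost per function point
    density = defects / size    # defect density = defects / size
    print(f"{name:16s} ${rate:8.2f}/FP  density {density:.3f}")

# Store Inventory shows the highest raw defect count but the lowest
# density (0.015), which is the point of the slide: raw counts mislead
# without a size normalizer.
```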
Size Does Matter
Finding – Nine out of ten projects that fail have not been properly sized.

Consider: When you build a house, you specify all the functions and features you want – these are your requirements. The builder then generates an estimate based on the size (square footage) of your requirements.

• Size is the key to effectively managing software projects.
Characteristics of an Effective Sizing Metric
• Meaningful to both developer and business user
• Defined (industry recognized)
• Consistent (methodology)
• Easy to learn and apply
• Accurate, statistically based
• Available when needed (early)

Function Points - An Effective Sizing Metric
Function Point Analysis is a standardized
method for measuring the functionality
delivered to an end user.

Benefits:
• Quantitative (Objective) Measure
• Industry Data as Basis for Comparison
• Expectations (Perceived Customer Value) Managed
• Software Process Improvement Requirements Satisfied

The Function Point Methodology
Five key components are identified based on the logical user view:
• External Inputs
• External Outputs
• External Inquiries
• Internal Logical Files
• External Interface Files

[Diagram: the application boundary, with External Inputs, External Outputs, and External Inquiries crossing into the application, Internal Logical Files maintained inside the boundary, and an External Interface File referenced outside it.]
What Do We Count?
[Diagram: what is counted around the application boundary]
• Input files and input transactions
• Screens (adds, changes, deletes, queries)
• Output files and output transactions (batch interfaces)
• Other outputs: reports, files, XML, views, fiche, tape, diskettes, letters, notices, alarms
• Internal logical files (tables, data files, control files)
• Control information
• External tables and files referenced from other applications (not maintained)
How Do We Count?
• Identify and classify the base functional components
  – Measure the data functions
    • Internal groupings of data, called Internal Logical Files (ILF)
    • External groupings of data, or External Interface Files (EIF)
  – Measure the transactional functions
    • External Inputs (EI)
    • External Outputs (EO)
    • External Inquiries (EQ)
  – Each function is assigned a functional complexity (Low-Average-High) and a weight (FPs)
• Calculate the functional size
• Document the function point count
• Report the result of the function point count
Component Complexity & Weights
Complexity calculations are a function of the number of data elements, the files referenced, and data complexity.

  Component                        Low       Avg       High      Total
  Internal Logical File (ILF)      __ x 7    __ x 10   __ x 15   ___
  External Interface File (EIF)    __ x 5    __ x 7    __ x 10   ___
  External Input (EI)              __ x 3    __ x 4    __ x 6    ___
  External Output (EO)             __ x 4    __ x 5    __ x 7    ___
  External Inquiry (EQ)            __ x 3    __ x 4    __ x 6    ___
                                             Total Function Points ___

Complexity is read from a matrix of data relationships (record element types or file types referenced) against data elements (# of unique data fields):

  RETs / FTRs    Data Elements (# of unique data fields)
                 1-4        5-15       16+
  0-1            Low        Low        Average
  2              Low        Average    High
  3+             Average    High       High
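A minimal Python sketch of the counting steps, using the weights above. Note two assumptions: the single complexity matrix shown on the slide is applied to every component type here (a simplification of full IFPUG rules, which use different thresholds for data functions), and the component counts are made-up example inputs.

```python
# IFPUG-style weights from the table above: component -> (Low, Avg, High).
WEIGHTS = {
    "ILF": (7, 10, 15),
    "EIF": (5, 7, 10),
    "EI":  (3, 4, 6),
    "EO":  (4, 5, 7),
    "EQ":  (3, 4, 6),
}

def complexity(ftrs: int, dets: int) -> int:
    """Return 0 (Low), 1 (Average), or 2 (High) from the matrix above."""
    row = 0 if ftrs <= 1 else 1 if ftrs == 2 else 2   # files referenced
    col = 0 if dets <= 4 else 1 if dets <= 15 else 2  # data elements
    matrix = [[0, 0, 1],   # FTRs 0-1
              [0, 1, 2],   # FTRs 2
              [1, 2, 2]]   # FTRs 3+
    return matrix[row][col]

# Hypothetical inventory: (component type, files referenced, data elements).
functions = [("EI", 2, 6), ("EO", 3, 16), ("EQ", 1, 4), ("ILF", 1, 10)]

# Steps 1-4: identify, assess complexity, apply weightings, compute size.
total = sum(WEIGHTS[kind][complexity(ftrs, dets)]
            for kind, ftrs, dets in functions)
print(f"Function point size: {total}")   # 4 + 7 + 3 + 7 = 21
```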
The Counting Process
The Process
1) Identify Components
2) Assess Complexity
3) Apply Weightings
4) Compute Function Points

  Component                        Low       Avg       High      Total
  Internal Logical File (ILF)      __ x 7    __ x 10   __ x 15   ___
  External Interface File (EIF)    __ x 5    __ x 7    __ x 10   ___
  External Input (EI)              __ x 3    __ x 4    __ x 6    ___
  External Output (EO)             __ x 4    __ x 5    __ x 7    ___
  External Inquiry (EQ)            __ x 3    __ x 4    __ x 6    ___
Identifying the Functionality

[Context diagram: a Purchase Order System and its users. Users add and change invoices, record payments, and query purchase order information and payment status; invoices and payments flow to and from a Vendor; paid invoices are passed to the Accounts Payable application.]
Sizing Example
The Process
1) Identify Components
2) Assess Complexity
3) Apply Weightings
4) Compute Function Points

[Context diagram: the same Purchase Order System, with its users, the Vendor, and Accounts Payable, used here as the system to be sized.]

  Component                        Low       Avg       High
  Internal Logical File (ILF)      __ x 7    __ x 10   __ x 15
  External Interface File (EIF)    __ x 5    __ x 7    __ x 10
  External Input (EI)              __ x 3    __ x 4    __ x 6
  External Output (EO)             __ x 4    __ x 5    __ x 7
  External Inquiry (EQ)            __ x 3    __ x 4    __ x 6

  Total = Function Point Size
Function Point Quality Measures
• Defect Density
  – Measures the number of defects identified across one or more phases of the development project lifecycle and compares that value to the total size of the application.
    Defect Density = Number of defects (by phase or in total) / Total number of function points

• Test Case Coverage
  – Measures the number of test cases that are necessary to adequately support thorough testing of a development project.
    Test Case Coverage = Number of test cases / Number of function points
Function Point Quality Measures
• Reliability
  – A measure of the number of failures an application experiences relative to its functional size.
    Reliability = Number of production failures / Total application function points

• Rate of Growth
  – Growth of an application's functionality over a specified period of time.
    Rate of Growth = Current number of function points / Original number of function points

• Stability
  – Used to monitor how effectively an application or enhancement has met the expectations of the user.
    Stability = Number of changes / Number of application function points
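Each of these FP-normalized measures is a simple ratio; a minimal sketch with illustrative inputs (none of these numbers come from the deck):

```python
# Function-point-normalized quality ratios from the two slides above.
# All inputs are illustrative; in practice they come from defect logs,
# test management tools, and the FP repository.
fp_total = 1498                        # application size in function points
defects = 22                           # defects found (by phase or in total)
test_cases = 300                       # test cases written for the project
prod_failures = 4                      # production failures in the period
fp_original, fp_current = 1200, 1498   # size at delivery vs. today
changes = 36                           # change requests against the application

defect_density = defects / fp_total
test_case_coverage = test_cases / fp_total
reliability = prod_failures / fp_total
rate_of_growth = fp_current / fp_original
stability = changes / fp_total

print(f"density {defect_density:.3f}, coverage {test_case_coverage:.2f}, "
      f"reliability {reliability:.4f}, growth {rate_of_growth:.2f}, "
      f"stability {stability:.3f}")
```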
Measures of Quality
Defect Removal Efficiency – Used to evaluate the effectiveness of development quality activities
Defect Density – Used to evaluate the overall quality of the developed software
Delivered Defects – Used to evaluate the quality of the delivered software
Test Cases Passed First Time – Used to determine the quality of software being tested
Inspection Rate by Document – Used to determine if inspections positively impact quality
Volatility – Used to monitor trends in the number of changes per month
Non-FP Quality Measures
Defect Removal Efficiency
Tracks the number of defects removed by lifecycle phase.

  Range                <----- Peer Reviews ----->   <--------- Testing --------->
  Phase                Reqs.   Design   Code        Unit Test   Sys. Test   UAT      Prod    Total
  Insertion Rate       21      30       35          17          11          3        -       117
  Defects Found        5       16       27          31          24          12       2       117
  Removal Efficiency   4.3%    13.7%    23.1%       26.5%       20.5%       10.3%    1.7%

  Review Effectiveness: 41.0%        Test Effectiveness: 57.3%

Customer Satisfaction
Gather information relating to delivery performance, communication, management, solutions, etc., and the level of importance of each.
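Removal efficiency per phase is that phase's share of all defects found; a short sketch reproducing the table's percentages:

```python
# Defect removal efficiency by phase, from the table above.
phases = ["Reqs.", "Design", "Code", "Unit Test", "Sys. Test", "UAT", "Prod"]
found  = [5, 16, 27, 31, 24, 12, 2]           # defects found per phase
total = sum(found)                            # 117

for phase, n in zip(phases, found):
    print(f"{phase:10s} {n / total:6.1%}")    # e.g. Code -> 23.1%

# Group effectiveness: share of all defects removed by reviews vs. testing.
review_eff = sum(found[:3]) / total           # Reqs+Design+Code -> 41.0%
test_eff   = sum(found[3:6]) / total          # Unit+Sys+UAT     -> 57.3%
print(f"Review effectiveness {review_eff:.1%}, test effectiveness {test_eff:.1%}")
```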
A Measurement Baseline Model
[Diagram: QUANTITATIVE measures (Size, Effort, Duration, Cost, Quality) produce Measured Performance – they measure how you are doing. QUALITATIVE factors (Management, Requirements, Build, Test, Environment) produce a Capability Maturity profile – they identify what you are doing. Together they establish a Baseline of Performance, the standard of performance.]
Baseline Results: Example
• Small size projects are the norm.
• Performance levels vary across all projects.
• The extent of variation is greater than desired.
• Variation potentially driven by mixing support and development tasks.
• Duration on small projects reflects industry norms.
• Relatively high degree of consistency seen in duration data suggests a basis for an estimation model.
• Size to duration relationship suggests that current methods are scalable.

[Charts: "Delivery Rate" – project size vs. productivity (hrs/FP) scatter for projects A-L; "Time to Market" – project size vs. duration (months) scatter for the same projects.]
Quantitative Performance Evaluation Example
QUANTITATIVE
Size
Effort
Duration
Cost
Quality
Measured
Performance

Quantitative Assessment
 Perform functional sizing on all selected projects.
 Collect data on project level of effort, cost, duration
and quality.
 Calculate productivity rates for each project, including
functional size delivered per staff month, cost per
functional size, time to market, and defects delivered.

Baseline Results (Baseline Productivity):
  Average Project Size               133
  Average FP/SM                      10.7
  Average Time-To-Market (Months)    6.9
  Average Cost/FP                    $939
  Delivered Defects/FP               0.0301
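A sketch of the rollup described above, under assumed per-project inputs (the sizes, efforts, durations, costs, and defect counts below are invented for illustration):

```python
# Roll per-project measures up into baseline averages (illustrative data).
# Each record: (size in FP, effort in staff months, duration in months,
#               cost in dollars, delivered defects).
projects = [
    (250, 24, 7.0, 230_000, 6),
    (100, 10, 5.5,  95_000, 3),
    ( 50,  5, 8.2,  48_000, 2),
]

n = len(projects)
avg_size = sum(p[0] for p in projects) / n
avg_fp_per_sm = sum(p[0] / p[1] for p in projects) / n    # FP per staff month
avg_ttm = sum(p[2] for p in projects) / n                 # time to market
avg_cost_per_fp = sum(p[3] / p[0] for p in projects) / n
defects_per_fp = sum(p[4] for p in projects) / sum(p[0] for p in projects)

print(f"size {avg_size:.0f} FP, {avg_fp_per_sm:.1f} FP/SM, "
      f"TTM {avg_ttm:.1f} mo, ${avg_cost_per_fp:.0f}/FP, "
      f"{defects_per_fp:.4f} defects/FP")
```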
Qualitative Performance Evaluation
Qualitative Assessment
 Conduct Interviews with members of each project team.
 Collect Project Profile information.
 Develop Performance Profiles to display strengths and
weaknesses among the selected projects.

QUALITATIVE
Management
Requirements
Build
Test
Environment

Capability
Maturity

Profile categories and the factors assessed in each:

Management: Team Dynamics, Morale, Project Tracking, Iteration Planning, Release Planning, Automation, Leadership Skills

Definition: Evolutionary Requirements Process, Product Owner Involvement, Experience Levels, Business Impact

Design: Process, Reviews, Design Reuse, Customer Involvement, Experience, Automation

Build: Code Reviews, Configuration Management, Code Reuse, Data Administration, Experienced Staff, Automation

Test: Formal Testing Methods, Test Plans, Testing Experience, Effective Test Tools, Customer Involvement

Environment: New Technology, Automated Process, Adequate Training, Organizational Dynamics, Certification
Modeled Improvements
Before improvement – profile scores by project (Profile Score, then category scores for Management, Definition, Design, Build, Test, Environment):

  Project Name           Profile   Mgmt    Defn    Design   Build   Test    Envir
  Accounts Payable       55.3      47.73   82.05   50.00    46.15   43.75   50.00
  Priority One           27.6      50.00   48.72   11.36    38.46    0.00   42.31
  HR Enhancements        32.3      29.55   48.72    0.00    42.31   37.50   42.31
  Client Accounts        29.5      31.82   43.59    0.00    30.77   37.50   42.31
  ABC Release            44.1      31.82   53.85   34.09    38.46   53.13   42.31
  Screen Redesign        17.0      22.73   43.59    0.00    15.38    0.00   30.77
  Customer Web           40.2      45.45   23.08   38.64    53.85   50.00   34.62
  Whole Life             29.2      56.82   28.21   22.73    26.92   18.75   53.85
  Regional - East        22.7      36.36   43.59    0.00    30.77    9.38   30.77
  Regional - West        17.6      43.18   23.08    0.00    26.92    9.38   26.92
  Cashflow               40.6      56.82   71.79    0.00    38.46   43.75   38.46
  Credit Automation      23.5      29.55   48.72    0.00    38.46    6.25   26.92
  NISE                   49.0      38.64   56.41   52.27    30.77   53.13   53.85
  Help Desk Automation   49.3      54.55   74.36   20.45    53.85   50.00   38.46
  Formula One Upgrade    22.8      31.82   38.46    0.00    11.54   25.00   46.15

  Baseline Productivity:
    Average Project Size               133
    Average FP/SM                      10.7
    Average Time-To-Market (Months)    6.9
    Average Cost/FP                    $939
    Delivered Defects/FP               0.0301

Performance Improvements:
  Productivity ~ +131%
  Time to Market ~ -49%
  Defect Ratio ~ -75%

Process Improvements:
• Peer Reviews
• Requirements Management
• Configuration Management

After improvement – profile scores by project:

  Project Name           Profile   Mgmt    Defn    Design   Build   Test    Envir
  Accounts Payable       75.3      61.73   82.05   60.00    60.15   53.75   50.00
  Priority One           57.6      57.00   55.72   18.36    45.46   22.00   49.31
  HR Enhancements        52.3      32.55   51.72   23.00    42.31   57.50   49.31
  Client Accounts        69.5      53.82   65.59   12.00    50.77   67.50   49.31
  ABC Release            74.1      55.82   69.85   49.09    52.46   63.13   49.31
  Screen Redesign        67.0      43.73   63.59   21.00    36.38   20.00   51.77
  Customer Web           59.2      49.45   27.08   58.64    53.85   54.00   49.62
  Whole Life             50.2      49.82   32.21   27.73    31.92   24.75   53.85
  Regional - East        57.7      59.36   49.59    0.00    30.77    9.38   50.77
  Regional - West        52.6      55.18   30.08    0.00    33.92   19.38   26.92
  Cashflow               67.6      66.82   71.79    0.00    49.46   53.75   49.46
  Credit Automation      60.5      41.55   78.72    0.00    50.46   26.25   46.92
  NISE                   79.0      68.64   76.41   62.27    65.77   53.13   53.85
  Help Desk Automation   79.3      64.55   74.36   47.45    63.85   54.00   58.46
  Formula One Upgrade    52.8      49.82   52.46    0.00    31.54   25.00   56.15

  Productivity Improvement:
    Average Project Size               133
    Average FP/SM                      24.8
    Average Time-To-Market (Months)    3.5
    Average Cost/FP                    $467
    Delivered Defects/FP               0.0075
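The headline improvements follow directly from the two productivity summaries; a quick check:

```python
# Verify the improvement figures from the baseline vs. improved summaries.
baseline = {"fp_per_sm": 10.7, "ttm_months": 6.9, "defects_per_fp": 0.0301}
improved = {"fp_per_sm": 24.8, "ttm_months": 3.5, "defects_per_fp": 0.0075}

def pct_change(before, after):
    return (after - before) / before * 100

print(f"Productivity:   {pct_change(baseline['fp_per_sm'], improved['fp_per_sm']):+.0f}%")          # ~ +131%
print(f"Time to Market: {pct_change(baseline['ttm_months'], improved['ttm_months']):+.0f}%")        # ~ -49%
print(f"Defect Ratio:   {pct_change(baseline['defects_per_fp'], improved['defects_per_fp']):+.0f}%") # ~ -75%
```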
Overall Information Framework
[Executive Management Dashboard (example): a milestone table tracks Baseline, Plan, and Actual dates with percent variance for each milestone – Vendor Selection Complete; Checkpoint A – Charter & Kickoff; PMP/Schedule Complete; Requirements Complete; Checkpoint B – Planning & Reqs; Design Complete; Development Complete; Checkpoint C – Midpoint; Testing Complete; Training Complete; Go Live; Lessons Learned/Customer Satisfaction Survey Complete; Checkpoint D – Deploy & Close. Monthly charts for 2008 show Project Resource and Effort Status (cumulative planned effort, cumulative actual effort, "earned value," and baseline total hours), Project Defect Status (total defects discovered vs. total defects closed), and Requirements Growth and Stability (requirements added, changed, deleted, and total).]

[Framework diagram: an enterprise process-management cycle (Define, Execute Process, Measure, Control, Improve) feeds a Measurement Repository – an enterprise database holding process measures, historical measures, PAL, baselines, and project estimates – which in turn supports performance measures and business decisions. An accompanying scorecard rates projects (BI Product Releases Q2 2007; EDW Phase IV: Applicant Tracking System; CRM Product Maintenance Releases Q3 2007; Road to 90: In Bound; SAR PM 2.0; Meetings/Teleconf. vendor selection; Web v2.2 (EPN); Q3 2007 Web v2.1 Enhancements/Maintenance; Web v2.2 Enhancements/Maintenance Q4 2007; CoBRA Application; Web 2.1; Web 2.0 Q1 Maintenance) on Management, Requirements, Design, Build, Test, and Environment, alongside Project X/Y/Z comparisons.]
Dashboard / Service Levels
For each measure: calculation, notes, a worked example, then the organization median, the goal by 2012, and the industry median (industry data drawn primarily from Level 3 organizations).

• Estimating Accuracy – Effort
  Calculation: (actual labor hours - estimated) / estimated. Positive values represent overruns; negative, underruns.
  Example: (1000 - 500) / 500 = +100% overrun.
  Median +22% | Goal 0% | Industry 18%

• Estimating Accuracy – Schedule
  Calculation: (actual calendar months - estimated) / estimated.
  Example: (4 - 3) / 3 = +33% overrun.
  Median +21% | Goal 0% | Industry 18%

• Productivity
  Calculation: function points / labor months. Varies with project size.
  Example: 100 FPs / 4 staff months = 25.
  Median 17 | Goal 26 | Industry 20

• Unit Cost
  Calculation: dollars / function points. Dollars are estimated from labor hours at $110 per hour and 145 hours per staff month.
  Example: $200,000 / 100 = $2,000.
  Median $938 | Goal $613 | Industry $800

• System Delivery Rate
  Calculation: function points / calendar months. The QSM industry value is a mean; a median is not available.
  Example: 100 FPs / 2 calendar months = 50.
  Median 32 | Goal 49 | Industry 40

• Requirements Volatility
  Calculation: (added + changed + deleted) / total baselined requirements. For all but one project, data were not available; the project manager gave an estimate.
  Example: 10 changed / 100 baselined = 10%.
  Median 10% | Goal 15% | Industry 20%

• Client Satisfaction
  Calculation: client ratings on a scale of 5 = very satisfied to 1 = very unsatisfied. For all but three projects, client ratings were unavailable; project manager ratings were used.
  Median 4 | Goal not available | Industry 4

• System Test Effectiveness
  Calculation: defects found in system test / total defects, where total defects = defects found in system test + defects found in production (first 30 days).
  Example: 40 / 50 = 80%.
  Median 83% | Goal 90% | Industry 90%

• Delivered Defect Density (defects per 100 function points)
  Calculation: (defects found in production / function points) * 100. Production = first 30 days.
  Example: (5 defects / 200 FPs) * 100 = 2.5.
  Median 2.3 | Goal 1.3 | Industry 1.8
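A sketch computing several of these service-level measures from raw inputs, using numbers from the Example column above:

```python
# Service-level measures from the dashboard above (inputs from the examples).
actual_hours, estimated_hours = 1000, 500
effort_accuracy = (actual_hours - estimated_hours) / estimated_hours  # +100% overrun

fp, labor_months = 100, 4
productivity = fp / labor_months                                      # 25 FP per staff month

dollars = 200_000
unit_cost = dollars / fp                                              # $2,000 per FP

added, changed, deleted, baselined = 0, 10, 0, 100
volatility = (added + changed + deleted) / baselined                  # 10%

sys_test_defects, prod_defects = 40, 10
test_effectiveness = sys_test_defects / (sys_test_defects + prod_defects)  # 80%

prod_defects_30d, app_fp = 5, 200
delivered_density = prod_defects_30d / app_fp * 100                   # 2.5 per 100 FP

print(f"effort accuracy {effort_accuracy:+.0%}, productivity {productivity:.0f} FP/SM, "
      f"unit cost ${unit_cost:,.0f}/FP, volatility {volatility:.0%}, "
      f"test effectiveness {test_effectiveness:.0%}, density {delivered_density:.1f}")
```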
Alternative Sizing Options
  Sizing Technique      Standard   Lifecycle      Comparative Data
  Lines of Code         No         Build          No
  Modules/Components    No         Design         No
  Use Cases             No         Requirements   No
  Story Points          No         Requirements   No
  Function Points       Yes        Requirements   Yes
  COSMIC                Yes        Requirements   Partial
  NESMA                 Yes        Requirements   Partial
  Mark II               Yes        Requirements   Limited
Alternative Sizing Options
[Diagram: sizing techniques plotted on a power vs. ease-of-use index. Organization-specific definitions (modules, use cases, test cases, story points, lines of code, use case points, hours/days) have fewer rules and are easier to learn; industry-defined measures (COSMIC, NESMA, Mark II, IFPUG Function Points) have more rules and are harder to learn. Consistency and accuracy increase along the same axis: hours/days, story points, lines of code, and use case points are less accurate, while COSMIC, NESMA, IFPUG Function Points, and Mark II are more accurate – power increases as ease of use decreases.]
Summary
• Quality is defined as a measure of value for the customer
• Size is a critical normalizing metric
• FPA serves as an effective sizing method
• Historical baseline data can provide predictive capability
