1. © 2015 Accenture. All rights reserved. Accenture, its logo, and 'High Performance. Delivered.' are trademarks of Accenture.
Agile Day Riga 2015
Alex Birke, May 23, 2015
Essential Metrics for Agile Project
Management
2. Why Metrics?
“they help […] to make decisions”
(Eric Ries, Lean Startup)
3. What makes a good metric?
As many as required, as few as possible
Leading indicators over trailing indicators
Measure results, not activity
Assess trends, not snapshots
Easy to collect
4. But wait!
What about Scrum’s own metrics,
such as the Burndown and the Velocity?
5. A Set of Core Metrics for Agile Project Management*
Scope: Scope Volatility
Schedule: Sprint Burndown Performance Indicator, Release Burnchart Performance Indicator, Release Slippage Risk Indicator
Cost: Story Point Effort Performance Indicator
Productivity: Sprint Productivity, Required Productivity
Quality: Running Tested Features, Delivered Defect Rate
*) Metrics are agnostic of the technology or domain in which Agile Project Management is used
6. Sprint Productivity (P)
Definition
Productivity is measured as the value generated: the number of story points that can be completed per person day.
Calculation
P = (# story points [SP] completed in a sprint) / (# person days spent in the sprint)
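As a minimal sketch (the function name and the numbers are illustrative, not from the deck), the calculation can be written as:

```python
def sprint_productivity(story_points: float, person_days: float) -> float:
    """Sprint Productivity P: story points completed per person day."""
    return story_points / person_days

# Illustrative numbers: 20 SP completed by a team spending 40 person days
print(sprint_productivity(20, 40))  # 0.5 SP per person day
```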
7. Required Productivity (RP)
Definition
RP is the productivity the team would require in the remaining sprints to complete at least the remaining “Must-Have” user stories, so that a Minimal Viable Release (MVR) can be deployed to production.
Calculation
RP = (# SP of the remaining “Must-Have” stories) / (total planned effort in the remaining sprints)
8. Required Productivity
Required Velocity = (3 [SP] + 3 [SP] + 8 [SP] + 3 [SP]) / 3 [sprints] ≈ 5.67 [SP/sprint]
Assuming 105 hours per upcoming sprint:
RP = 5.67 [SP/sprint] / 105 [hours/sprint] ≈ 0.054 [SP/hour]
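The worked example can be reproduced with a short sketch (function name is illustrative; values rounded to two and three decimals):

```python
def required_productivity(remaining_must_have_sp: float,
                          planned_effort_hours: float) -> float:
    """RP: remaining 'Must-Have' story points per hour of remaining planned effort."""
    return remaining_must_have_sp / planned_effort_hours

remaining_sp = 3 + 3 + 8 + 3                 # remaining Must-Have stories [SP]
required_velocity = remaining_sp / 3         # 3 sprints left
print(round(required_velocity, 2))           # 5.67 [SP/sprint]

rp = required_productivity(remaining_sp, 3 * 105)  # 105 hours per upcoming sprint
print(round(rp, 3))                          # 0.054 [SP/hour]
```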
9. Release Slippage Risk Indicator (RSRI)
Definition
RSRI indicates whether at least a minimal viable release (MVR) can be deployed to
production on the scheduled date.
Calculation
RSRI = (“past” Sprint Productivity P) / (Required Productivity for the release date RP)
RSRI = 1: as planned
RSRI < 1: delayed
RSRI > 1: ahead of plan
10. Release Slippage Risk Indicator
Required Productivity = (3 [SP] + 3 [SP] + 8 [SP] + 3 [SP]) / (3 [sprints] * 105 [hours]) ≈ 0.054 [SP/hour]
Past Productivity = 5 [SP] / 90 [hours] ≈ 0.055 [SP/hour]
Release Slippage Risk ≈ 0.055 / 0.054 ≈ 1.02
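A sketch of the indicator (computed with unrounded intermediates, which gives ≈ 1.03; the 1.02 results from first rounding the productivities to 0.055 and 0.054):

```python
def rsri(past_productivity: float, required_productivity: float) -> float:
    """RSRI: 1 = as planned, < 1 = delayed, > 1 = ahead of plan."""
    return past_productivity / required_productivity

required = (3 + 3 + 8 + 3) / (3 * 105)  # ~0.054 SP/hour
past = 5 / 90                           # ~0.055 SP/hour
print(round(rsri(past, required), 2))   # 1.03
```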
11. What makes a good metric?
As many as required, as few as possible
Leading indicators over trailing indicators
Measure results, not activity
Assess trends, not snapshots
Easy to collect
12. Release Burnchart Performance Indicator
[Figure: Release Burn-Up Chart, work (story points) vs. Sprint # (0-13); series: Must-Have Scope, Must-Have + Should-Have Scope, Planned Dev Complete or Accenture Scope Complete, Ideal Burn-Up, Required Burn; annotations: 2 new “Billing” epics added, “Summary Reports” added, “Rating” epics de-scoped]
In the example above, at the end of Sprint 7: RBPI = 490 [SP] / 720 [SP] = 0.68, which indicates that fewer user stories were done than expected.
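A minimal sketch of the RBPI computation from the example (function name is illustrative):

```python
def rbpi(actual_sp_done: float, ideal_sp_done: float) -> float:
    """Release Burnchart Performance Indicator: actual burn-up over the ideal burn-up."""
    return actual_sp_done / ideal_sp_done

print(round(rbpi(490, 720), 2))  # 0.68 -- fewer story points done than planned
```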
13. Sprint Burndown Performance Indicator (SBPI)
Definition
SBPI shows the deviation of completed work from planned work for the current sprint.
Calculation
SBPI = (ideal remaining work) / (actual remaining work)
SBPI = 1: as planned
SBPI < 1: delayed
SBPI > 1: ahead of plan
14. Sprint Burndown Performance Indicator
Fig. 1: Task Burndown (effort-based): SBPI = 120 [hours] / 160 [hours] = 0.75 (i.e. -25% deviation)
Fig. 2: User Story Burndown (story-point-based): SBPI = 40 [SP] / 50 [SP] = 0.80
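Both figures reduce to the same division (a sketch; names are illustrative):

```python
def sbpi(ideal_remaining: float, actual_remaining: float) -> float:
    """SBPI: ideal remaining work over actual remaining work (1 = on plan)."""
    return ideal_remaining / actual_remaining

print(sbpi(120, 160))  # 0.75 -- task burndown, hours
print(sbpi(40, 50))    # 0.8  -- story burndown, story points
```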
15. Scope Volatility (SV)
Definition
Scope Volatility depicts the amount of change in the size of the release scope, comparing the scope size at the start of the release with the size after the last completed sprint.
Calculation
SV = (Current Size of Must-Have Scope - Initial Size of Must-Have Scope) / (Initial Size of Must-Have Scope) * 100
SV > 0: scope creep
SV < 0: scope drop
SV = 0: planned scope size retained (often a corridor)
16. Scope Volatility
[Figure: Release Burn-Up Chart, work (story points) vs. Sprint # (0-13); Must-Have Scope grows from 1143 [SP] to 1228, 1237, and 1273 [SP] across the sprints (Must-Have + Should-Have Scope: 1330 [SP]); annotations: 2 new “Billing” epics added, “Rating” epics de-scoped]
In the example above, Scope Volatility = (1273 [SP] - 1143 [SP]) / 1143 [SP] * 100 ≈ 11.4, i.e. roughly 11% scope creep.
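A sketch of the calculation; note that the formula yields a percentage:

```python
def scope_volatility(current_sp: float, initial_sp: float) -> float:
    """SV: percentage change of the Must-Have scope since the release started."""
    return (current_sp - initial_sp) / initial_sp * 100

print(round(scope_volatility(1273, 1143), 1))  # 11.4 -- about 11% scope creep
```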
17. Story Point Effort Performance Indicator (SPEPI)
Definition
SPEPI (a sprint-wise analogue of the Cost Performance Index, CPI) indicates whether the ongoing release is currently on budget, by comparing the planned effort per story point with the actual effort per story point.
Calculation
SPEPI = (planned effort to be spent in the release so far / planned story points delivered in the release so far) / (actual effort spent on the release so far / actual story points delivered in the release so far)
SPEPI = 1: as planned
SPEPI < 1: cost overrun
SPEPI > 1: under budget
18. Story Point Effort Performance Indicator
SPEPI = (875 / 630) / (957 / 490) ≈ 0.71, which indicates a cost overrun.
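The example can be reproduced with a sketch (parameter names are illustrative):

```python
def spepi(planned_effort: float, planned_sp: float,
          actual_effort: float, actual_sp: float) -> float:
    """SPEPI: planned effort per SP over actual effort per SP (< 1 = cost overrun)."""
    return (planned_effort / planned_sp) / (actual_effort / actual_sp)

print(round(spepi(875, 630, 957, 490), 2))  # 0.71 -- cost overrun
```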
19. Running Tested Features* (RTF)
Definition
RTF depicts the share of features built to date that are still working (running).
Calculation
RTF = (# completed user stories that still pass all acceptance tests) / (total # of completed user stories to date) * 100
*) “A Metric leading to Agility”, Ron Jeffries
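A minimal sketch (the counts are illustrative, not from the deck):

```python
def rtf(still_passing: int, completed_to_date: int) -> float:
    """RTF: percentage of completed user stories that still pass all acceptance tests."""
    return still_passing / completed_to_date * 100

# Illustrative: 45 of 50 completed stories still pass their acceptance tests
print(rtf(45, 50))  # 90.0
```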
20. But wait!
What about Scrum’s own metrics,
such as the Burndown and the Velocity?
21. Delivered Defect Rate (DDR)
Definition
DDR indicates the effectiveness of the review and testing activities: the more effective they are, the fewer defects are found in the delivered product (increment).
Calculation
DDR = (# defects identified after user stories reached Done) / (total effort spent on all tasks to date)
Alternative metric: Defect Density is # of defects identified in completed user stories per size
of Done user stories.
22. Delivered Defect Rate
DDR                  Sprint 1   Sprint 2   Sprint 3   Sprint 4
Defects Identified          7         12         19         25
Engineering Effort        400        831       1262       1685
Defect Rate             0.018      0.014      0.015      0.015

Engineering effort includes effort from the Agile lifecycle tool for design, build, test, and defect-fix tasks, plus PO & SM time as a percentage of completed stories.
Count only the defects logged after the story is marked complete by the developer.
In the example above, the DDR trend is stable; therefore the delivered quality is stable.
23. A Set of Core Metrics for Agile Project Management*
Scope: Scope Volatility
Schedule: Sprint Burndown Performance Indicator, Release Burnchart Performance Indicator, Release Slippage Risk Indicator
Cost: Story Point Effort Performance Indicator
Productivity: Sprint Productivity, Required Productivity
Quality: Running Tested Features, Delivered Defect Rate
*) Metrics are agnostic of the technology or domain in which Agile Project Management is used
24. Some further, sometimes helpful metrics
% Stories Accepted
Stakeholder Involvement Index
% Changed Scope
Average Time to Market
Process Improvement (Retrospective)
Customer Satisfaction
Test Automation Coverage
Employee Engagement
Epic Progress Report
25. A Set of Core Metrics for Agile Project Management*
Scope: Scope Volatility
Schedule: Sprint Burndown Performance Indicator, Release Burnchart Performance Indicator, Release Slippage Risk Indicator
Cost: Story Point Effort Performance Indicator
Productivity: Sprint Productivity, Required Productivity
Quality: Running Tested Features, Delivered Defect Rate
*) Metrics are agnostic of the technology or domain in which Agile Project Management is used