Sagi Smolarski ITG - Enterprise Metrics on Agile
Usage Rights: CC Attribution-NonCommercial-ShareAlike License

  • Photo by Daniel Goodwin - http://www.flickr.com/photos/dgoodphoto/
  • Transition of teams includes 3 days of training, followed by 2 days at the end of the first iteration.
  • Sten [CC-BY-SA-3.0-2.5-2.0-1.0 (www.creativecommons.org/licenses/by-sa/3.0) or GFDL (www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons
  • William Thomson, 1st Baron Kelvin. Discovered the first and second laws of thermodynamics.
  • How many of you relate to these needs? How many of you are collecting process/quality metrics?
  • Inspect and Adapt is key and we want to make the most of it. Teams have a hard time interpreting charts, especially in situations where they are not very good or very bad. Teams often don’t take the time to look at the charts. An easy indication of good/bad can at least tell them that there might be a problem.
  • I’m going to focus on the quantitative side
  • The First "Computer Bug" Moth found trapped between points at Relay # 70, Panel F, of the Mark II Aiken Relay Calculator while it was being tested at Harvard University, 9 September 1947. The operators affixed the moth to the computer log, with the entry: "First actual case of bug being found". They put out the word that they had "debugged" the machine, thus introducing the term "debugging a computer program". In 1988, the log, with the moth still taped by the entry, was in the Naval Surface Warfare Center Computer Museum at Dahlgren, Virginia, which erroneously dated it 9 September 1945. The Smithsonian Institute's National Museum of American History and other sources have the correct date of 9 September 1947 (Object ID: 1994.0191.01). The Harvard Mark II computer was not complete until the summer of 1947.Photo in the public domain
  • Is this good or bad?
  • What is the problem here
  • What is the problem here
  • Is this good or bad?

Presentation Transcript

  • Enterprise Metrics on Agile
    Sagi Smolarski
    Director, Software Development
    Sagi.Smolarski@itg.com
    [CC] Daniel Goodwin via Wikimedia Commons
  • Disclaimers
    The information contained in this presentation is provided for informational purposes only. 
    All trademarks, service marks, and trade names not owned by ITG are owned by their respective owners.
    …in other words, the data presented has been modified and does not represent real data for ITG teams.
  • Agenda
    Why do we need metrics for agile?
    How do we generate those metrics?
    Which metrics do we look at?
    Pros and cons of looking at those metrics.
  • Investment Technology Group
    NYSE: ITG
    www.itg.com
    Leading provider of trading solutions
    Trading Front-Ends
    Trading Algorithms
    Research (fundamental analytic, pre/post-trade)
    Trading desks
    Electronic trading connectivity
    Liquidity pool
    Development operation: 300 Developers, 100 QA Analysts, ~60 teams
  • Iteration Metrics
    Quality Metrics
    Process Baseline
    ITG’s Agile Transition Timeline
    [Timeline chart, 2000–2011. Labels: ITG Triton, XP, Enterprise rollout plan, Rally, +Scrum, +Lean, 90% of teams have been trained and are using Rally, Best Execution Management System, XP Pilot, Massive transition, Informal/spotty/inconsistent adoption]
  • Our Process Baseline – How We Expect Teams to Work (Excerpt)
  • The journey to Agile is a long, winding road…
    Are we moving forward?
    [CC] – Sten via Wikimedia Commons
  • Why Measure?
    “If you cannot measure it you cannot improve it.”
    Lord Kelvin
  • Why Metrics?
    Teams (Inspect & Adapt):
    • Are we doing okay?
    • How can we improve?
    • Are we doing what the company expects from us?
    Coaches, PMO (Train and Coach):
    • Are teams using what we taught?
    • Which teams need our help?
    Executives (Govern & Steer):
    What are we getting out of the agile transition?
    Are teams sticking to our process baseline and to enterprise initiatives?
    Is productivity/quality improving?
  • Teams Process Health: Data is readily available in Rally
    [Side-by-side charts comparing Team A vs. Team B: Velocity, Burndown/Story Acceptance]
    Isn’t this enough?
  • Why Metrics? - Dumbing down

    This is too complex for team members to interpret and monitor…
    Are the charts okay?
  • Why Metrics – Scaling up
    How do we watch 80 teams?
    Lines of products?
    The whole enterprise?
    ?
  • Types of Metrics
    Qualitative
    Satisfaction of stakeholders
    Product Management, Client Services, Product Support (deployment, service), Delivery team
    Teams’ adherence to practices
    Agility questionnaire
    Quantitative
    Quality metrics
    Process health metrics
    Time-to-Market
    We’ll be focusing on these
  • What Would We Want to Measure?
    Productivity
    Quality
    ROI
    Satisfaction
    Some of these are not easy to measure, so we have to find proxies
  • What We Are Actually Measuring
    Satisfaction
    Quality
    Process
    As a partial proxy for productivity
    Stakeholders Surveys
    Production defects
    Defects Debt
    Work-in-Progress
    Velocity Stability
    Full completion of work
    Gradual acceptance of work
    Churn
    Small Stories
    Practices Radar map
  • How We Generate Metrics
    How do we obtain data? Rally’s Web Services API
    REST and SOAP
    Access to virtually all data in the tool
    Users, Releases, Iterations, Stories, Tasks, Defects, Test cases…
    Ruby toolkit… or any WS library
    We use mostly C# (a rough sketch follows below)
    Automated generation, monthly
    How do we make sense out of mounds of data?
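    As a rough illustration of the extraction step (not ITG's actual report generator), a C# console sketch that pulls open defects from Rally's REST Web Services API could look like the following; the base URL, WSAPI version, query syntax, and field names are assumptions to be checked against Rally's WSAPI documentation.

```csharp
// Minimal sketch, not ITG's actual tool: pull open defects from Rally's REST
// Web Services API with HttpClient. The base URL, WSAPI version, query syntax,
// and field names below are illustrative assumptions.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class RallyExtractSketch
{
    static async Task Main()
    {
        var baseUrl = "https://rally1.rallydev.com/slm/webservice/v2.0"; // assumed version
        var query = Uri.EscapeDataString("(State != Closed)");
        var url = $"{baseUrl}/defect?query={query}&fetch=FormattedID,CreationDate,State&pagesize=200";

        using var client = new HttpClient();
        var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("user@example.com:password"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

        // The response is JSON; a real generator would deserialize it (e.g. with
        // System.Text.Json), walk the paged results, and aggregate per team and per month.
        string json = await client.GetStringAsync(url);
        Console.WriteLine(json.Length > 500 ? json.Substring(0, 500) : json);
    }
}
```

    The same pattern extends to iterations, stories, and tasks, which is what makes an automated monthly run over all teams feasible.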
  • The Metrics
  • Quality Metrics
  • Quality Metrics: Defects Found in Production
    Metric #1
    [Chart: Production Defects per month vs. Goal, Jan 2010 – Feb 2011]
    Team Level & Enterprise Level
  • Metric #2
    Quality Metrics: Fix-Bugs-First
    If you are fixing defects first
    As part of existing development, within the iteration
    Before new development for regression defects
    Then…
    The number of defects in your backlog should be small
    The age of open defects should be low
    Together…
    Defects Debt should be low
    Defects Debt = [# open defects] x [avg age of open defects]
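    For example (illustrative numbers, not ITG data): a backlog with 20 open defects whose average age is 15 days has a Defects Debt of 20 x 15 = 300 defect-days; fixing bugs first keeps both factors, and therefore the product, small.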
  • Quality Metrics: Project-Level Defects Debt
    Metric #2
  • Quality Metrics: Enterprise Defects Debt
    Metric #2
    [Chart: Defects Debt]
  • Process Health Metrics
  • Team Process Health: Agility Questionnaire
  • Scope Creep: Churn
    Metric #3
    [Iteration Cumulative Flow chart: Plan Estimates over Days 1–14, no scope creep]
    Churn (Scope variation) = 0
  • Work in Progress & Scope Creep
    Metric #3
    Iteration Cumulative Flow
    Plan Estimates
    Day 1
    Day 2
    Day 3
    Day 4
    Day 5
    Day 6
    Day 7
    Day 8
    Day 9
    Day 10
    Day 11
    Day 12
    Day 13
    Day 14
    Churn = 30%
  • Work in Progress & Scope Creep
    Metric #3
    Iteration Cumulative Flow
    Plan Estimates
    Day 1
    Day 2
    Day 3
    Day 4
    Day 5
    Day 6
    Day 7
    Day 8
    Day 9
    Day 10
    Day 11
    Scope Creep:
    Team Cannot Commit
    Disruptive (task switching?)
    Less efficient
    Day 12
    Day 13
    Day 14
    Churn = 30%
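    The deck does not spell out the churn formula; one plausible reading (an assumption for illustration, not ITG's documented definition) is the total absolute day-to-day change in planned scope relative to the scope planned on day 1, computed from the daily plan-estimate series pulled from Rally.

```csharp
using System;
using System.Collections.Generic;

static class ChurnMetricSketch
{
    // Assumed definition (illustration only): sum of absolute day-to-day changes in
    // total planned points, divided by the points planned on day 1 of the iteration.
    public static double Churn(IReadOnlyList<double> plannedPointsPerDay)
    {
        double totalChange = 0;
        for (int day = 1; day < plannedPointsPerDay.Count; day++)
            totalChange += Math.Abs(plannedPointsPerDay[day] - plannedPointsPerDay[day - 1]);
        return totalChange / plannedPointsPerDay[0]; // 0.30 corresponds to the 30% churn slide
    }
}
```

    A flat planned-scope series returns 0, matching the "No Scope Creep" case above.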
  • Work in Progress: WIP Average
    Metric #4
    [Iteration Cumulative Flow chart: Plan Estimates over Days 1–14, no scope creep; series: Planned, In Progress, Accepted]
    WIP average = 10%
  • Work in Progress & Scope Creep
    Metric #4
    Iteration Cumulative Flow
    Plan Estimates
    Day 1
    Day 2
    Day 3
    Day 4
    Day 5
    Day 6
    Day 7
    Day 8
    Day 9
    Day 10
    Day 11
    Day 12
    Day 13
    Day 14
    Planned
    WIP average = 70%
    In Progress
    Accepted
  • Work in Progress & Scope Creep
    Metric #4
    Iteration Cumulative Flow
    Large WIP =
    Long Cycle Time
    Risk of not completing work within iteration
    Plan Estimates
    Day 1
    Day 2
    Day 3
    Day 4
    Day 5
    Day 6
    Day 7
    Day 8
    Day 9
    Day 10
    Day 11
    Day 12
    Day 13
    Day 14
    Planned
    WIP average = 70%
    In Progress
    Accepted
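    Similarly, the WIP average formula is not given explicitly; a plausible reading (again an assumption, not ITG's exact definition) is the mean, over the days of the iteration, of the share of planned points sitting in the In Progress state.

```csharp
using System.Collections.Generic;
using System.Linq;

static class WipMetricSketch
{
    // Assumed definition (illustration only): for each day, In Progress points as a
    // fraction of that day's planned points; the metric is the mean daily fraction.
    public static double WipAverage(IReadOnlyList<(double inProgress, double planned)> days)
    {
        return days.Average(d => d.inProgress / d.planned); // ~0.10 healthy, ~0.70 problematic per the slides
    }
}
```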
  • Full Acceptance – Final Acceptance %
    Metric #5
    [Iteration Burn-up chart: Accepted points vs. Plan Estimates (Points), Days 1–14, with 100% Accepted and 50% Acceptance reference lines]
    Acceptance rate on the last day = 100%
  • Full Acceptance
    Metric #5
    [Iteration Burn-up chart: Accepted points vs. Plan Estimates (Points), Days 1–14]
    Acceptance rate on the last day = 55%
  • Full Acceptance
    Metric #5
    [Iteration Burn-up chart: Accepted points vs. Plan Estimates (Points), Days 1–14]
    Partial completion:
    Hard to plan
    Greater cycle time
    Inefficient
    Acceptance rate on the last day = 55%
  • Gradual Acceptance – 50% Day
    Metric #6
    [Iteration Burn-up chart: Accepted points vs. Plan Estimates (Points), Days 1–14; >50% of points accepted by day 8]
    Day of 50% Acceptance = 8 days / 14 days = 57%
  • Gradual Acceptance: Full Acceptance
    Metric #6
    [Iteration Burn-up chart: Accepted points vs. Plan Estimates (Points), Days 1–14; 50% of points accepted only on the last day]
    Day of 50% Acceptance = 14/14 = 100%
  • Gradual Acceptance
    Metric #6
    [Iteration Burn-up chart: Accepted points vs. Plan Estimates (Points), Days 1–14]
    Late acceptance:
    High risk of not completing work
    Late feedback on work
    Day of 50% Acceptance = 14/14 = 100%
  • Velocity Stability: Velocity Variation
    Metric #7
    [Velocity chart, Iterations 23–32; series: Accepted within iteration, Accepted after iteration, Never accepted]
    Velocity is fairly stable
    Variation = 7%
  • Velocity Stability: Velocity Variation
    Metric #7
    [Velocity chart, Iterations 23–32; series: Accepted within iteration, Accepted after iteration, Never accepted]
    Velocity is unstable
    Variation = 87%
  • Velocity Stability
    Metric #7
    [Velocity chart, Iterations 23–32; series: Accepted within iteration, Accepted after iteration, Never accepted]
    Unstable velocity:
    Low predictability
    Hard to plan
    Inefficient?
    Velocity is unstable
    Variation = 87%
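    The deck shows 7% for a stable team and 87% for an unstable one without stating the formula; a common choice, assumed here purely for illustration, is the coefficient of variation (standard deviation divided by mean) of per-iteration accepted velocity.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class VelocityMetricSketch
{
    // Assumed definition (illustration only): coefficient of variation of the
    // velocity accepted in each of the last N iterations.
    public static double VelocityVariation(IReadOnlyList<double> velocities)
    {
        double mean = velocities.Average();
        double variance = velocities.Average(v => (v - mean) * (v - mean));
        return Math.Sqrt(variance) / mean; // ~0.07 reads as stable, ~0.87 as unstable
    }
}
```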
  • Defects Debt
    Velocity Stability
    Churn
    % Completion
    WIP, Day of 50% Acceptance, Story Sizing
    Our Process Baseline… and metrics we can collect
  • Team Dashboard: All together now…
    [Dashboard combining the per-team charts above]
  • Teams Process Health: Enterprise Process Health Map
    [Map of projects grouped by Line of Business #1–#5, ordered from (Probably) Healthy Process to (Possibly) Unhealthy Process]
  • Problems with Metrics
    Metric-Driven Dysfunctions
    Teams misreporting defects
    Not tracking unpredictable work in Rally
    We can limit these by not using these metrics to reward or reprimand
    False positives, False negatives
    Some metrics are sensitive to how Rally is being used
    These metrics don’t cover some important aspects of teams’ process
    Metrics should be treated as an indication that further examination is needed
    Hey, I just figured out how we can double our quarterly sales. From now on, each quarter will last six months.
  • Looking at Next…
    Small Stories
    Time to Market
    Trends
    By Simon A. Eugster (own work) [GFDL (www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (www.creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
  • Q&A
    Q&A
    How are you measuring yourselves?