Improving your Agile Process
First...who are we?
We reduce home energy use
by showing people how much they use
and then they actually use less!
Agile means:
short iterations
a metric sh*t-ton of practices chosen from a salad bar
TDD, Pair Programming, Scrum, Kanban Cards, BDD, Continuous Integration, MDD, Sprints, Stand-ups, Continuous Deployment, The Planning Game, User Stories, Planning Poker, Sustainable Pace, Collective Code Ownership...
Adopting agile? Which practices?
Already agile-ish? Which practices will help us, and why?
Pick what feels good?
Study your process and target problem areas.
Scientific Method!
1. Gather Data
2. Form Hypothesis
3. Perform Experiment
4. Analyze Results
1. Gather Data
Quantify:
good things (to increase)
bad things (to decrease)
The Big Bad: Bugs
(you are using a bug tracker, right?)
“We wrote fewer bugs than the previous iteration. We rule!” (Right?)
            V2.2   V2.3
# of Bugs    25     23
Wait...What?
            V2.2   V2.3
# of Bugs    25     23
Team Size    8      5     (Flu took out half the team!)
Raw measurements must be put in context
What is the “size” of an iteration?
LOC?
Doesn’t fit with agile
Hard to measure
Hours/Days?
Fixed iterations
Hard to measure
We use “Story Points”
1, 2, 4, 8 per user story
“Story Points” could be anything that:
changes with the amount of work
can be determined consistently
is easy to capture
Bugs ÷ Size == Defect Density
              V2.2   V2.3
Bugs           25     23
Story Points   14     10
Density        1.79   2.3
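As a minimal sketch (mine, not from the talk), the arithmetic above is trivial to automate once bugs and story points come out of your tracker; the Iteration class is illustrative, and the numbers simply restate the table:

# Minimal sketch: defect density = bugs ÷ size, using the figures above.
# The Iteration class is an illustration, not OPOWER's actual tooling.
from dataclasses import dataclass

@dataclass
class Iteration:
    name: str
    bugs: int          # bugs filed against this iteration
    story_points: int  # total "size" of the iteration

    @property
    def defect_density(self) -> float:
        return self.bugs / self.story_points

for it in (Iteration("V2.2", 25, 14), Iteration("V2.3", 23, 10)):
    print(f"{it.name}: {it.defect_density:.2f}")
# V2.2: 1.79
# V2.3: 2.30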
Simple
Paints Broad Strokes: Increase == Bad
Almost enough to draw conclusions
With a small amount of additional meta-data...
...you can gain incredible insights
• Severity
• Priority
• Where Introduced:
  ‣ bad requirements
  ‣ bad programming
  ‣ configuration/deployment
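To make that concrete, here is a hypothetical sketch (again, not OPOWER's tooling) of slicing defect density by the “where introduced” tag, assuming each bug record pulled from your tracker carries that field:

# Hypothetical sketch: per-category defect density from tagged bug records.
from collections import Counter

# Illustrative records; in practice these come from your bug tracker.
bugs = [
    {"id": 101, "severity": "blocker", "introduced": "bad requirements"},
    {"id": 102, "severity": "minor",   "introduced": "bad programming"},
    {"id": 103, "severity": "major",   "introduced": "configuration/deployment"},
    {"id": 104, "severity": "minor",   "introduced": "bad programming"},
]
story_points = 10  # size of the iteration, e.g. V2.3

by_cause = Counter(bug["introduced"] for bug in bugs)
for cause, count in sorted(by_cause.items()):
    print(f"{cause}: density {count / story_points:.2f}")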
[Chart: Defect Density, unclear requirements vs. programming errors, V2.3–V2.5]
[Chart: Defect Density, all types, V2.3–V2.5]
[Chart: Defect Density, all types vs. Blockers, V2.3–V2.5]
2. Form Hypothesis
Metrics give us insight to focus on problem areas
Form a hypothesis about problem areas and potential solutions
Agile practices are a goldmine...
...if used sensibly in the context of your process
Example: “Increasing test coverage will reduce our defect density”
Example: “Pair Programming will reduce ‘bad programmer’ bugs”
Example: “BDD will help clarify requirements so we implement the right thing”
3. Perform Experiment
On the next iteration, test your hypothesis
Start slowly; implement one change, chosen for maximum impact
4. Analyze Results
The next iteration’s metrics should prove/disprove your hypothesis
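A hedged sketch of that check (my illustration; the function and the numbers are hypothetical): compare the metric before and after the change and see whether it moved in the predicted direction:

# Hypothetical sketch of the analysis step: did the metric move as predicted?
def hypothesis_supported(baseline: float, after: float,
                         predict_decrease: bool = True) -> bool:
    """True if the metric moved in the direction the hypothesis predicted."""
    return after < baseline if predict_decrease else after > baseline

baseline_density = 2.30  # defect density before the experiment
after_density = 1.95     # hypothetical density the following iteration
print(hypothesis_supported(baseline_density, after_density))  # True

One data point proves nothing statistically, of course; the idea is to watch the trend across several iterations.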
Repeat until profit!
This improvement method isn’t perfect...
but it’s a GREAT start
How has this helped OPOWER?
Iteration 1
Half of the user stories not being tracked :(
Iteration 2
Parts of the team using a different scale for story points :(
Iteration 3
Data looked good, baseline established.
Iteration 4
Lots of deploy/config errors; other numbers same or better
Iteration 5
Automated deployment » deploy errors down.
But: Defect Density was up
Iteration 6
Lowered velocity
Set up test coverage tracking
(final results not in yet!)
Scientific Method
Measure
Hypothesize
Experiment
Analyze
dave@opower.com
http://www.opower.com
@davetron5000
Slides from my lightning talk at DevDays DC on how to improve your agile process by gathering small amounts of data and using it to make decisions.
