Mon 1130 acacia_d_chethendricksonronjeffries
Presentation Transcript

    • Guide to Technical Practices
      Chet Hendrickson - chet@hendricksonxp.com
      Ron Jeffries - ronjeffries@acm.org
    • There are two kinds of Scrum - XP and the kind that doesn’t work
    • While we are building the product, we also build defects. Defects are "Negative Features."
    • We cannot predict how long defect repair will take.
    • How can we avoid defects? Best known way: test, extensively, as we go!
    • Customer tests show the Product Owner that the Product actually WORKS!
    • Programmer tests show that the CODE actually works, and point to the causes of defects!
    • Tests must be automated! Why? There's no other way to keep up with demand.
    • Customer Tests: the "Confirmation" from the 3 Cs
    • Two Types of Tests
      Through the GUI: Selenium, Mercury
      Behind the GUI: FitNesse, Cucumber
    • FitNesse
    • Test Suite
    • Individual Test
    • YEA!!!
    • Not DONE Yet
    • Getting Close
    • (Chart: Size in Kg, comparing Application and Test; y-axis 0 to 1000)
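FitNesse expresses customer tests as decision tables: rows of inputs and expected outputs against a business rule. Here is a minimal sketch of that idea in plain Python; the `discount` rule and its thresholds are hypothetical examples, not taken from the presentation.

```python
# A FitNesse-style decision table, sketched in plain Python.
# The discount rule (10% off orders of 100 or more) is a made-up example.

def discount(order_total):
    """Business rule under test."""
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

# Each row is (input, expected) -- the "Confirmation" of a story's 3 Cs.
table = [
    (50.00, 0.00),
    (99.99, 0.00),
    (100.00, 10.00),
    (250.00, 25.00),
]

# Collect any rows where the rule disagrees with the customer's expectation.
failures = [(total, want, discount(total))
            for total, want in table
            if discount(total) != want]
print("PASS" if not failures else f"FAIL: {failures}")
```

Because the test drives the rule behind the GUI, it stays fast and stable in a way that a screen-scraping test cannot.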
    • Programmer Tests
    • Tools: xUnit, gTest, MS UnitTest
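All of the listed tools follow the same xUnit shape: a test class, small test methods, assertions. A minimal sketch using Python's `unittest` module; the `Stack` class is a hypothetical example, not from the slides.

```python
# An xUnit-style programmer test, sketched with Python's unittest.
# JUnit, gTest, and MS UnitTest follow the same pattern.
import unittest

class Stack:
    """Tiny example class under test (hypothetical, for illustration)."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    def test_push_then_pop_returns_item(self):
        s = Stack()
        s.push(42)
        self.assertEqual(42, s.pop())

    def test_pop_on_empty_stack_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()

# Run the suite programmatically so the example is self-contained.
result = unittest.TextTestRunner().run(
    unittest.TestLoader().loadTestsFromTestCase(StackTest))
```

When a test fails, the failing assertion names the exact behavior that broke, which is what makes programmer tests point to the causes of defects.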
    • (Chart: Assertions Added Per Iteration, iterations 1 to 14; y-axis 0 to 600)
    • How ELSE might our apparent progress be wrong?
    • (Chart: Expected Velocity Burn Up; User Stories Done over iterations 1 to 14, y-axis 0 to 300)
    • (Chart: Usual Velocity; User Stories Done over iterations 1 to 14, y-axis 0 to 150)
    • (Chart: Combined Burn Up over iterations 1 to 14)
    • What if the design is bad? We get more defects and slow down over time. The progress line lies.
    • We must start with a simple ... and therefore insufficient ... design.
    • We need to wind up with a larger design ... and it needs to be good!
    • We need a continually improving design. How is that even possible?
    • How do we move from one good design ... to the next good design?
    • Refactoring. Why? There's no other way to ship every Sprint from the beginning and keep the code base alive.
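A refactoring changes the structure of the code without changing its observable behavior. A minimal "extract function" sketch; the billing example and its 6% tax rate are hypothetical, not from the presentation.

```python
# A behavior-preserving "extract function" refactoring (hypothetical example).

# Before: one function mixes arithmetic, a buried tax rate, and formatting.
def invoice_line_before(qty, unit_price):
    subtotal = qty * unit_price
    total = subtotal + subtotal * 0.06      # tax rate buried in the code
    return f"{qty} x {unit_price:.2f} = {total:.2f}"

# After: the tax rule is extracted and named. Callers see no difference,
# because the observable output is identical.
TAX_RATE = 0.06

def with_tax(amount):
    return amount + amount * TAX_RATE

def invoice_line(qty, unit_price):
    return f"{qty} x {unit_price:.2f} = {with_tax(qty * unit_price):.2f}"

# The automated tests are what let us refactor safely: they prove
# the behavior did not change.
assert invoice_line_before(3, 9.99) == invoice_line(3, 9.99)
```

This is why refactoring and automated tests come as a pair in the list of practices: without the tests, every structural change is a gamble.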
    • Continuous Integration. How else will you know?
    • Tools: Jenkins/Hudson, CruiseControl, TeamCity, Continuum, Microsoft Team Foundation Server
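With any of these tools the shape is the same: on every commit, build the product and run the full automated test suite, and shout when it breaks. A minimal declarative Jenkinsfile sketch; the `make` targets and the notification address are placeholders, not anything from the slides.

```groovy
// Minimal declarative Jenkins pipeline sketch (hypothetical build commands).
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' }       // compile/package the product
        }
        stage('Test') {
            steps { sh 'make test' }        // run the automated tests on every commit
        }
    }
    post {
        failure {
            // A broken build is everyone's top priority.
            mail to: 'team@example.com',
                 subject: "Build broke: ${env.JOB_NAME}",
                 body: 'Fix it before anything else.'
        }
    }
}
```

The specific tool matters far less than the habit: integrate continuously, and let the machine tell you, within minutes, whether the software is still potentially shippable.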
    • CSD Techniques: the professional way to do Scrum.
      • Potentially shippable "DONE" Software Every Sprint
      • Automated Acceptance Tests (ATDD)
      • Test-Driven Development
      • Automated Programmer Tests
      • Refactoring
      • Continuous Integration
    • The Nature of Software Development: the only way we know today.
      • Potentially shippable "DONE" Software Every Sprint
      • Automated Acceptance Tests (ATDD)
      • Test-Driven Development
      • Automated Programmer Tests
      • Refactoring
      • Continuous Integration