Test Driven

Want to know the case for Test-Driven Development? Want to know style tips and gotchas for Testing and TDD? Let Alex Chaffee, former Mad Scientist at Pivotal Labs, tell you everything you didn't know you didn't know about testing.

Test Driven Presentation Transcript

  • 1.
    • by Alex Chaffee
      • alex @ pivotallabs.com
      • Pivotal Labs
    Test-Driven
  • 2.
    • Developers or QA Engineers
    • Familiar with JUnit
    • Want more detail on Automated Testing in general
    • Want to know the case for Test-Driven Development
    • Want to know style tips and gotchas for Testing and TDD
    Intended Audience
  • 3. Part I: The Blank Page
  • 4. Let's test-drive a utility class...
  • 5. Red-Green-Refactor
  • 6.
    • Arrange (set up preconditions)
    • Act (call production code)
    • Assert (check the results)
    3A
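The three A's map directly onto the deck's set example. A minimal sketch, where `MySet` is a hypothetical stand-in for the class under test (here backed by `java.util.HashSet`):

```java
// Hypothetical MySet, used only to illustrate the Arrange-Act-Assert
// layout of a unit test.
import java.util.HashSet;

class MySet {
    private final HashSet<String> items = new HashSet<>();
    void add(String item) { items.add(item); }
    boolean contains(String item) { return items.contains(item); }
}

public class ThreeAExample {
    static boolean testAddedElementIsContained() {
        // Arrange: set up preconditions
        MySet set = new MySet();
        // Act: call the production code
        set.add("Ice Cream");
        // Assert: check the results
        return set.contains("Ice Cream");
    }

    public static void main(String[] args) {
        if (!testAddedElementIsContained()) throw new AssertionError();
        System.out.println("ok");
    }
}
```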
  • 7.
    • The heart of a unit test
    • An Assert is a declaration of truth
    • Failed assert -> false (incorrect) behavior
    • “assert your postconditions”
    • Example:
        • Set set = new MySet();
        • set.add("Ice Cream");
        • assertTrue(set.contains("Ice Cream"));
    Assert
  • 8.
    • Don't be over-ambitious
    • Each test -- especially each new test -- should add one brick to the wall of knowledge
      • Code:Brick :: Test:Mortar
    • Pick tests (features) in order of growth
    One Step At A Time
  • 9.
    • A great first test to write
    • Input and output are trivial
    • Helps you focus on infrastructure and API
    The Null Test
  • 10.
    • When you get stuck on a test, try starting with the assertion(s) and then work your way backwards to the setup
    • Start with the assert
        • assertTrue(set.contains(“alex”));
    • Then add the code above it
        • Set set = new MySet();
        • set.add(“alex”);
    • Helps focus on goal
    Assert First
  • 11.
    • Start with hardcoded results and wait until later tests to force them to become real
    Fake It 'Til You Make It
  • 12.
    • Make the code abstract only when you have two or more examples
    • First test, with a hardcoded ("fake it") implementation:
        • public void testSum() { assertEquals(4, plus(3, 1)); }
        • int plus(int x, int y) { return 4; }
    • A second example forces the real implementation:
        • public void testSum() { assertEquals(4, plus(3, 1)); assertEquals(5, plus(3, 2)); }
        • int plus(int x, int y) { return x + y; }
    Triangulate To Abstraction
  • 13.
    • Before you begin, make a TODO list
    • Write down a bunch of operations
    • For each operation, list the null test and some others
    • Also put down refactorings that come to mind
    • Why not write all the tests in code first?
      • Could box you in
      • Interferes with red-green-refactor
    Test List
  • 14.
    • aka Don't Be Stupid
    • If you really, really, honestly know the right way to implement it, then write it that way
    • But keep track of how many times your "obvious implementation" was broken or untested
    • Edge cases, off-by-one errors, null handling... all deserve tests and often the Obvious Implementation is not covered
    Obvious Implementation
  • 15. Part II: Testing Philosophy
  • 16.
    • Unit
    • Integration
    • Acceptance
    • QA
    • UI
    • Performance
    • Monitoring
    Automated Testing Layers
  • 17.
    • Automated
    • Isolated
    • Repeatable
    • Fast
    • Easy to write and maintain
    • Focused
    • Easy to read
    • Robust (opposite: Brittle)
    A Good Test Is...
  • 18.
    • Someone should be able to understand your class by reading the tests
    • Live documentation (better than dead trees)
    • “Any fool can write code that a computer can understand. Good programmers write code that humans can understand.”
    • – Martin Fowler
    Tests are “Executable Specifications”
  • 19. Why do you test?
  • 20.
    • Prevent bugs
    • Regress bugs ("bug repellant")
    • Localize defects
    • Understand design
    • Document (or specify) design
    • Improve design
    • Support refactorings
    • Enable experimentation and change
    • Confidence
    • Catch errors the language can't
    • Long-term sustainability and maintainability
    Why do you test?
  • 21. When do you test?
  • 22.
    • Before checkin
    • After update
    • Before deploy
    • While coding
    • In the background
    When do you test?
  • 23.
    • All the time
    When do you test?
  • 24.
    • Never
    • After coding
    • During coding
    • Before coding
    When do you write tests?
  • 25. Why test first?
  • 26.
    • Gets tests written
      • Easier than retrofitting tests onto an existing system
      • Guarantees 100% test coverage
      • In practice, you never have time after the code is written, but you always have the time before
        • Go figure :-)
    Why test first?
  • 27. Why test first?
    • Reduces scope of production code
      • Less scope -> less work
    • Proves that your objects have usable interfaces
      • more useful methods and fewer useless ones
    • Guarantees testability
      • Test-last code is often hard to test
    • Sustainable Feature Velocity
  • 28.
    • Think of tests as examples or specifications
    • One trick: write code in a test class, then extract into the production class
    • "If you can't write a test, then you don't know what the code should do. And what business do you have writing code in the first place when you can't say what it's supposed to do?" - Rob Mee
    How can you write tests for code that doesn't exist?
  • 29.
    • Simple Rule:
      • Test everything that could possibly break
    • Depends on definitions of “everything” and “possibly”
      • (and “break”)
    • This means, don’t test things that couldn’t possibly break
      • E.g. Getters and Setters
      • Unless you think they could fail!
      • Better safe than sorry
    • Full-Range Testing (positive, negative, boundary, null, exception)
    What to test?
  • 30.
    • Personal judgement, skill, experience
    • Usually, you start by testing too little, then you let a bug through
    • Then you start testing a lot more, then you gradually test less and less, until you let a bug through
    • Then you start testing too much again :-)
    • Eventually you reach homeostasis
    How much to test?
  • 31.
    • Not too big, not too small
    • Same concept as high coherence, low coupling
    Test for "essential complexity"
  • 32.
    • Write the tests first
    • Design for testability
    • Use the front door first
    • Communicate intent
    • Don't modify the SUT
    • Keep tests independent
    • Isolate the SUT
    • Minimize test overlap
    • Minimize untestable code
    • Keep test logic out of production code
    • Verify one condition per test
    • Test separate concerns separately
    • Ensure commensurate effort and responsibility
    Meszaros' Principles of Test Automation
  • 33.
    • Every time you write code, you write tests that exercise it
      • That means that if you change the code, and the tests break, you must either
        • Change the tests to match the new spec
        • Change the code to meet the old spec
    • Do not remove the failing tests
      • Unless they no longer apply to the new code’s design or API
      • Do not work around the failing tests
    • Test code is not "wasted" or "extra" -- tests are first-class citizens
      • If you feel like they're too much work, examine your process
      • Maybe you're writing the wrong tests, or your tests are too brittle, or they're not refactored enough
    Tests Are An Extension of Code
  • 34.
    • It forces you to really understand the code
    • It forces you to really understand the tests
    • It forces you to create code that is truly reusable and modular and testable
      • “put your money where your mouth is”
    • These forces drive you to keep your code and your tests simple and easy to understand
    Unit Testing Is Hard
  • 35.
    • Need to spend time on infrastructure, fixtures, getting comfortable with TDD
    • Business case for TDD: sustainable velocity
      • for feature velocity, stability > early oomph
    • Famous graph
    Test-Driving Is Slower At First
  • 36.
    • Test-Driven Development
      • Good old-fashioned coding, now with tests!
    • Test-Driven Design
      • Free your mind and the code will follow
    • Quite a lot of overlap, but worth keeping difference in mind
    • Lots of XP gurus are all about the Zen, but you don't need to buy into that
      • But it's actually pretty cool to try Zen Testing
    Two D's
  • 37. Part III: Advanced Techniques
  • 38.
    • Positive Tests
    • exercise normal conditions (“sunny day scenarios”)
    • E.g. Verify that after adding an element to a set, that element exists in the set
    • Negative Tests
    • Exercise failure conditions (“rainy day scenarios”)
    • E.g. verify that trying to remove an element from an empty set throws an exception
    • Boundary Conditions
    • Exercise the limits of the system (“cloudy day”)
    • E.g. adding the maximum number of elements to a set
    • E.g. test 0, -1, maximum, max+1
    Full Range Testing
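One way to see all three kinds of test side by side. A sketch using a hypothetical `BoundedSet` with a maximum capacity (not from the deck), so that the boundary and failure cases are easy to reach:

```java
// Hypothetical bounded set used to show positive, negative, and
// boundary tests in one place.
import java.util.HashSet;

class BoundedSet {
    private final HashSet<String> items = new HashSet<>();
    private final int capacity;
    BoundedSet(int capacity) { this.capacity = capacity; }
    void add(String item) {
        if (items.size() >= capacity && !items.contains(item))
            throw new IllegalStateException("set is full");
        items.add(item);
    }
    int size() { return items.size(); }
}

public class FullRangeExample {
    // Positive ("sunny day"): normal use works
    static boolean positive() {
        BoundedSet set = new BoundedSet(2);
        set.add("a");
        return set.size() == 1;
    }
    // Boundary ("cloudy day"): filling to exactly the maximum is fine
    static boolean boundary() {
        BoundedSet set = new BoundedSet(2);
        set.add("a"); set.add("b");
        return set.size() == 2;
    }
    // Negative ("rainy day"): exceeding the maximum fails loudly
    static boolean negative() {
        BoundedSet set = new BoundedSet(2);
        set.add("a"); set.add("b");
        try { set.add("c"); return false; }
        catch (IllegalStateException e) { return true; }
    }

    public static void main(String[] args) {
        if (!(positive() && boundary() && negative())) throw new AssertionError();
        System.out.println("ok");
    }
}
```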
  • 39.
    • instead of SetTest.testEmpty
    • how about SetTest.testShouldHaveSizeZeroWhenEmpty
    • or EmptySetTest.testHasSizeZero
    Verbose Test Naming
  • 40.
    • The optional first parameter to JUnit asserts is a "message"
    • Assertion messages can be confusing
    • Example:
        • assertTrue("set is empty", set.isEmpty());
    • Does it mean “the set must be empty” or “the test is failing because the set is empty”?
    • Solution: should statements
        • assertTrue("set should be empty", set.isEmpty());
    • or even better:
        • assertTrue("a newly-created set should be empty", set.isEmpty());
    Should Statements
  • 41.
    • Philosophy: a test is a valid client of an object
    • Therefore don't be ashamed of adding a method just because it would make a test easier to write
    • Used -> Useful
    • Remember, tests are examples of use
    Test-Only Methods
  • 42.
    • Spend time refactoring your tests
    • It'll pay off later, when writing new tests or extending/debugging old ones
    • Refactor for readability, not necessarily for removing all duplication
      • Different priorities than for production code
    • Extract methods
    • Shorter lines
    • Break up long tests (scenario tests) into several short tests (feature tests)
    • One technique: "Refactor production code on green, Refactor test code on red."
      • for complex cases, break the code, make sure the refactored tests still reveal the breakage, then fix it
    Refactor Test Code
  • 43.
    • assertEquals(86400, new Day().getSeconds());
    • vs.
    • assertEquals(60 * 60 * 24, new Day().getSeconds());
    • vs.
    • int secondsPerMinute = 60;
    • int minutesPerHour = 60;
    • int hoursPerDay = 24;
    • assertEquals(secondsPerMinute * minutesPerHour * hoursPerDay, new Day().getSeconds());
    Evident Data
  • 44.
    • Problem: several axes of variability, combinatorial explosion
    • Solution: Loop through a matrix of data in your test, call a "check" function on each row
    Matrix Tests
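The pattern above can be sketched in a few lines. Each row holds the inputs and the expected result, and one "check" method is applied to every row; `plus()` is a hypothetical stand-in for the production function:

```java
// Matrix test sketch: each row is {x, y, expectedSum}; a single
// check method runs against every row, instead of one near-identical
// test per combination.
public class MatrixTestExample {
    static int plus(int x, int y) { return x + y; } // production stand-in

    static void checkSum(int x, int y, int expected) {
        int actual = plus(x, y);
        if (actual != expected)
            throw new AssertionError(
                "plus(" + x + ", " + y + ") should be " + expected + " but was " + actual);
    }

    public static void main(String[] args) {
        int[][] rows = {
            {3, 1, 4},   // positive case
            {0, 0, 0},   // boundary case
            {-2, 5, 3},  // negative input
        };
        for (int[] row : rows) checkSum(row[0], row[1], row[2]);
        System.out.println("all rows passed");
    }
}
```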
  • 45.
    • aka "Golden Data Tests"
    • Grab the complete output of a routine, put it into the test
    • Not amenable to test-driven development
    • Effective for large or risky refactorings
    • Quite brittle, so often thrown away after the refactoring is done
    Characterization Tests
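A small illustration of the "golden data" idea: run the existing routine, capture its output verbatim, and pin it down before refactoring. `format()` here is a hypothetical legacy routine, not from the deck:

```java
// Characterization ("golden data") sketch: the expected strings were
// captured from a run of the current implementation and pasted into
// the test, pinning down behavior before a refactoring.
public class CharacterizationExample {
    static String format(int cents) { // legacy routine being pinned down
        return "$" + (cents / 100) + "." + String.format("%02d", cents % 100);
    }

    public static void main(String[] args) {
        // Golden data captured from the existing code's output
        if (!format(1234).equals("$12.34")) throw new AssertionError();
        if (!format(5).equals("$0.05")) throw new AssertionError();
        System.out.println("ok");
    }
}
```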
  • 46.
    • public void testUnknownCountry() {
    •     try {
    •         currencyConverter.getRate("Snozistan");
    •         fail("Should have thrown an exception for unknown country");
    •     } catch (UnknownCountryException e) {
    •         // ok
    •     }
    • }
    Exception Tests
  • 47.
    • A pair's job is to keep you focused
    • "Wait, let's write a test first."
    • "Wait, let's refactor first."
    • "Wait, let's discuss this."
    • "Can I drive?"
    Pair Programming
  • 48.
    • One pair writes a test
    • The other pair makes it pass and writes the next test
    • Repeat
    • Good way to get out of a rut, or cure a keyboard hog
    Ping-Pong Pairing
  • 49.
    • When a defect is reported...
      • The first step is to write a (failing) test that reproduces the bug
      • Fix the bug by writing code to make the test run successfully
      • Verify the bug in the running application
      • Add the bug test to the automated suite
      • Check in the bugfix code and test
      • Now it’s always run – instant regression test!
    • "Regression tests are tests that you would have written originally." - Kent Beck
    • May also want to write a failing Acceptance Test, but that's optional -- you definitely want a failing unit test
    Regression Test
  • 50.
    • Often the best thing to do is throw away your work and start again
    Do Over
  • 51.
    • At the end of the day, write a failing test and leave it there for tomorrow
    • Based on writer's trick: start a sentence and leave it unfinished
    Leave One For Tomorrow
  • 52.
    • Tests are only valuable if they're run all the time
    • If they're slow, people will not want to run them all the time
    • So keep them fast!
    • Difficult quest, but worth it
    • Don’t get stuck in molasses!
      • Refactor your code to be easier to write fast tests on
      • Replace slow tests with (one or more) fast tests that cover the same area
    The Need For Speed
  • 53. Retrofitting
    • What to do when you have an existing untested codebase?
    • Start small
      • Write one test, make it pass, check it in
      • Write tests for all new code
      • Write tests for all new bugs
      • Write tests before attempting refactoring
    • Usually easier to write characterization tests (UI/integration)
      • But don’t fall into the slow test trap
  • 54.
    • Any time all the tests are green, you can check in
    • Run all the tests all the time
    • Don’t check in until all tests pass
    • If you broke “someone else’s” tests, you are responsible for fixing “their” code
    • Remember, they are in the room, so go get them if you need help
    • Learn to Love the Orb
      • ccmenu
    Continuous Integration
  • 55.
    • Suites are a pain to maintain
    • Write code to automatically scan for tests and run them together
    • Possible to do in JUnit, but annoying
    Automatic Suites
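One plain-Java way to sketch the idea (this is an illustration with reflection, not JUnit's own suite mechanism): scan for zero-argument methods whose names start with "test" and invoke each one.

```java
// Automatic suite sketch: reflectively find every public method whose
// name starts with "test" and run it, so no hand-maintained suite
// list can fall out of date.
import java.lang.reflect.Method;

public class AutoSuite {
    public void testOnePlusOne() { if (1 + 1 != 2) throw new AssertionError(); }
    public void testStringLength() { if ("abc".length() != 3) throw new AssertionError(); }

    public static int runAll(Object suite) throws Exception {
        int run = 0;
        for (Method m : suite.getClass().getMethods()) {
            if (m.getName().startsWith("test") && m.getParameterCount() == 0) {
                m.invoke(suite); // throws if the test fails
                run++;
            }
        }
        return run;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runAll(new AutoSuite()) + " tests run");
    }
}
```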
  • 56.
    • Matter of preference
    • Both are useful at times
    Outside-in vs. Inside-out
  • 57.
    • Start with domain objects
    • Next layer of tests
    Inside-out
  • 58.
    • Start with customer story or user interface
    • Makes you think like a user
    • Tests capture these requirements
    • Lower layers implemented with Test Doubles (mocks)
      • After you're done, either replace mocks with real objects, or leave them there (perhaps at higher maintenance cost)
    Outside-in
  • 59.
    • Write a bunch of UI-level tests
    • Leave them there while you test-drive inside-out
    • When they all pass, you're done
    Outside-in design, inside-out development
  • 60.
    • A Test Double replaces the "real" instance of an object used by the production code with something suitable for the currently running test, but with the same interface
    • Stubs
      • Hard-coded values
    • Mocks
      • Pre-programmed with expectations
      • Fail-fast
      • Test Doubles in general are often called Mock Objects, so be careful about terminology
    • Fakes
      • Can store values across calls, but don't really do what the live object would do
      • E.g. in-memory database
    Test Doubles
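The stub/fake distinction can be shown in a few lines. A sketch around a hypothetical `RateSource` interface (the names are assumptions, not from the deck): the stub returns a hard-coded value, while the fake stores values across calls like an in-memory database.

```java
// Two test doubles for a hypothetical RateSource collaborator.
import java.util.HashMap;
import java.util.Map;

interface RateSource {
    double getRate(String country);
}

// Stub: hard-coded value, no behavior
class StubRateSource implements RateSource {
    public double getRate(String country) { return 1.5; }
}

// Fake: remembers values across calls, but doesn't really do what the
// live object (e.g. a rate service) would do
class FakeRateSource implements RateSource {
    private final Map<String, Double> rates = new HashMap<>();
    void setRate(String country, double rate) { rates.put(country, rate); }
    public double getRate(String country) { return rates.get(country); }
}

public class TestDoubleExample {
    public static void main(String[] args) {
        RateSource stub = new StubRateSource();
        if (stub.getRate("Anywhere") != 1.5) throw new AssertionError();

        FakeRateSource fake = new FakeRateSource();
        fake.setRate("Snozistan", 7.25);
        if (fake.getRate("Snozistan") != 7.25) throw new AssertionError();
        System.out.println("ok");
    }
}
```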
  • 61. More Test Doubles
    • Spies
      • Remember what methods were called with what values
      • Tests can inspect these lists after code returns
    • Saboteurs
      • Blow up in ways that would be difficult to make happen for real
      • To test what would happen when, e.g., the database goes away, or the disk is full
    • Self Shunt
      • The test itself declares methods or classes implementing the above, and passes in a pointer to itself
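A spy and a self shunt combined in one sketch: the test object itself implements the collaborator's interface, records what it was called with, and inspects the recording afterwards. `Logger` and `Worker` here are hypothetical names for illustration:

```java
// Spy / self-shunt sketch: the test class implements the collaborator
// interface and passes itself in, then inspects what was recorded.
import java.util.ArrayList;
import java.util.List;

interface Logger {
    void log(String message);
}

class Worker {
    void doWork(Logger logger) { logger.log("work done"); }
}

public class SelfShuntExample implements Logger {
    final List<String> messages = new ArrayList<>(); // the spy's recording

    public void log(String message) { messages.add(message); }

    boolean testWorkerLogsItsWork() {
        new Worker().doWork(this); // pass the test itself as the double
        return messages.contains("work done");
    }

    public static void main(String[] args) {
        if (!new SelfShuntExample().testWorkerLogsItsWork()) throw new AssertionError();
        System.out.println("ok");
    }
}
```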
  • 62.
    • Two types:
      • DI Frameworks
      • Complete Construction
        • This is the one I'm talking about
        • Pass in dependencies to the constructor (or, if necessary, to setters)
    • An object under test will receive references to all external services
    • Allows tests to inject Test Doubles at will
    • Forces objects to be isolated
    • Example: TBD
    Dependency Injection
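A minimal sketch of the "complete construction" style described above, assuming a hypothetical `ExchangeRates` dependency (the class names are illustrative): the converter receives its collaborator through the constructor, so a test can hand it a hard-coded double instead of a live service.

```java
// Complete-construction DI sketch: the dependency arrives via the
// constructor, so tests can inject a test double at will.
interface ExchangeRates {
    double getRate(String country);
}

class CurrencyConverter {
    private final ExchangeRates rates;
    CurrencyConverter(ExchangeRates rates) { this.rates = rates; } // injected
    double convert(double amount, String country) {
        return amount * rates.getRate(country);
    }
}

public class InjectionExample {
    public static void main(String[] args) {
        // The test injects a stub; production wiring would pass a real source.
        CurrencyConverter converter = new CurrencyConverter(country -> 2.0);
        if (converter.convert(10.0, "Snozistan") != 20.0) throw new AssertionError();
        System.out.println("ok");
    }
}
```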
  • 63.
    • Changes the language of tests to emphasize that they're specifications or examples
    • Replaces "assert" with "should"
    EDD/BDD (specs)
  • 64.
    • A natural progression of refactoring your test data
      • literals
      • constants
      • local variables
      • instance variables (defined in setUp())
      • creation methods
      • parameterized creation methods or objects ("object mothers")
    • Other patterns
      • test objects / graphs ("fixtures" or "cast of characters" or "menagerie")
      • external fixture files
    Fixtures and Object Mothers
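The top rungs of that refactoring ladder might look like this. A sketch with a hypothetical `User` class: a creation method builds one canonical valid instance, and a parameterized creation method lets each test vary only the detail it cares about.

```java
// Object mother sketch: fixture construction lives in one place, so
// tests stay short and survive constructor changes.
class User {
    final String name;
    final int age;
    User(String name, int age) { this.name = name; this.age = age; }
}

class ObjectMother {
    // Creation method: one canonical, fully-valid instance
    static User aUser() { return aUserNamed("alex"); }
    // Parameterized creation method: vary only what the test cares about
    static User aUserNamed(String name) { return new User(name, 30); }
}

public class ObjectMotherExample {
    public static void main(String[] args) {
        if (!ObjectMother.aUser().name.equals("alex")) throw new AssertionError();
        if (ObjectMother.aUserNamed("pat").age != 30) throw new AssertionError();
        System.out.println("ok");
    }
}
```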
  • 65. Mock Clock
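The slide gives only its title, but one common shape of the technique is: production code asks an injected clock for the time instead of calling System.currentTimeMillis() directly, so tests can freeze or advance time deterministically. All names in this sketch are hypothetical:

```java
// Mock clock sketch: time is a dependency, so a fake clock can be
// advanced by hand -- no sleeping in tests.
interface Clock {
    long now(); // milliseconds
}

class FakeClock implements Clock {
    private long time = 0;
    public long now() { return time; }
    void advance(long millis) { time += millis; }
}

class Stopwatch {
    private final Clock clock;
    private long start;
    Stopwatch(Clock clock) { this.clock = clock; } // clock injected
    void start() { start = clock.now(); }
    long elapsed() { return clock.now() - start; }
}

public class MockClockExample {
    public static void main(String[] args) {
        FakeClock clock = new FakeClock();
        Stopwatch watch = new Stopwatch(clock);
        watch.start();
        clock.advance(1000); // deterministic passage of time
        if (watch.elapsed() != 1000) throw new AssertionError();
        System.out.println("ok");
    }
}
```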
  • 66. Part IV: Test-Driving UI
  • 67. Part V: Q&A
  • 68. Thanks!