TDD – UGIALT.NET – January 2010

  • 1. Test Driven Development
    Thousands of Red-Green-Refactor after…
    UGIALT.NET – Milan, 23-01-2010
    Omid Ehsani
    Senior Consultant & Solution Architect
  • 2. Automated Testing
    Good Automated Tests should be:
    Isolated, Fast, Repeatable, Maintainable, …
    How can we achieve those goals?
    Do the same rules ensure good Automated Test Suites (ATS)?
    Any difference between a one-man band and a team?
  • 3. Working in Team
    Consider typical scenarios
    Rebuilding development machines
    Introducing new developers
    Introducing Continuous Integration servers
    Make building and running the ATS easy
    Put build requirements under source control
    Minimize test dependencies via external configuration
    Make tests run fast
  • 4. ATS Organization
    Include tests in the same Visual Studio solution (but in separate projects)
    Separate different test types (mostly Unit Tests from Integration Tests)
    Human factor: how frequently do developers run automated tests?
    Create a safe green zone
  • 5. Finding tests
    Where are my project tests?
    Where are my class tests?
    Where are my method tests?
    Map tests to your system under test (SUT)
    Use naming and namespaces to define project/assembly tests
    Unit testing
    One test class per SUT class
    One or more test methods per SUT method
    Integration / user acceptance testing
    One test class per feature
    ATS Mapping
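The mapping rules above (one test class per SUT class, one or more test methods per SUT method) can be sketched concretely. The deck targets .NET/NUnit, but the convention is language-agnostic; here is a minimal sketch using Python's `unittest`, with a hypothetical `Calculator` SUT invented for illustration:

```python
import unittest

class Calculator:
    """Hypothetical system under test (SUT)."""
    def divide(self, a, b):
        if b == 0:
            raise ZeroDivisionError("division by zero")
        return a / b

# One test class per SUT class: Calculator -> CalculatorTests.
class CalculatorTests(unittest.TestCase):
    # One or more test methods per SUT method; each name states the
    # method under test, the scenario, and the expected outcome.
    def test_divide_positive_operands_returns_quotient(self):
        self.assertEqual(Calculator().divide(10, 2), 5)

    def test_divide_by_zero_raises(self):
        with self.assertRaises(ZeroDivisionError):
            Calculator().divide(1, 0)
```

With this naming, answering "where are my method tests?" is a text search for the method name inside the matching test class.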
  • 6. Write Readable Tests
    Naming unit tests
    Naming variables
    Meaningful asserts
    Explicit test sections (AAA, SEV, …)
    Keep each test method within one screen
    Setting up and tearing down
    Explicit Data
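The explicit test sections mentioned above (AAA = Arrange-Act-Assert) can be sketched as follows; the `ShoppingCart` SUT is hypothetical, and `unittest` stands in for NUnit since the layout is identical in any xUnit-style framework:

```python
import unittest

class ShoppingCart:
    """Hypothetical SUT for illustration."""
    def __init__(self):
        self.items = []
    def add(self, name, price):
        self.items.append((name, price))
    def total(self):
        return sum(price for _, price in self.items)

class ShoppingCartTests(unittest.TestCase):
    def test_total_two_items_returns_sum_of_prices(self):
        # Arrange: build the SUT and its explicit input data.
        cart = ShoppingCart()
        cart.add("book", 10)
        cart.add("pen", 2)
        # Act: exercise exactly one behavior.
        total = cart.total()
        # Assert: one meaningful check with an explanatory message.
        self.assertEqual(total, 12, "total should be the sum of item prices")
```

The blank-line-separated sections make the test scannable without comments once the convention is established.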
  • 7. Writing Trustworthy Tests
    When to remove or change tests
    Production or test bugs
    Semantics or API changes or Refactoring
    Duplicated tests
    Avoid logic in tests
    Don’t use loops and conditionals
    Harder to read, likely to have bugs, difficult to name
    Testing only one thing
    Improper naming, failing asserts
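"Avoid logic in tests" and "test only one thing" can be shown together: instead of looping over input/expected pairs inside one test method, write one plainly named test per case. A sketch with a hypothetical `leap_year` SUT:

```python
import unittest

def leap_year(year):
    """Hypothetical SUT."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTests(unittest.TestCase):
    # Avoid: a for-loop over (input, expected) pairs in a single test.
    # Such a test is harder to read, likely to have bugs of its own,
    # and a failure does not say which case broke.

    def test_year_divisible_by_4_is_leap(self):
        self.assertTrue(leap_year(2024))

    def test_century_not_divisible_by_400_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_century_divisible_by_400_is_leap(self):
        self.assertTrue(leap_year(2000))
```

Each test can now fail independently with a name that explains exactly what went wrong.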
  • 8. Write Maintainable Tests
    Testing private or protected methods
    Making methods public
    Extracting methods to new classes
    Making methods internal
    Removing duplication
    Using helper methods
    Using [SetUp]
    Test class inheritance
    Use domain object names in comments and assertion messages
    Simplify refactoring/renaming with smart tools like ReSharper.
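Removing duplication with [SetUp] and helper methods looks like this in sketch form; `unittest`'s `setUp` plays the role of NUnit's [SetUp], and the `AccountRepository` SUT and `make_account` helper are names invented for illustration:

```python
import unittest

class AccountRepository:
    """Hypothetical in-memory SUT."""
    def __init__(self):
        self._accounts = {}
    def save(self, name, balance):
        self._accounts[name] = balance
    def balance_of(self, name):
        return self._accounts[name]

class AccountRepositoryTests(unittest.TestCase):
    def setUp(self):
        # Shared, essential initialization only: every test uses the repo.
        self.repo = AccountRepository()

    def make_account(self, name="alice", balance=100):
        # Helper method: hides irrelevant detail, keeps tests short.
        self.repo.save(name, balance)
        return name

    def test_balance_of_saved_account_returns_saved_amount(self):
        name = self.make_account(balance=100)
        self.assertEqual(self.repo.balance_of(name), 100)

    def test_save_overwrites_existing_balance(self):
        name = self.make_account(balance=100)
        self.repo.save(name, 50)
        self.assertEqual(self.repo.balance_of(name), 50)
```

Keeping only truly shared setup in `setUp` and pushing the rest into named helpers avoids the setup antipatterns discussed later.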
  • 9. Your Application’s Test API
    Test utilities and helpers
    Make test API known to developers
    Test class inheritance patterns
    DRY (Don’t Repeat Yourself)
    Creating testing guidance for developers
  • 10. Test class inheritance patterns
    Abstract test infrastructure class
    Base class containing essential common infrastructure.
    Common setup and teardown.
    Template test class
    Base class containing abstract test methods that derived classes must implement.
    Multiple implementations of same interface.
    Abstract test driver class
    Base class containing test method implementations inherited by all deriving classes.
    Testing class hierarchies.
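The "template / abstract test driver" idea above, applied to multiple implementations of the same interface, can be sketched as follows. The two stack classes are hypothetical; the base class holds the inherited test methods and each derived class only supplies the factory:

```python
import unittest

class ListStack:
    """Hypothetical implementation #1 of a stack interface."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()

class LinkedStack:
    """Hypothetical implementation #2 of the same interface."""
    def __init__(self):
        self._head = None
    def push(self, x):
        self._head = (x, self._head)
    def pop(self):
        x, self._head = self._head
        return x

class StackContractTests:
    """Abstract test driver: test methods inherited by all deriving
    classes. Deliberately NOT a TestCase, so the runner skips the base."""
    def create_stack(self):
        raise NotImplementedError  # derived classes must implement

    def test_pop_returns_last_pushed_value(self):
        s = self.create_stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

class ListStackTests(StackContractTests, unittest.TestCase):
    def create_stack(self):
        return ListStack()

class LinkedStackTests(StackContractTests, unittest.TestCase):
    def create_stack(self):
        return LinkedStack()
```

Every implementation automatically inherits the whole contract suite; adding a third implementation costs one small subclass.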
  • 11. Assertions Antipatterns
    The Giant: Too many assertions in a test (God Object?)
    Multiple aspects of same object (state check, identity)
    The Dodger: Lots of minor asserts checking side-effects but none testing core desired behavior (database testing)
    The Free Ride: Instead of writing a new test case, assertions are added to an existing one
    Overspecifying tests
    The Inspector: Specifying purely internal behavior
    Using mocks instead of stubs
    The Nitpicker: Assuming exact and complete match when not needed
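The Free Ride is the easiest of these to show in code. A sketch with a hypothetical `Mailer` SUT: the commented-out version piggybacks an unrelated assertion on an existing test; the fix is two focused tests:

```python
import unittest

class Mailer:
    """Hypothetical SUT."""
    def __init__(self):
        self.sent = []
    def send(self, to, body):
        if "@" not in to:
            raise ValueError("invalid address")
        self.sent.append((to, body))

class MailerTests(unittest.TestCase):
    # Free Ride (avoid): one test asserting both the happy path AND
    # the validation error, e.g. tacking
    #     self.assertRaises(ValueError, mailer.send, "no-at-sign", "x")
    # onto test_send_valid_address_records_message below.

    def test_send_valid_address_records_message(self):
        mailer = Mailer()
        mailer.send("a@b.com", "hi")
        self.assertEqual(mailer.sent, [("a@b.com", "hi")])

    def test_send_invalid_address_raises(self):
        mailer = Mailer()
        with self.assertRaises(ValueError):
            mailer.send("no-at-sign", "hi")
```

Splitting the cases keeps each test about one behavior, so a failure immediately names what broke.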
  • 12. Setup Antipatterns
    Initializing objects in [SetUp] that are used only in some of the tests
    Excessive Setup: setup “noise” makes tests hard to read
    Fakes and mocks initialized in [SetUp]
    The Mockery: too many mocks and stubs shift the test’s focus to fake data instead of the SUT
  • 13. Isolation Antipatterns
    Generous Leftovers: constrained test order
    The Peeping Tom: Shared state corruption
    Hidden test call
    The Local Hero: Test running only on development box and failing elsewhere
    Have your Continuous Integration up and running
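The Peeping Tom (shared state corruption) can be sketched as: a SUT instance shared across tests lets one test "see" state left behind by another, so results depend on execution order. All names here are hypothetical; the fix is a fresh instance per test:

```python
import unittest

class Counter:
    """Hypothetical SUT."""
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1

# Peeping Tom (avoid): a module-level SUT shared by every test,
# e.g.  shared_counter = Counter()  -- after one test increments it,
# another test's "starts at zero" check fails depending on order.

class CounterTests(unittest.TestCase):
    def setUp(self):
        # A fresh SUT per test keeps tests isolated and order-independent.
        self.counter = Counter()

    def test_increment_once_sets_value_to_1(self):
        self.counter.increment()
        self.assertEqual(self.counter.value, 1)

    def test_new_counter_starts_at_zero(self):
        self.assertEqual(self.counter.value, 0)
```

Order-independent tests are also what makes the Continuous Integration run trustworthy.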
  • 14. Other Antipatterns
    The Loudmouth: a test cluttering the console with diagnostic messages
    The Secret Catcher: apparently no asserts; test success relies on the framework catching an expected exception
    The Enumerator: test case method names like test1, test2, test3, …
    Success Against All Odds: a test written to pass first rather than fail first
  • 15. Happy TDD!
    Questions & Answers
  • 16. Resources
    Roy Osherove
    The Art of Unit Testing
    Kent Beck
    Test-Driven Development by Example