Test Driven Development: Thousands of Red-Green-Refactor after…
UGIALT.NET – Milano, 23-01-2010
Omid Ehsani, Senior Consultant & Solution Architect
email@example.com
Automated Testing
Good automated tests should be: isolated, fast, repeatable, maintainable, …
- How can we achieve those goals?
- Do the same rules ensure good Automated Test Suites (ATS)?
- Any difference between a one-man band and a team?
Working in a Team
Consider typical scenarios:
- Rebuilding development machines
- Introducing new developers
- Introducing Continuous Integration servers
Therefore:
- Make building and running the ATS easy
- Put build requirements under source control
- Minimize test dependencies on external configuration
- Make tests run fast
ATS Organization
- Include tests in the same Visual Studio solution (but in separate projects)
- Separate different test types (mainly unit tests from integration tests)
- Human factor: how frequently do developers run automated tests?
- Create a safe green zone
Finding Tests
Where are my project tests? My class tests? My method tests?
Map tests to your system under test (SUT):
- Use naming and namespaces to tie test projects/assemblies to production code
- Unit testing: one test class per SUT class, one or more test methods per SUT method
- Integration / user acceptance testing: one test class per feature
[Diagram: ATS mapping]
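The deck's examples target NUnit/C#; as a language-neutral sketch, the same "one test class per SUT class, one or more test methods per SUT method" mapping in Python's unittest might look like this (the Calculator SUT is a hypothetical example, not from the deck):

```python
import unittest

# Hypothetical SUT class.
class Calculator:
    def add(self, a, b):
        return a + b

    def divide(self, a, b):
        return a / b

# The test class name mirrors the SUT class name, so tests are easy to find.
class CalculatorTests(unittest.TestCase):
    # Naming convention: method under test, scenario, expected behavior.
    def test_add_two_positive_numbers_returns_sum(self):
        self.assertEqual(Calculator().add(2, 3), 5)

    def test_divide_by_zero_raises_error(self):
        with self.assertRaises(ZeroDivisionError):
            Calculator().divide(1, 0)
```

With this convention, answering "where are my class tests?" is a matter of matching names, and namespaces/packages can mirror the production layout the same way.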
Write Readable Tests
- Naming: unit tests, variables
- Meaningful asserts
- Explicit test sections (AAA, SEV, …)
- Keep each test method on one screen
- Setting up and tearing down
- Explicit data
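A minimal sketch of the explicit Arrange-Act-Assert sections in Python's unittest (the Cart SUT is hypothetical, used only to illustrate the layout):

```python
import unittest

# Hypothetical SUT: a simple shopping cart.
class Cart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

class CartTests(unittest.TestCase):
    def test_total_with_two_items_returns_their_sum(self):
        # Arrange: explicit data, meaningful variable names.
        cart = Cart()
        cart.add("book", 10)
        cart.add("pen", 2)
        # Act
        total = cart.total()
        # Assert: one meaningful assertion, with a message.
        self.assertEqual(total, 12, "cart total should equal the sum of item prices")
```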
Writing Trustworthy Tests
When to remove or change tests:
- Production or test bugs
- Semantics or API changes, or refactoring
- Duplicated tests
Avoid logic in tests:
- Don't use loops and conditionals: harder to read, likely to contain bugs, difficult to name
Test only one thing:
- Multiple concerns lead to improper naming and unclear failing asserts
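Instead of hand-rolled loops and conditionals hiding which case failed, the framework's parameterization can carry the repetition: NUnit has [TestCase]; a rough Python analogue uses subTest, so each case is reported individually. The is_even function is a hypothetical SUT:

```python
import unittest

def is_even(n):  # hypothetical SUT function
    return n % 2 == 0

class IsEvenTests(unittest.TestCase):
    def test_is_even_known_values(self):
        # No if/else logic deciding what to assert; one table of explicit
        # cases, each reported separately if it fails.
        cases = [(2, True), (3, False), (0, True), (-4, True)]
        for value, expected in cases:
            with self.subTest(value=value):
                self.assertEqual(is_even(value), expected)
```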
Write Maintainable Tests
Testing private or protected methods:
- Make the method public
- Extract it to a new class
- Make it internal
Removing duplication:
- Use helper methods
- Use [SetUp]
- Use test class inheritance
Use domain object names in comments and assertion messages.
Simplify refactoring/renaming with smart tools like ReSharper.
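The deck's [SetUp] is NUnit; Python's setUp plays the same role. A sketch of removing duplication with setUp plus a helper (factory) method, using a hypothetical Order SUT:

```python
import unittest

# Hypothetical SUT: an order that applies a discount.
class Order:
    def __init__(self, amount, discount=0.0):
        self.amount = amount
        self.discount = discount

    def total(self):
        return self.amount * (1 - self.discount)

class OrderTests(unittest.TestCase):
    def setUp(self):
        # NUnit [SetUp] equivalent: only state that every test uses.
        self.default_amount = 100.0

    # Helper (factory) method: construction duplication lives here,
    # so renaming Order's constructor touches one place.
    def make_order(self, discount=0.0):
        return Order(self.default_amount, discount)

    def test_total_without_discount_equals_amount(self):
        self.assertEqual(self.make_order().total(), 100.0)

    def test_total_with_ten_percent_discount(self):
        self.assertAlmostEqual(self.make_order(discount=0.1).total(), 90.0)
```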
Your Application’s Test API
- Test utilities and helpers
- Make the test API known to developers
- Test class inheritance patterns
- DRY (Don’t Repeat Yourself)
- Create testing guidance for developers
Test Class Inheritance Patterns
- Abstract test infrastructure class: a base class containing essential common infrastructure (common setup and teardown).
- Template test class: a base class containing abstract test methods that derived classes must implement; useful for multiple implementations of the same interface.
- Abstract test driver class: a base class containing test method implementations inherited by all deriving classes; useful for testing class hierarchies.
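A sketch of the template test class pattern in Python's unittest: one shared contract suite run against two hypothetical implementations of the same stack interface. Only the concrete subclasses mix in TestCase, so the abstract base itself is never collected as a runnable test:

```python
import unittest

# Hypothetical SUTs: two implementations of the same stack interface.
class ListStack:
    def __init__(self):
        self._items = []

    def push(self, x):
        self._items.append(x)

    def pop(self):
        return self._items.pop()

class LinkedStack:
    def __init__(self):
        self._head = None

    def push(self, x):
        self._head = (x, self._head)

    def pop(self):
        x, self._head = self._head
        return x

# Template test class: shared tests; derived classes supply the SUT.
class StackContractTests:
    def create_stack(self):
        raise NotImplementedError  # derived classes must implement

    def test_pop_returns_last_pushed_value(self):
        stack = self.create_stack()
        stack.push(1)
        stack.push(2)
        self.assertEqual(stack.pop(), 2)

# Each concrete class runs the full shared suite against one implementation.
class ListStackTests(StackContractTests, unittest.TestCase):
    def create_stack(self):
        return ListStack()

class LinkedStackTests(StackContractTests, unittest.TestCase):
    def create_stack(self):
        return LinkedStack()
```

Adding a third implementation costs one small subclass; every contract test then runs against it for free.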
Assertion Antipatterns
- The Giant: too many assertions in one test (a God Object?), checking multiple aspects of the same object (state, identity)
- The Dodger: lots of minor asserts checking side effects, but none testing the core desired behavior (common in database testing)
- The Free Ride: instead of writing a new test case, assertions are added to an existing one
Overspecifying tests:
- The Inspector: specifying purely internal behavior; using mocks instead of stubs
- The Nitpicker: assuming an exact and complete match when it isn't needed
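One way to avoid The Giant and The Free Ride is to give each behavior its own named test rather than piling assertions onto an existing one. A sketch with a hypothetical Counter SUT:

```python
import unittest

# Hypothetical SUT.
class Counter:
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1
        return self.value

# Each behavior gets one focused, properly named test, instead of a single
# giant test asserting construction, increment, and return value at once.
class CounterTests(unittest.TestCase):
    def test_new_counter_starts_at_zero(self):
        self.assertEqual(Counter().value, 0)

    def test_increment_returns_new_value(self):
        self.assertEqual(Counter().increment(), 1)
```

When one behavior breaks, exactly one test name turns red, which is most of the diagnosis.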
Setup Antipatterns
- Initializing objects in [SetUp] that only some of the tests use
- Excessive Setup: setup “noise” makes tests hard to read
- Fakes and mocks initialized in [SetUp]
- The Mockery: too many mocks and stubs shift the test’s focus onto fake data instead of the SUT
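A sketch of keeping setup lean, with a hypothetical Mailer SUT: only state that every test needs lives in setUp; test-specific data stays inside the test that uses it, so readers never chase "noise" back to the fixture:

```python
import unittest

class Mailer:  # hypothetical SUT
    def __init__(self, sender):
        self.sender = sender

    def compose(self, to, body):
        return {"from": self.sender, "to": to, "body": body}

class MailerTests(unittest.TestCase):
    def setUp(self):
        # Only what EVERY test needs goes here; anything else is noise.
        self.mailer = Mailer("noreply@example.com")

    def test_compose_sets_sender(self):
        message = self.mailer.compose("a@example.com", "hi")
        self.assertEqual(message["from"], "noreply@example.com")

    def test_compose_sets_recipient(self):
        # Test-specific data stays inside the test, not in setUp.
        recipient = "b@example.com"
        message = self.mailer.compose(recipient, "hi")
        self.assertEqual(message["to"], recipient)
```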
Isolation Antipatterns
- Generous Leftovers: constrained test order
- The Peeping Tom: shared-state corruption
- Hidden test call
- The Local Hero: tests that run only on the development box and fail elsewhere; keep your Continuous Integration server up and running
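Generous Leftovers and The Peeping Tom both stem from state shared across tests. A sketch of the fix, with a hypothetical Registry SUT: build fresh state per test so the tests pass in any order, with no leftovers:

```python
import unittest

class Registry:  # hypothetical SUT
    def __init__(self):
        self._entries = {}

    def register(self, key, value):
        self._entries[key] = value

    def count(self):
        return len(self._entries)

class RegistryTests(unittest.TestCase):
    def setUp(self):
        # A fresh instance per test: order-independent, no shared module-level
        # registry for one test to corrupt and another to peep at.
        self.registry = Registry()

    def test_register_one_entry(self):
        self.registry.register("a", 1)
        self.assertEqual(self.registry.count(), 1)

    def test_new_registry_is_empty(self):
        self.assertEqual(self.registry.count(), 0)
```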
Other Antipatterns
- The Loudmouth: a test cluttering the console with diagnostic messages
- The Secret Catcher: apparently no asserts; test success relies on the framework catching exceptions
- The Enumerator: test method names like test1, test2, test3, …
- Success Against All Odds: tests written to pass first rather than fail first
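The antidote to The Secret Catcher is to assert the expected failure explicitly rather than letting a stray exception (or its absence) decide the verdict. A sketch with a hypothetical parse_age function:

```python
import unittest

def parse_age(text):  # hypothetical SUT
    age = int(text)
    if age < 0:
        raise ValueError("age cannot be negative")
    return age

class ParseAgeTests(unittest.TestCase):
    def test_negative_age_raises_value_error(self):
        # Explicit expectation: the test documents WHICH exception matters,
        # instead of silently relying on the framework catching anything.
        with self.assertRaises(ValueError):
            parse_age("-1")

    def test_valid_age_is_parsed(self):
        self.assertEqual(parse_age("42"), 42)
```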