Patterns in Testing


Discussion of testing patterns in the areas of unit testing, functional testing and system testing.

Published in: Technology


  1. Patterns in Testing: for Code, Components and Systems
  2. Overview
     • Why write tests?
     • Purist vs. Practitioner
     • New Code vs. Legacy Code
     • Unit Testing
     • Functional Testing
     • User Driven Testing
     • Test Automation
     • Ecosystem
     5/20/2010, presented by Joe Chavez
  3. Why write tests?
     • Catch defects during implementation
     • Software testing is not the same as hardware testing: there are more moving parts
     • Documentation is often stale or non-existent; tests provide an opportunity to document the intent of a class, component, or system with working code
     • Once written, tests are repeatable with very little overhead
     • Tests help a team maintain, and more importantly change, code with reduced risk
     • Software tests can drive hardware testing
     • Automated tools cannot capture the intent of the code, especially dynamic intent; such tools complement the coding of tests, they do not replace it
  4. Purist vs. Practitioner
  5. Purist vs. Practitioner
     • Balance is achieved with pragmatism; context is key:
       • Type of code: new, legacy, generated, etc.
       • Robustness of tools
       • Complexity of design
       • Developer skillset
       • Management support
     • The perception is often that writing test code does not contribute to the bottom line
     • In reality, a few tests now can save time during the integration/delivery phases of a project
  6. New Code vs. Legacy Code
     • New code
       • Can benefit greatly from a test-driven approach
       • Identify the critical areas in the requirements/design that deliver the most value
       • Can typically use best-of-breed testing tools and frameworks
     • Legacy code
       • Improve: white-box testing with a test-driven approach
       • Incorporate: characterize actual behavior in isolation
       • Rewrite: analyze the existing code and write tests as part of the new code
     • Pair programming is helpful
       • Can work in code-review mode while writing tests
       • Ideal situation: the original programmer is part of the pair
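The "characterize actual behavior in isolation" step for legacy code can be sketched as a characterization test: a test that records what the code actually does today, right or wrong, so later changes can be checked against it. This is a minimal sketch using Python's standard `unittest` module; `legacy_round_price` and its clamping quirk are hypothetical stand-ins for real legacy code.

```python
import unittest

# Hypothetical legacy function whose exact behavior is undocumented.
# A characterization test pins down what it *actually* does today,
# so a later rewrite can be verified against the recorded behavior.
def legacy_round_price(cents):
    # Quirk preserved from the legacy code: negative amounts clamp to 0.
    if cents < 0:
        return 0
    return ((cents + 5) // 10) * 10  # round to the nearest 10, ties up

class CharacterizeLegacyRounding(unittest.TestCase):
    def test_rounds_up_at_midpoint(self):
        self.assertEqual(legacy_round_price(95), 100)

    def test_negative_clamps_to_zero(self):
        # Not necessarily "correct", but it is the current behavior,
        # and a rewrite that silently changes it should fail this test.
        self.assertEqual(legacy_round_price(-30), 0)
```

The point of the second test is that it documents a quirk rather than a requirement; whether to keep or fix the quirk becomes an explicit decision instead of an accident.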
  7. New Code vs. Legacy Code: Special Cases
     • Open source
       • Usually comes with a test suite; use caution if there is no test suite
     • Generated code
       • For C/C++, characterize the memory management
       • Explore usage of the generated code by writing tests
  8. Unit Testing
     • Single-purpose testing
       • Focused on the public methods of a class
       • Viewed through the caller's eyes
       • Typically one (1) assertion check for pass/fail
     • Code scenarios to verify correct behavior under normal and abusive conditions
     • Testing frameworks
       • C#: NUnit
       • C++: UnitTest++
       • Java: JUnit
     • Mocking frameworks
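The bullets above (public methods only, one assertion per test, normal and abusive conditions) can be sketched as follows. The deck names NUnit, UnitTest++, and JUnit; this example uses Python's standard `unittest` as a stand-in with the same structure, and the `Stack` class is a hypothetical unit under test.

```python
import unittest

# Hypothetical class under test, exercised only through its public methods.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    # Normal condition: one assertion per test keeps failures unambiguous.
    def test_pop_returns_last_pushed_item(self):
        s = Stack()
        s.push(42)
        self.assertEqual(s.pop(), 42)

    # Abusive condition: misuse should fail loudly, not corrupt state.
    def test_pop_on_empty_stack_raises(self):
        s = Stack()
        with self.assertRaises(IndexError):
            s.pop()
```

Keeping each test to a single assertion means a failure report names exactly one behavior, which is what makes the test readable as documentation of intent.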
  9. Functional Testing
     • Verify the integration of several components and/or classes
     • Use the same testing framework that the unit tests use
     • Good for integration-testing new/legacy/open source/generated code
     • Verification scenarios
       • Data flow between components
       • Actual vs. expected results at each stage
       • Use case scenarios
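A functional test of this kind checks the data flow across components, asserting actual vs. expected results at each stage rather than only at the end. This is a minimal sketch in the same `unittest` framework as the unit-test example; the parser and calculator components are hypothetical.

```python
import unittest

# Two hypothetical components wired together: a parser feeding a calculator.
def parse_csv_line(line):
    return [int(field) for field in line.split(",")]

def total(values):
    return sum(values)

class PipelineFunctionalTest(unittest.TestCase):
    def test_data_flows_from_parser_to_calculator(self):
        line = "10,20,30"
        # Stage 1: assert the intermediate result, not just the final answer,
        # so a failure points at the component that broke.
        values = parse_csv_line(line)
        self.assertEqual(values, [10, 20, 30])
        # Stage 2: final output of the integrated pipeline.
        self.assertEqual(total(values), 60)
```

Because it reuses the unit-testing framework, this integration test runs in the same build step with no extra tooling.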
  10. User Driven Testing
     • Data-driven system
       • A system that can be configured for execution based on an input data set and will produce an expected set of output data
     • Test application
       • May require development of a test application, typically a console application
       • Accepts file-based input and/or command-line options
       • Produces file-based output
  11. Test Automation
     • Run tests early and often
     • Integrate into the development/build process
       • Visual Studio post-build event
       • msbuild/ant/make scripts
       • Continuous Integration support
     • Silent success, error on failure
       • If all tests succeed, report silently ("N tests passed")
       • If any test fails, fail the build
       • Report the failed test with context (i.e., the failed assertion, stack trace)
       • Take corrective action before moving on
     5/20/2010, presented by Joe Chavez
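The "silent success, error on failure" policy can be sketched as a small runner that a build script invokes, where the exit code is what passes or fails the build. This is a minimal sketch using Python's standard `unittest` runner; the smoke test and wiring are illustrative, not a specific CI product's API.

```python
import sys
import unittest

# Sketch of a "silent success, error on failure" runner that a build step
# (a post-build event, msbuild/ant/make, or a CI job) could invoke.
def run_suite(suite):
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    if result.wasSuccessful():
        # Silent success: a one-line summary, no per-test noise.
        print(f"{result.testsRun} tests passed")
        return 0
    # The runner has already printed each failure with its assertion
    # context and stack trace; a nonzero exit code fails the build.
    return 1

class SmokeTest(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTest)
exit_code = run_suite(suite)
# In a real build step, the script would end with: sys.exit(exit_code)
```

Make-style tools and CI servers treat any nonzero exit code as a failed step, so this single convention is enough to stop the build until the test is fixed.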
  12. In Context