Patterns in Testing

Discussion of testing patterns in the areas of unit testing, functional testing and system testing.

Presentation Transcript

  • Patterns in Testing
    for Code, Components and Systems
    Presented by Joe Chavez, 5/20/2010
  • Overview
    Why write tests?
    Purist vs. Practitioner
    New Code vs. Legacy Code
    Unit Testing
    Functional Testing
    User Driven Testing
    Test Automation
    Ecosystem
  • Why write tests?
    Catch defects during implementation
    Software testing is not the same as hardware testing
    More moving parts
    Documentation is often stale or non-existent
    Tests provide an opportunity to document the intent of a class, component or system with working code
    Once written, tests are repeatable with very little overhead
    Tests help a team maintain, and more importantly change, code with reduced risk
    Software tests can drive hardware testing
    Automated tools cannot capture the intent of the code – especially dynamic intent
    These tools complement the coding of tests; they do not replace it
  • Purist vs. Practitioner
    Balance achieved with Pragmatism
    Context is key
    Type of code: new, legacy, generated, etc.
    Robustness of tools
    Complexity of design
    Developer skillset
    Management support
    The perception is often that writing test code does not contribute to the bottom line
    In reality, a few tests now can save time during the integration/delivery phases of a project
  • New Code vs. Legacy Code
    New Code
    Can benefit greatly from a Test Driven approach
    Identify critical areas in the requirements/design that deliver the most value
    Can typically use best-of-breed testing tools and frameworks
    Legacy Code
    Improve – White box with Test Driven approach
    Incorporate – Characterize actual behavior in isolation (see the sketch after this slide)
    Rewrite – Analyze existing code and write tests as part of new code
    Pair programming is helpful
    Can work in code review mode while writing tests
    Ideal situation: original programmer is part of the pair
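    A minimal characterization-test sketch in JUnit, assuming a hypothetical LegacyPriceCalculator class standing in for real legacy code; the assertions pin down the behavior that was actually observed, not what a specification says it should be:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Characterization test: records the observed behavior of legacy code
    // so that later changes can be made with reduced risk.
    // LegacyPriceCalculator is a hypothetical stand-in for real legacy code.
    public class LegacyPriceCalculatorCharacterizationTest {

        @Test
        public void discountAppliedToLargeOrders() {
            LegacyPriceCalculator calc = new LegacyPriceCalculator();
            // Expected value captured by running the legacy code,
            // not derived from the requirements.
            assertEquals(90.0, calc.totalFor(10, 10.0), 0.001);
        }

        @Test
        public void zeroQuantityReturnsZero() {
            LegacyPriceCalculator calc = new LegacyPriceCalculator();
            assertEquals(0.0, calc.totalFor(0, 10.0), 0.001);
        }
    }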
  • New Code vs. Legacy Code
    Special Cases:
    Open Source
    Usually comes with a test suite; use caution if there is none
    Generated Code
    For C/C++ - characterize memory management
    Explore usage of generated code through writing tests (a learning-test sketch follows this slide)
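    A minimal "learning test" sketch in JUnit, exercising the standard java.util.regex API purely as a stand-in for an open source or generated dependency; the goal is to document, in executable form, the assumptions made about how that code behaves:

    import org.junit.Test;
    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import java.util.regex.Pattern;

    // Learning test: documents assumptions about how a third-party or
    // generated API behaves. java.util.regex is used here only as an
    // example dependency; substitute the open source or generated code
    // actually being explored.
    public class RegexLearningTest {

        @Test
        public void matchesCoversTheWholeInput() {
            assertTrue(Pattern.matches("\\d+", "12345"));
            assertFalse(Pattern.matches("\\d+", "123abc")); // partial match is not enough
        }
    }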
  • Unit Testing
    Single Purpose Testing
    Focused on public methods of a class
    View the class through the caller's eyes
    Typically one (1) assertion check for pass/fail
    Code scenarios to verify correct behavior under normal and abusive conditions (see the sketch after this slide)
    Testing Frameworks
    C# - NUnit
    C++ - UnitTest++
    Java – JUnit
    Mocking Frameworks
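    A minimal JUnit sketch of the points above, assuming a hypothetical Account class: each test looks at one public method through the caller's eyes, carries a single assertion, and covers one normal and one abusive condition:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Unit tests for a hypothetical Account class, written from the
    // caller's point of view: one scenario and one assertion per test.
    public class AccountTest {

        @Test
        public void depositIncreasesBalance() {            // normal condition
            Account account = new Account();
            account.deposit(50.0);
            assertEquals(50.0, account.getBalance(), 0.001);
        }

        @Test(expected = IllegalArgumentException.class)
        public void negativeDepositIsRejected() {          // abusive condition
            new Account().deposit(-1.0);
        }
    }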
  • Functional Testing
    Verify the integration of several components and/or classes
    Use the same testing framework that unit tests use
    Good for integration testing new/legacy/open source/generated code
    Verification scenarios
    Data flow between components
    Actual vs. expected results at each stage (see the sketch after this slide)
    Use case scenarios
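    A minimal functional-test sketch in JUnit, assuming hypothetical CsvParser and ReportBuilder components: data flows through both stages and actual results are checked against expected results at each step:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    import java.util.List;

    // Functional test: verifies the integration of two hypothetical
    // components, CsvParser and ReportBuilder, using the same framework
    // as the unit tests.
    public class OrderReportFunctionalTest {

        @Test
        public void csvInputFlowsThroughToReport() {
            CsvParser parser = new CsvParser();
            ReportBuilder builder = new ReportBuilder();

            // Stage 1: parse raw input and check the intermediate result.
            List<Order> orders = parser.parse("widget,2\nbolt,5");
            assertEquals(2, orders.size());

            // Stage 2: feed the parsed data into the next component.
            Report report = builder.build(orders);
            assertEquals(7, report.totalItems());
        }
    }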
  • User Driven Testing
    Data Driven System
    A system that can be configured for execution based on an input data set and will produce an expected set of output data
    Test Application
    May require development of a test application – typically a console application
    Accepts file-based input and/or command line options
    Produces file-based output (see the sketch after this slide)
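    A minimal sketch of such a console test application, assuming a hypothetical PricingEngine as the system under test: it reads a file-based input data set, runs the system, and writes file-based output that can be compared against an expected data set:

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    // Console test driver for a data driven system. PricingEngine is a
    // hypothetical system under test; input and output are plain text
    // files so that testers can supply their own data sets.
    public class PricingTestDriver {

        public static void main(String[] args) throws Exception {
            if (args.length != 2) {
                System.err.println("Usage: PricingTestDriver <input-file> <output-file>");
                System.exit(2);
            }
            Path input = Paths.get(args[0]);
            Path output = Paths.get(args[1]);

            PricingEngine engine = new PricingEngine();
            List<String> results = new ArrayList<>();
            for (String line : Files.readAllLines(input)) {
                results.add(engine.process(line));   // one output line per input line
            }
            Files.write(output, results);
        }
    }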
  • Test Automation
    Run tests early and often
    Integrate into the development/build process
    Visual Studio post build event
    msbuild/ant/make scripts
    Continuous Integration support
    Silent success and error on failure
    If all tests succeed, report silently (e.g., "N tests passed")
    If any test fails, fail the build
    Report each failed test with context (e.g., failed assertion, stack trace); see the sketch after this slide
    Take corrective action before moving on
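    A minimal sketch of a build-time runner using JUnit's programmatic API (the suite names are the hypothetical classes sketched earlier): it reports silently on success and exits non-zero on any failure, so a post-build event, msbuild/ant/make script, or CI server fails the build:

    import org.junit.runner.JUnitCore;
    import org.junit.runner.Result;
    import org.junit.runner.notification.Failure;

    // Invoked from a post-build event or build script. AccountTest and
    // OrderReportFunctionalTest are the hypothetical suites shown above.
    public class BuildTestRunner {

        public static void main(String[] args) {
            Result result = JUnitCore.runClasses(AccountTest.class,
                                                 OrderReportFunctionalTest.class);
            if (result.wasSuccessful()) {
                // Silent success: one summary line, no noise.
                System.out.println(result.getRunCount() + " tests passed");
                return;
            }
            // Error on failure: print each failed test with its assertion
            // message and stack trace, then fail the build.
            for (Failure failure : result.getFailures()) {
                System.err.println(failure.toString());
                System.err.println(failure.getTrace());
            }
            System.exit(1);
        }
    }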
  • In Context