Test Automation Framework Designs

Test automation framework designs, by Martin Lienhard. In this slide deck Martin describes the phases of designing a test automation framework, and why we should move far, far away from record & playback test scripts. Topics include data-driven and parameterized tests fed from external files, databases, etc.; external UI maps of locators; using multiple test tools (Selenium/WebDriver being the favorite, of course); and testing across multiple environments on parallel deployment paths with different application versions.

Online courses offered by Martin:
https://www.udemy.com/beginning-webdriver-and-java/

Published in: Technology

  1. Test Automation Framework Designs
     Designing standardized frameworks instead of simple scripting
     By Martin Lienhard
  2. The Great Divide
     • There has been a divide separating QA and Dev, due in large part to 3rd party tool vendors and their proprietary test tools
     • Open source tools have helped significantly to bridge that divide, so that we can work together to create better software
     • Most QAs have the bigger picture of what they want to test and how to do it, but have trouble building robust automation frameworks because they lack the technical skills
     • Most developers focus on testing code in isolation, but lack testing experience from a system-wide or end-to-end perspective, so they have trouble building a test automation framework that suits the needs of QA
  3. 1st Phase of Test Progression
     • Most QAs start out creating test scripts by using record & playback tools
     • Lack conditional logic and looping
     • Contain static data, UI locators, URLs, etc.
     • Cannot be executed iteratively
     • Cannot be reused as modules
     • Should not be chained together as dependencies
     • Lack configuration and global properties
     • Cannot abstract away from the test tool or test runner
  4. Record & Playback Test Scripts
  5. 2nd Phase of Test Progression
     • QAs usually improve their tests by adding data-driven techniques
     • Create variables and arrays to store data
     • Create parameterized functions or methods to store data
     • Add looping to recorded scripts
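The 2nd-phase steps above can be sketched in a few lines of Java: the hard-coded values from a recorded script are pulled into a parameterized method, which is then driven by a loop over a data array. The names here (login, the user array) are hypothetical, not from the slides.

```java
// Minimal data-driven sketch: parameterize the recorded steps, then loop.
class DataDrivenSketch {
    // In a real framework this method would drive the UI via a test tool;
    // here it just builds the credential string it would submit.
    static String login(String username, String password) {
        return "submitting " + username + "/" + password;
    }

    public static void main(String[] args) {
        // Test data stored in an array instead of inline in the script
        String[][] users = {
            {"alice", "secret1"},
            {"bob",   "secret2"},
        };
        for (String[] user : users) {   // looping added to the recorded script
            System.out.println(login(user[0], user[1]));
        }
    }
}
```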
  6. 3rd Phase of Test Progression
     • QAs usually improve their tests by creating a business function library
     • Modular, reusable test libraries of functions
     • Create file readers that pull in parameters from text files, CSV files, etc.
     • Abstract the data out of the tests by externalizing it
     • Create parameterized or iterative tests that are data-driven
     • Abstract the UI locators out of the tests by externalizing UI maps
     • Create global properties and configuration files
     • A vast array of disparate data pools and UI maps is created, which usually becomes unmanageable after a couple of years
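A file reader of the kind the 3rd phase describes might look like the sketch below: each row of an externalized data file becomes one set of test parameters. The parsing is a naive comma split (no quoted fields); a real framework would use a proper CSV library.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a CSV reader that externalizes test data out of the tests.
class CsvDataReader {
    // Naive parser for illustration only: does not handle quoted commas.
    static String[] parseCsvLine(String line) {
        return line.split(",");
    }

    // Each non-blank line of the file becomes one row of test parameters.
    static List<String[]> parse(List<String> lines) {
        List<String[]> rows = new ArrayList<>();
        for (String line : lines) {
            if (!line.isBlank()) rows.add(parseCsvLine(line));
        }
        return rows;
    }
}
```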
  7. 4th Phase of Test Progression
     • QAs usually improve their tests by creating a keyword-driven library:
     • Create ODBC functions that read data from spreadsheets and databases
     • Create spreadsheets of keywords that represent business domain language
       – Low-level keywords describe test operations
       – High-level keywords describe business language
       – Intermediate-level keywords map business functions to test operations
       – Tests become readable by business users and non-technical QAs
  8. …continued
     • Create spreadsheets of data that drive the tests
     • Query data from the application database as input and verification points for tests
     • A hierarchy of abstracted keyword spreadsheets is created
     • Now non-technical BAs and QAs are creating automated tests
     • A vast array of disparate keyword spreadsheets is created, which usually becomes unmanageable after a couple of years
     • Nearly all 3rd party vendor test tools live in this area today
       – Most of these tools use legacy scripting languages, which are not compatible with today’s technology
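The keyword-driven idea above reduces to a dispatch table: each spreadsheet row holds a keyword plus an argument, and the runner maps keywords to operations. The keywords and the Map-based design here are illustrative, not taken from the slides.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal keyword-driven runner: keywords from a spreadsheet row are
// looked up in a dispatch table and applied to their argument.
class KeywordRunner {
    static final Map<String, Function<String, String>> KEYWORDS = new HashMap<>();
    static {
        // low-level keywords describing test operations (hypothetical)
        KEYWORDS.put("OPEN", url -> "opened " + url);
        KEYWORDS.put("TYPE", text -> "typed " + text);
    }

    static String run(String keyword, String arg) {
        Function<String, String> op = KEYWORDS.get(keyword);
        if (op == null) throw new IllegalArgumentException("unknown keyword: " + keyword);
        return op.apply(arg);
    }
}
```

A spreadsheet reader would feed rows of (keyword, argument) pairs straight into run, which is what lets non-technical users author tests.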
  9. 5th Phase of Test Progression
     • QAs become more technical and improve their tests by using object-oriented testing techniques:
     • Objects are designed and created to further modularize and reuse code with very low maintenance
     • Test code is stored in objects
       – The page object design is used
       – Component object designs are used
     • Test data are stored in objects
     • Configuration and properties are stored in objects
     • Calls are made to the application and developer-written APIs to make tests “smarter”
  10. …continued
     • Continuous integration environments are utilized to run tests
     • QA moves into white box testing to drive quality deeper into the application
       – UI tests are run “headless” to make them execute faster
     • QA expands the test framework to include multiple test tools and test runners
       – Databases, web services, and APIs are created to reuse code and data across multiple test tools
       – Test tools and 3rd party utilities can be upgraded to newer versions with enhanced functionality with minor impact to the test framework
  11. …continued
     • Developers can now run the tests and contribute to the overall test suites
     • More 3rd party vendors are rewriting their test tools to include object-oriented languages to accommodate these technical QAs
       – Due to the cost of the tools and their proprietary nature, it is difficult to place these tools into a continuous integration environment to share with developers
  12. What is a test automation framework?
     • A test automation framework is an application that allows you to write a series of tests without worrying about the constraints or limitations of the underlying test tools
  13. Benefits of building a Test Framework
     • Tool agnostic
       – Abstracts away low-level commands
       – Utilize many test tools
         • Test tools such as Selenium, WebDriver, etc.
         • Test runners such as unit test frameworks
         • Other common utilities
       – Perform multiple levels and types of testing
         • Functional, regression, load, performance, unit, integration, etc.
  14. …continued
     • Generic test API
       – Common codebase and database
       – Exception and error handling
       – Modularized, reusable, and maintainable code and data
       – Standardized test idioms
     • Test configuration
       – Configurable test suites
       – Global test properties
  15. …continued
     • Automatic test versioning
       – No need to get a specific version from the source control repository
         • Always get the latest version
       – Eliminates deployment to multiple machines for testing different versions of an application
         • Deploy to a single server machine
  16. …continued
     • Multi-threading
       – Run tests in parallel
       – Run on test machines in the grid or cloud
       – Test across multiple environments and application versions simultaneously
     • Continuous integration
       – Run in a continuous integration environment
       – Share tests and collaborate with other teams
  17. Test Framework Architecture
  18. Test Suite
     • A collection of tests to run
     • Has test parameters
       – e.g.: application, environment, version, etc.
     • Has test groups
       – e.g.: regression, build, etc.
  19. Test Suite example
  20. Test Configuration
     • Any configurable test item that is separate from the test suite
       – e.g.: web site URLs, database URLs, usernames, passwords, lab machine configurations, etc.
     • Separated from the rest of the framework for information security and regulatory compliance
  21. Test Configuration example
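The configuration example image did not survive in this transcript. A plain sketch of the idea: site URLs and credentials live in a properties file kept apart from the rest of the framework. The keys and values below are hypothetical.

```java
import java.io.StringReader;
import java.util.Properties;

// Sketch: test configuration loaded from an externalized properties file.
class TestConfig {
    static Properties load(String text) {
        Properties props = new Properties();
        try {
            // Normally this would be a FileReader on config.properties;
            // a StringReader keeps the sketch self-contained.
            props.load(new StringReader(text));
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
        return props;
    }
}
```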
  22. Test Properties
     • Are thread specific
     • Have global scope and can be accessed by any class method
  23. Test Properties example
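One way to get both behaviors the slide asks for (global scope, yet thread specific) is a static accessor backed by a ThreadLocal map, so any class method can read or write properties while each parallel test thread sees only its own copy. This design is an assumption on my part, since the example image is missing.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: globally accessible, thread-specific test properties.
class TestProperties {
    private static final ThreadLocal<Map<String, String>> PROPS =
        ThreadLocal.withInitial(HashMap::new);

    // Static methods give global scope; the ThreadLocal keeps the
    // underlying map private to the current test thread.
    static void set(String key, String value) { PROPS.get().put(key, value); }
    static String get(String key)             { return PROPS.get().get(key); }
}
```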
  24. Test Data Repository
     • Abstracts away changes made only to test data
     • Data can be stored in databases and files such as XML, JSON, XLS, TXT, CSV, etc.
     • Data can be retrieved from data sources such as databases, web services, APIs, EDI, etc.
     • Metadata can be used to describe additional information about the test data, such as versions, application names, required fields, invalid formats, etc.
     • Share data across multiple screens, applications, and tools
  25. Test Data example
  26. Parameterized Test example (JUnit 4 and TestNG)
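The JUnit 4/TestNG example images are not in this transcript. The plain-Java sketch below mirrors the shape of a JUnit 4 @Parameters provider: a data() method supplies rows, and the same test body runs once per row. With JUnit 4 you would annotate the class with @RunWith(Parameterized.class) instead of looping by hand; the data values and the add method are stand-ins.

```java
// Sketch of the parameterized-test pattern without the JUnit 4 runner.
class ParameterizedSketch {
    // Plays the role of the @Parameters data provider.
    static Object[][] data() {
        return new Object[][] {
            {2, 3, 5},
            {10, -4, 6},
        };
    }

    static int add(int a, int b) { return a + b; }  // stand-in for the code under test

    public static void main(String[] args) {
        // The runner would do this loop for us, one test instance per row.
        for (Object[] row : data()) {
            int result = add((Integer) row[0], (Integer) row[1]);
            if (result != (Integer) row[2]) throw new AssertionError("row failed");
        }
        System.out.println("all rows passed");
    }
}
```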
  27. Test Parameters
     • Use a hash map of key-value pairs
       – Very useful for both functional and load testing
     • Hash maps eliminate the need for
       – Creating fixed method parameters that change frequently
       – Changing data access objects that change frequently
     • Hash maps can be easily populated from any data source
     • Retrieve the test data and process the format for input to various test tools
     • Handle version differences between test data
  28. Test Parameters example
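A sketch of the hash-map parameter idea: one Map<String, String> carries whatever fields a test needs, so method signatures stay stable when fields are added or removed. The key names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: key-value test parameters instead of fixed method signatures.
class TestParams {
    // Convenience builder: fromPairs("k1", "v1", "k2", "v2") -> map.
    static Map<String, String> fromPairs(String... pairs) {
        Map<String, String> params = new HashMap<>();
        for (int i = 0; i + 1 < pairs.length; i += 2) {
            params.put(pairs[i], pairs[i + 1]);
        }
        return params;
    }

    // A test method takes one map; adding a field later does not change it.
    static String describe(Map<String, String> params) {
        return params.getOrDefault("username", "<none>");
    }
}
```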
  29. User Interface (UI) Mapping
     • Abstracts away changes made only to UI object locators
     • UI object identifiers can be stored in databases and files such as XML, JSON, XLS, TXT, CSV, etc.
     • Metadata can be used to describe additional information about the UI objects, such as versions, application names, form fields, etc.
     • Share UI identifiers across multiple screens, applications, and tools
  30. UI Map example
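Since the UI map example image is missing, here is a minimal sketch of the idea: logical control names map to locator strings in an external file, so a changed locator is edited in one place rather than in every test. The page and locator names are made up for illustration.

```java
import java.io.StringReader;
import java.util.Properties;

// Sketch: external UI map resolving logical names to locators.
class UiMap {
    private final Properties locators = new Properties();

    UiMap(String text) {
        try {
            // Normally loaded from a ui-map.properties file on disk.
            locators.load(new StringReader(text));
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }

    String locator(String logicalName) {
        String value = locators.getProperty(logicalName);
        if (value == null) throw new IllegalArgumentException("unmapped control: " + logicalName);
        return value;
    }
}
```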
  31. Test Controls
     • Retrieve the UI locators and process the format for input to various test tools
     • Handle version differences between UI locators
     • Develop custom controls to mimic UI objects
     • Self-test verification
  32. Test Control Interface example
  33. Test Control Class example
  34. Test Components
     • A class object that represents a piece of a form, page, or screen
     • Groups of common objects that can be found across more than one page
     • Standardized component verification tests
  35. Test Components example
  36. Test Forms
     • A class object that represents a piece of a form
     • Groups of components and controls that mimic the UI form
     • Standardized form verification tests
  37. Test Form Interface example
  38. Test Form Class example
  39. …continued
  40. Test Page or Screen
     • A class object that represents a web page or screen
       – Commonly referred to as a “page object”
     • Groups of forms, components, and controls that mimic the UI page or screen
     • Self-test verification
  41. Test Page Class example
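The page class example image is not in this transcript; the sketch below shows the page-object shape in the spirit of slides 40-41. Element lookup goes through a tiny Driver interface so the page class stays tool-agnostic; with Selenium the same shape would wrap a WebDriver. All class, method, and locator names here are hypothetical.

```java
// Page-object sketch: locators and tool calls hidden behind business methods.
class LoginPage {
    interface Driver {                       // stand-in for the real test tool
        void type(String locator, String text);
        void click(String locator);
    }

    private final Driver driver;
    LoginPage(Driver driver) { this.driver = driver; }

    // Business-language method hides locator details from test scripts.
    LoginPage enterCredentials(String user, String pass) {
        driver.type("id=username", user);
        driver.type("id=password", pass);
        return this;                          // fluent style for readable scripts
    }

    void submit() { driver.click("id=login-button"); }

    // Fake driver that records calls, used to demonstrate without a browser.
    static class RecordingDriver implements Driver {
        final StringBuilder log = new StringBuilder();
        public void type(String locator, String text) { log.append("type ").append(locator).append("; "); }
        public void click(String locator)             { log.append("click ").append(locator).append("; "); }
    }
}
```

A test script then reads as domain language: new LoginPage(driver).enterCredentials(user, pass) followed by submit(), with no locators in sight.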
  42. Instances when NOT to return a new Page Object for each Page
     • Submitting a page can return one of many different pages, or versions of pages, which would be represented by many different classes
       – Page A could return B, C, or D
       – Page A could return B v1.0, B v2.0, C v1.0, C v2.0, D v1.0, D v2.0, etc.
       – This means that you would need to do the following:
         • Create multiple methods with similar names that return different page objects
           – Page A.submitB() returns B
           – Page A.submitC() returns C
  43. …continued
     • Return a page object of an inherited type, and then cast the type of the page in the script
       – Page A.submit() returns page X
       – Pages B and C inherit from page X
       – Script checks the type of X and casts to B or C
       – This can get very ugly in a script…
     • Submitting a page may initiate an AJAX call, which can rewrite the same page
       – Page A could display an AJAX (modal) dialog
       – Page A could display some other AJAX widget
       – Page A could be rewritten to look like page B
  44. …continued
     • If the page flow is not what is being tested, then an unexpected page or AJAX call could fail and kill the test
       – Test case:
         • We are trying to test the functionality of page Z
         • The primary page flow is as follows:
           – A to B, B to C, C to D, and D to Z
         • Alternative page flows could also correctly occur, but are unpredictable based on many unknown variables:
           – A to H, H to L, L to M, and M to Z
           – A to R, R to S, and S to Z
           – A to X, X to Y, and Y to Z
  45. …continued
     – What if page A unexpectedly returned page H, R, or X instead of page B?
     – If page H, R, or X verified itself and asserted a failure, then how would the test ever reach page Z?
  46. JavaScript & HTML DOM
     • There are many instances when we need the test framework to mimic a human reading the content of a page and then making a decision on what action to perform next
       – Verifying the list of items on a search results page
       – Selecting items to purchase from a list
       – Selecting elements to edit, update, or delete
     • Scenarios like these usually occur as a result of some previous action, and display content generated dynamically to the user
     • The test framework may have to query an application database or web service to provide the input to trigger the dynamic page content
  47. …continued
     • The test framework will then need to interrogate the page to determine if the expected elements, attributes, or content are present, and then make a decision on whether to perform an action on some element
     • JavaScript can be used to inspect the HTML DOM to find these elements and their attributes and content
     • JavaScript expressions can be evaluated by the test tool, which then returns a result that the test framework can use to make a decision
     • Simply checking if an element is present or getting a value will not work on a dynamically generated element
  48. JS DOM example (HTML and JavaScript)
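The HTML/JavaScript example image is missing from this transcript. The sketch below shows the decision pattern slides 46-47 describe: the framework hands a JavaScript expression to the tool, gets a result back, and branches on it. The ScriptRunner interface and the expression are illustrative; with Selenium this role is played by JavascriptExecutor.executeScript.

```java
// Sketch: framework-side decision based on a JS expression evaluated by the tool.
class DomCheck {
    interface ScriptRunner {                 // stand-in for the tool's JS executor
        Object run(String jsExpression);
    }

    // True when the dynamically generated element is present in the DOM.
    static boolean elementPresent(ScriptRunner runner, String id) {
        Object result = runner.run(
            "return document.getElementById('" + id + "') !== null;");
        return Boolean.TRUE.equals(result);
    }
}
```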
  49. Test Application Container
     • A class object that represents the application and contains the major page and component objects by version
     • Provides the test services available to the test scripts
     • Is thread specific
  50. Container example (Interface and Class)
  51. Parallel Environment Tracks with Different Build Versions
     • Parallel development tracks competing to deploy to production
     • Run automated build verification tests
     • Run automated functional and regression tests
     • Run automated user acceptance tests
     • Run automated tests for production patches and fixes
     • Run automated sanity tests against production post-deployment
     • Run automated tests across versions for a production rollback strategy
  52. Parallel Environment Tracks with Different Build Versions
  53. Test Scripts
     • Test scripts can be written using standard unit test frameworks
     • Classes and methods are written using domain language
     • Script details and complexity are abstracted to the framework layers
     • Scripts are shorter and easier to read
  54. Test Script Class
  55. Test Reports and Logs
     • Reports can be generated from
       – The unit test framework
       – External reporting tools
       – Custom report utilities
     • Test execution activity can be written to standard logging utilities
     • Test failures, errors, and exceptions can be written to the reports and logs
  56. Q&A
     Thank you for your time!
     Martin Lienhard
     testautomaton1@gmail.com
