Introducing Keyword-Driven Test Automation
In both agile and traditional projects, keyword-driven testing has proven to be a powerful way to attain a high level of automation—when it is done correctly. Many testing organizations use keyword-driven testing but aren’t realizing the full benefits of scalability and maintainability that are essential to keep up with the demands of testing today’s software. Hans Buwalda outlines how you can meet what he calls the “5 percent challenges”—automate 95 percent of your tests with no more than 5 percent of your total testing effort—using his proven, keyword-driven test method. Hans also discusses how the keyword approach relates to other automation techniques like scripting and data-driven testing. Use the information and real-world application Hans presents to attain a very high level of automation with the lowest possible effort.

Document Transcript

  • 1. TO PM Tutorial, 10/1/2013, 1:00:00 PM — "Introducing Keyword-Driven Test Automation." Presented by: Hans Buwalda, LogiGear Corporation. Brought to you by SQE: 340 Corporate Way, Suite 300, Orange Park, FL 32073 ∙ 888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
  • 2. Hans Buwalda, LogiGear — An internationally recognized expert in testing, Hans Buwalda is a pioneer of keyword-driven test automation, an approach now widely adopted throughout the testing industry. Originally from the Netherlands, Hans is the CTO of LogiGear, directing the development of the successful Action Based Testing™ methodology for keyword-driven test automation and its supporting TestArchitect™ toolset. Prior to joining LogiGear, Hans served as project director at CMG (now CGI).
  • 3. 8/20/2013 — STAREAST 2013, Tutorial TO, Orlando, Tuesday April 30: "Introducing Keyword-Driven Test Automation," Hans Buwalda, LogiGear. © 2013 LogiGear Corporation. All Rights Reserved.
    Introduction
    - industries
    - roles in testing
  • 4. About LogiGear (www.logigear.com, www.testarchitect.com)
    - Software testing company, around since 1994
    - Testing and test automation expertise, services, and tooling: consultancy, training; test development and automation services; "test integrated" development services
    - Aims to be a thought leader, in particular for large and complex test projects
    - Products: TestArchitect™ and TestArchitect for Visual Studio™, integrating test development with test management and automation, based on modularized keyword-driven testing
    About Hans (www.happytester.com, hans @ logigear.com)
    - Dutch guy, living and working in California since 2001, as CTO of LogiGear
    - Background in math, computer science, and management
    - Original career in management consultancy; since 1994 focusing on testing and test automation: keywords, agile testing, big testing, . . .
  • 5. Topics for this tutorial
    - Introduction to keyword-driven testing, including "Action Based Testing", my own flavor of it...
    - Comparison to other techniques for automation
    - Recommendations for a successful application of keyword-driven testing: test design, automation, organization
    - Some ideas for specific situations: data-driven testing; non-UI testing; multimedia; protocols; initial data
    - Not everything will be equally interesting, or accessible, to everybody
    Testing Under Pressure — DEADLINE: specification → development → test
  • 6. Testing Under Pressure — DEADLINE: specification → development → test. Develop tests in time:
    - Test design
    - Auditing, acceptance
    - Preparations
    - Automation
    The 5% Rules of Test Automation
    - No more than 5% of all test cases should be executed manually
    - No more than 5% of all efforts around testing should involve automating the tests
  • 7. Why a High Automation Degree?
    - The best way to prepare for efficiency in the crunch zone (good manual test cases can help too, but marginally)
    - Buy time to do more "exploratory testing" and better test development
    - Credible pay-off for the cost of introducing automation (initial costs are tooling, the learning curve, and adaptation of existing tests)
    - Automation is better positioned to identify "bonus bugs": on average 15% of fixes cause new bugs, and many of these are hard to find without integral testing — often a result of violating overall architectures, or of data being left in an inconsistent state
    - Automated tests have a better chance of being kept up to date if they form the majority of the testware
    - Automation can be re-run, for example as part of the continuous integration process — either specific, based on code changes, or integral, to also catch bonus bugs
    Why < 5% Automation Effort?
    - Automation should not dominate testing: it is not a goal in itself and may never be a bottleneck; automation should deliver, not dominate...
    - Testers should be able to focus on testing: better tests (higher ambition level) and communication with stakeholders
    - High automation efforts can aggravate the "crunch zone" instead of relieving it — an "invitation to Murphy's law"
  • 8. Record and Playback

        select window "Logon"
        enter text "username", "administrator"
        enter text "password", "testonly"
        push button "Ok"
        select window "Main"
        push button "New Customer"
        expect window "Customer Information"
        select field "First Name"
        type "Paul"
        select field "Last Name"
        type "Jones"
        select field "Address"
        type "54321 Space Drive"
        . . .

    Scripting: Test Case Design (TEST DESIGNER) → Test Case Automation (AUTOMATION ENGINEER) → Test Case Execution (MR. PLAYBACK)
  • 9. Example scripting — state of the art, but stuff for coders . . .

        /// <summary>
        /// AddItems - Use 'AddItemsParams' to pass parameters into this method.
        /// </summary>
        public void AddItems()
        {
            #region Variable Declarations
            WinControl uICalculatorDialog = this.UICalculatorWindow.UICalculatorDialog;
            WinEdit uIItemEdit = this.UICalculatorWindow.UIItemWindow.UIItemEdit;
            #endregion

            // Type '{NumPad7}' in 'Calculator' Dialog
            Keyboard.SendKeys(uICalculatorDialog, this.AddItemsParams.UICalculatorDialogSendKeys, ModifierKeys.None);

            // Type '{Add}{NumPad2}{Enter}' in 'Unknown Name' text box
            Keyboard.SendKeys(uIItemEdit, this.AddItemsParams.UIItemEditSendKeys, ModifierKeys.None);
        }

    Keywords, Action Words
    - Common in automation tools nowadays (but with different styles)
    - Identify tasks for both test development and automation
    - The test developer creates tests using actions; each action consists of a keyword and arguments
    - The automation task focuses on automating the actions; each action is automated only once
    Fragment from a test with actions — each line has a keyword and arguments; tests read from top to bottom, and "34" is the expected value here:

        new product       name: Sledge Hammer   number: P-9009
        add quantity      number: P-9009        quantity: 5
        add quantity      number: P-9009        quantity: 20
        add quantity      number: P-9009        quantity: 3
        add quantity      number: P-9009        quantity: 6
        check quantity    number: P-9009        quantity: 34
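    To make the idea concrete, here is a minimal sketch (not LogiGear's actual tooling) of how a keyword fragment like the one above could drive a system under test. The "system" here is just an in-memory inventory, and each test line becomes a call of a keyword function with named arguments:

    ```python
    # Minimal sketch: keyword lines driving a toy "system under test".
    # The inventory dict stands in for the real application.

    inventory = {}

    def new_product(name, number):
        inventory[number] = {"name": name, "quantity": 0}

    def add_quantity(number, quantity):
        inventory[number]["quantity"] += quantity

    def check_quantity(number, expected):
        actual = inventory[number]["quantity"]
        return ("pass" if actual == expected else "fail", actual)

    # The test fragment from the slide, line by line:
    new_product(name="Sledge Hammer", number="P-9009")
    add_quantity(number="P-9009", quantity=5)
    add_quantity(number="P-9009", quantity=20)
    add_quantity(number="P-9009", quantity=3)
    add_quantity(number="P-9009", quantity=6)
    result, actual = check_quantity(number="P-9009", expected=34)   # 5+20+3+6 = 34
    ```

    Note that the test itself contains no navigation or tool details; only the keyword functions know how to talk to the system under test.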
  • 10. Potential benefits of keywords
    - More tests, better tests: more breadth, more depth
    - Fast; results can be quickly available — the design directly drives the automation
    - Separates the tests from the technical scripting language: easier to involve business subject matter experts; the action format allows for easy readability
    - Less effort for automation: "script free" in most cases
    - Automation more stable and maintainable: limited and manageable impact of changes in the system under test
    - Develop tests earlier in the life cycle; deal with execution details later
    - ...
    Risks of keyword approaches
    - Often seen as a silver bullet, and complications are underestimated: often treated as a technical "trick"; testers can get squeezed and marginalized (developers and users dictating tests, automation engineers dictating actions), or testers get the automation responsibility, thus becoming pseudo-programmers
    - The method needs understanding and experience to be successful; pitfalls are many, and can have a negative effect on the outcome
    - Lack of method and structure can risk manageability: maintainability not as good as hoped; results can be disappointing, and the approach will be blamed
  • 11. Combining Approaches . . .
    - Use keywords for the automation-ready description of test cases
    - Use scripting to set up structured automation for the actions
    - Use record and playback to record keywords
    Welcome to . . . "Complete Test Make-Over": Comparing Formats
    Classic format — most values are implicit (the tester has to figure them out during execution), and execution instructions are repeated in multiple test cases:
    - Enter a user id that is greater than 10 characters, enter proper information for all other fields, and click on the "Continue" button. There should be an error message stating that "User Id must be less than 10 characters".
    - Enter a User Id with special character(s), enter proper information for all other fields, and click on the "Continue" button. An error message should be displayed indicating that "User Id cannot contain some special characters".
    - Enter the information, with a password of 4 characters, and click on the "Continue" button. Check for an error message saying: "Password must contain at least 5 characters".
    Keyword format:

        check registration dialog   user id: aaaaabbbbbc   message: User Id must be less than 10 characters
        check registration dialog   user id: résoudre      message: User Id cannot contain some special characters
        check registration dialog   password: test         message: Password must contain at least 5 characters
  • 12. Keywords are not just test automation
    - Can also be used for other than testing: data entry chores; training purposes
    - Can also be used for manual testing, for example with a manual testing dialog; it can even show instructions, with placeholders for values:

        Action definition:  login <user> <password> — Enter "<user>" in the user name field, and "<password>" in the password field.
        Test line:          login   user name: hansb   password: starwest
        What the manual tester would see: Enter "hansb" in the user name field, and "starwest" in the password field.

    Keywords need a method
    - By themselves keywords don't provide much scalability; they can even backfire and make automation more cumbersome. A method can help tell you which keywords to use when, and how to organize the process
    - Today we'll look at Action Based Testing (ABT): it addresses test management, test development, and automation, with a large focus on test design as the main driver for automation success
    - Central deliverables in ABT are the "test modules": developed in spreadsheets; each test module contains "test objectives" and "test cases"; each test module is a separate (mini) project and can involve different stakeholders
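    The manual-testing dialog above boils down to template substitution: the action's instruction text contains placeholders such as <user>, which are filled in from the test line before being shown to the tester. A hedged sketch (the function is illustrative, not a tool API):

    ```python
    # Render a manual-testing instruction by replacing <name> placeholders
    # with the argument values from the current test line.

    def render_instruction(template, arguments):
        text = template
        for name, value in arguments.items():
            text = text.replace("<" + name + ">", value)
        return text

    # The slide's login example:
    template = 'Enter "<user>" in the user name field, and "<password>" in the password field.'
    shown = render_instruction(template, {"user": "hansb", "password": "starwest"})
    ```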
  • 13. Don't just automate manual testing
  • 14. Don't just automate manual testing — Good automated testing is not the same as automating good manual testing . . .
    Action Based Testing (ABT)
    - Uses the keyword format as the basis for a method: covers test management, test development, and automation, with a large focus on test design as the main driver for automation success; the method is specific, but the concepts are generic
    - The central product in ABT is the "test module", not the test case: like chapters in a book; test cases are part of the test modules, and are typically the result (rather than the input) of test development; test development is seen as having both analytical and creative aspects
    - Developed as spreadsheets, external from the automation, with a well-defined flow: easier to manage — each test module is a separate (mini) project, and each test module can involve different stakeholders
  • 15. Action Based Testing — Test Development: Plan → break down into Test Module 1, Test Module 2, ..., Test Module N, each with Test Objectives and Test Cases → Automate Actions (ACTION AUTOMATION).
    Example of a business-level test module
    - Consists of (1) an initial part, (2) test cases, and (3) a final part
    - Focus is on business functionality, with a clear business scope
    - Navigation details are avoided

        TEST MODULE    Car Rental Payments
        INITIAL        start system       user: john
        TEST CASE      TC 01 Rent some cars
        rent car       first name: John   last name: Doe   car: Ford Escape
        rent car       first name: John   last name: Doe   car: Chevvy Volt
        check payment  last name: Doe     amount: 140.4
        FINAL          close application
  • 16. Example of an interaction-level test module
    - Layout is the same, with an initial part, test cases, and a final part
    - Interaction details that are the target of the test are not hidden
    - Focus is not on business ("is the payment amount correct") but on interaction ("do I see the payment amount")

        TEST MODULE    Screen Flow
        INITIAL        start system       user: john
        TEST CASE      TC 01 Order button
        click                window: main        button: create order
        check window exists  window: new order
        FINAL          close application

    Variables and expressions with keywords

        TEST CASE      TC 02 Rent some more cars
        get quantity     car: Chevvy Volt   available: >> volts
        rent car         first name: John   last name: Doe   car: Chevvy Volt
        rent car         first name: John   last name: Doe   car: Chevvy Volt
        check quantity   car: Chevvy Volt   expected: # volts - 2

    - This test does not need an absolute number for the available cars; it just wants to see if the stock is updated
    - As a convention we denote an assignment with ">>"
    - The "#" indicates an expression
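    The ">>" and "#" conventions above can be sketched in a few lines: a cell starting with ">>" stores a value in a named variable, and a cell starting with "#" is evaluated as an expression over those variables. This is an illustration of the convention only, not TestArchitect's actual expression engine:

    ```python
    # Sketch of ">>" assignment cells and "#" expression cells.

    variables = {}

    def store(cell, value):
        """Handle an assignment cell such as '>> volts'."""
        assert cell.startswith(">>")
        variables[cell[2:].strip()] = value

    def evaluate(cell):
        """Handle an expression cell such as '# volts - 2'; other cells are literals."""
        if cell.startswith("#"):
            # eval is fine for a sketch; a real tool would use a safe parser
            return eval(cell[1:], {}, variables)
        return cell

    # The TC 02 flow: read the stock, rent two cars, check the new stock.
    quantity_on_hand = 7                 # whatever the system reports
    store(">> volts", quantity_on_hand)
    expected = evaluate("# volts - 2")   # what "check quantity" should see
    ```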
  • 17. Data-driven testing with keywords

        TEST CASE      TC 03 Check stocks
        use data set   /cars
        get quantity     car: # car     available: >> quantity      ┐
        rent car         first name: # first   last name: # last   car: # car   │ repeat for data set
        check quantity   car: # car     expected: # quantity - 1    ┘

        DATA SET       cars
        car              first    last       value
        Chevvy Volt      John     Doe        40000
        Ford Escape      Mary     Kane       22500
        Chrysler 300     Jane     Collins    29000
        Buick Verano     Tom      Anderson   23000
        BMW 750          Henry    Smyth      87000
        Toyota Corolla   Vivian   Major      16000

    - The test lines will be repeated for each row in the data set
    - The values represented by "car", "first", and "last" come from the selected row of the data set
    Automating keyword tests — Keywords are useful, but technically not complex. It is not hard to make a simple keyword interpreter, and many test tools have keyword options in some form or another.

        Interpreter
            While not end of test
                Read next line
                Split the line into arguments
                Look up the keyword in the "action list"
                Execute the function belonging to the keyword
                Report the results of this line
            End
            Report
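    The interpreter pseudocode above can be sketched as runnable Python. The action list and test lines here are made up for illustration; a real tool reads the lines from spreadsheets and reports far more detail:

    ```python
    # Minimal keyword interpreter: for each line, look up the keyword in
    # the action list and execute the matching function, collecting results.

    def interpret(test_lines, action_list):
        report = []
        for line in test_lines:
            keyword, *arguments = line
            action = action_list.get(keyword)
            if action is None:
                report.append((keyword, "error: unknown keyword"))
                continue
            try:
                action(*arguments)
                report.append((keyword, "passed"))
            except AssertionError:
                report.append((keyword, "failed"))
        return report

    # A made-up system under test: a running total.
    total = 0
    def add(amount):
        global total
        total += int(amount)
    def check_total(expected):
        assert total == int(expected)

    actions = {"add": add, "check total": check_total}
    result = interpret([("add", "1"), ("add", "1"), ("check total", "2")], actions)
    ```

    The key point of the slide survives even in this toy: the test lines are plain data, and all tool knowledge lives in the action functions.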
  • 18. Automation: example test lines (the system under test is a calculator)

        click key       key: one
        click key       key: plus
        click key       key: one
        click key       key: equals
        check display   expected: 2.

    Create an "action function" for each action — code module "mod_Calculator":

        # map an action to its function
        def divertCalculator(action):
            if action == "click key":
                action_ClickKey()
            elif action == "check display":
                action_CheckDisplay()
            . . .
            else:
                Error("Don't know action: " + action)

        # action "click key": click a key on the calculator
        def action_ClickKey():
            # get the value for argument "key" from the test line,
            # identify the UI element, and perform the operation
            key = OpenElement("calculator", Argument("key"))
            key.click()

        # action "check display": verify the value of the display
        def action_CheckDisplay():
            # identify the UI element and perform the check
            display = OpenElement("calculator", "display")
            display.Check(Argument("expected"))
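    The if/elif chain in divertCalculator can also be written as a dispatch dictionary, which keeps adding a new action down to one table entry. A sketch with stub action functions standing in for the slide's calculator-driving code:

    ```python
    # Dispatch-table variant of the slide's divertCalculator function.

    def action_click_key(args):
        return "clicked " + args["key"]       # stub for the real UI click

    def action_check_display(args):
        return "checked " + args["expected"]  # stub for the real check

    ACTIONS = {
        "click key": action_click_key,
        "check display": action_check_display,
    }

    def divert(keyword, args):
        """Look the keyword up in the table and run its action function."""
        action = ACTIONS.get(keyword)
        if action is None:
            raise ValueError("Don't know action: " + keyword)
        return action(args)
    ```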
  • 19. Example: script for an action "check sort order" — the following action script verifies whether the rows in a table are sorted:

        def action_checkSortOrder():
            # get arguments from the test line
            windowName = LIBRARY.NamedArgument("window")
            tableName = LIBRARY.NamedArgument("table")
            columnName = LIBRARY.NamedArgument("column")

            # find the table in the UI; get the column index and row count
            table = ABT.OpenElement(windowName, tableName)
            column = table.GetColumnIndex(columnName)
            rowCount = table.GetRowCount()

            # check the sort order, row by row;
            # if a value is smaller than the one before, fail the test
            previous = table.GetCellText(0, column)
            for i in range(1, rowCount):
                current = table.GetCellText(i, column)
                if current < previous:
                    LIBRARY.AdministerCheck("order", "sorted", "fails " + str(i+1), 0)
                    return
                previous = current

            # if all rows are ascending, pass the test
            LIBRARY.AdministerCheck("order", "sorted", "all rows in order", 1)

    Example application
    - For try-outs and demonstrations only
    - Application is made in WPF
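    The core of the action above can be tried without a UI by restating it against a plain list of cell texts. The report callback below is a stand-in for LIBRARY.AdministerCheck; everything else follows the slide's logic (at least one row is assumed):

    ```python
    # Standalone version of the sort-order check: pass if every value is
    # >= the one before it, fail at the first drop.

    def check_sort_order(cell_texts, report):
        previous = cell_texts[0]
        for i, current in enumerate(cell_texts[1:], start=2):
            if current < previous:
                report("order", "sorted", "fails at row " + str(i), 0)
                return False
            previous = current
        report("order", "sorted", "all rows in order", 1)
        return True

    results = []
    check_sort_order(["Adams", "Baker", "Clark"], lambda *a: results.append(a))
    ```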
  • 20. A Test Module for the application
    - We click a tree node, and then do a check
    - The actions here are built into the framework
    Making a new "action"
    - This action definition uses existing actions to create a new action called "check bitrate": the first two arguments form a node path, and the expected value is given by the third argument
    - Argument names can be used in cell expressions, which start with "#" and support the usual string and numeric operators
  • 21. Using the action in a test
    - These test lines don't care about the navigation in the UI of the application; the focus is functional: verify data
    - Such functional tests are easier to read with high-level actions, and the reduced dependency on navigation makes them (much) easier to maintain in the long term
    Do the count down . . . In a good application of the keyword approach, a large increase in test cases (like doubling the amount) should result in a modest increase in actions, and a minor increase, if any, in programmed action functions:

        2000 tests → 200 actions → 20 functions
        4000 tests → 250 actions → 22 functions
  • 22. Identifying controls
    - Identify windows and controls, and assign names to them
    - These names encapsulate the properties that the tool can use to identify the windows and controls when executing the tests
    Mapping the interface

        INTERFACE ENTITY  library
        interface entity setting   title: {.*Music Library}

        ta name        ta class   label
        title          text       Title:
        artist         text       Artist:
        file size      text       File size (Kb):

        ta name        ta class   position
        playing time   text       textbox 4
        file type      text       textbox 5
        bitrate        text       textbox 6
        music          treeview   treeview 1

    - An interface mapping (common in test tools) maps windows and controls to names
    - When the interface of an application changes, you only have to update this in one place
    - The interface mapping is a key step in your automation success; allocate time to design it well
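    In its simplest form, such a mapping is a table from logical names to identifying properties; tests and action code use only the logical names, so a UI change means editing this one table. A sketch using the slide's music-library names (the lookup function is illustrative, not a tool API):

    ```python
    # Interface map: logical names on the left, identifying properties on
    # the right. Only this table knows how controls are actually found.

    INTERFACE_MAP = {
        "library": {
            "title":     {"class": "text", "label": "Title:"},
            "artist":    {"class": "text", "label": "Artist:"},
            "file size": {"class": "text", "label": "File size (Kb):"},
            "music":     {"class": "treeview", "position": 1},
        },
    }

    def locator(window, element):
        """Return the identifying properties for a logical element name."""
        return INTERFACE_MAP[window][element]
    ```

    An action function would then call something like locator("library", "artist") instead of hard-coding a label or position.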
  • 23. 8/20/2013 Some Tips to Get Stable Automation  Make the system under test automation-friendly  Use "active" timing  Test your automation  Use automation to identify differences between versions of the system under test (in particular the interfaces)  Keep an eye on the test design © 2013 LogiGear Corporation. All Rights Reserved Automation-friendly design: hidden properties  Look for properties a human user can't see, but a test tool can  This approach can lead to speedier and more stable automation     interface mapping is often bottleneck, and source of maintenance problems with predefined identifying property values in interface map can be created without "spy" tools not sensitive to changes in the system under test not sensitive to languages and localizations  Examples:    "id" attribute for HTML elements "name" field for Java controls "AccessibleName" property in .Net controls (see below) © 2013 LogiGear Corporation. All Rights Reserved 21
  • 24. Mapping the interface (with hidden properties)

        INTERFACE ENTITY  library
        interface entity setting   automation id: MusicLibraryWindow

        ta name        ta class   automation id
        title          text       TitleTextBox
        artist         text       SongArtistTextBox
        file size      text       SizeTextBox
        playing time   text       TimeTextBox
        file type      text       TypeTextBox
        bitrate        text       BitrateTextBox
        music          treeview   MusicTreeView

    - Instead of positions or language-dependent labels, an internal property, "automation id", has been used
    - The interface definition will be less dependent on modifications in the UI of the application under test
    - If the information can be agreed upon with the developers, for example in an agile team, it can be entered (or pasted) manually and early on
    Active Timing
    - Passive timing: wait a set amount of time. In large-scale testing, try to avoid passive timing altogether: if the wait is too short, the test will be interrupted; if it is too long, time is wasted
    - Active timing: wait for a measurable event, usually up to a (generous) maximum time; a common example is waiting for a window or control to appear (usually the test tool will do this for you)
    - Even if it is not obvious, find something to wait for... Involve developers if needed; this is relatively easy in an agile team, but give it priority in traditional projects too
    - If using a waiting loop: make sure to use a "sleep" function in each cycle that frees up the processor (giving the AUT time to respond), and wait until an end time rather than for a set number of cycles
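    The waiting-loop advice above (sleep each cycle, stop at an end time rather than after a fixed cycle count) can be captured in a small helper. The condition shown is a stand-in for "window has appeared"; this is a sketch, not a tool API:

    ```python
    # Active wait: poll a condition up to a deadline, sleeping each cycle.

    import time

    def wait_for(condition, timeout_seconds=30.0, poll_interval=0.1):
        """Return True as soon as condition() holds, False after the deadline."""
        deadline = time.monotonic() + timeout_seconds
        while time.monotonic() < deadline:
            if condition():
                return True
            time.sleep(poll_interval)   # free up the processor each cycle
        return condition()              # one last look at the deadline
    ```

    Usage would look like wait_for(lambda: window_exists("search")), where window_exists is whatever probe the test tool provides.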
  • 25. Things to wait for...
    - Wait for the last control or element to load; developers can help you know which one that is
    - Non-UI criteria: an API function; the existence of a file
    - Criteria added in development specifically for this purpose, like "disabling" big, slow controls (like lists or trees) until they're done loading, or API functions or UI window/control properties
    - Use a "delta" approach: every wait cycle, test if there was a change; if there was no change, assume that the loading time is over. Examples of changes: the controls on a window; the count of items in a list; the size of a file (like a log file)
    Test Design
    - Effective test breakdown (into test modules): make sure every test module has a clear focus; keep different kinds and levels of tests separate
    - Right level of actions: as "high level" as possible, hiding as many details as possible — but not if the details are relevant for the test
    It is my belief that successful automation is not a technical challenge. It is most of all a test design challenge.
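    The "delta" approach above can be sketched as a loop that samples an observation each cycle (control count, list size, log-file size, ...) and assumes loading is done once two consecutive samples are equal. The probe function is supplied by the caller; nothing here is a real tool API:

    ```python
    # Delta wait: consider the AUT "settled" when an observed value stops changing.

    import time

    def wait_until_stable(probe, timeout_seconds=30.0, poll_interval=0.1):
        """Return the first value that repeats between cycles, or None on timeout."""
        deadline = time.monotonic() + timeout_seconds
        previous = probe()
        while time.monotonic() < deadline:
            time.sleep(poll_interval)
            current = probe()
            if current == previous:   # no change: assume loading has finished
                return current
            previous = current
        return None
    ```

    A probe might be lambda: len(list_items("search results")) or lambda: os.path.getsize(log_path), depending on what is observable for the application under test.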
  • 26. The Three "Holy Grails" of Test Design
    - A metaphor to depict the three main steps in test design
    - "Grail" illustrates that there is no one perfect solution, but that it matters to pay attention (to search)
    - About the quality of tests, but most of all about scalability and maintainability in BIG projects
    1. Organization of tests into test modules
    2. The right approach for each test module
    3. The proper level of detail in the test specification
    What's the trick...
  • 27. What's the trick...
    - Have or acquire facilities to store and organize your content
    - Edit your stuff
    - Decide where to put what: assign and label the shelves
    - Put it there
    - If the organization is no longer sufficient, add to it or change it
    Breakdown Criteria
    - Straightforward criteria: business tests versus interaction tests; architecture of the system under test (client, server, protocol, subsystems, components, modules, ...); functionality (customers, finances, management information, ...); kind of test (navigation flow, negative tests, response time, ...); ambition level (smoke test, regression, aggressive, ...)
    - Additional criteria: stakeholders (like "Accounting", "Compliance", "HR", ...); complexity of the test (put complex tests in separate modules); technical aspects of execution (special hardware, multi-station, ...); overall project planning (availability of information, timelines, sprints, ...); risks involved (extra test modules for high-risk areas)
  • 28. Properties of a good breakdown
    - Reflects the level of tests
    - Well differentiated and clear in scope
    - Balanced in size and amount
    - Modules mutually independent
    - Fitting the priorities and planning of the project
    Example breakdown
    - Tests of the user interface: does function key F4 work; does listbox xyz show the right values; is the tab order correct
    - Form tests — do all the forms (dialogs, screens, pages) work: can data be entered, and is it stored well; is displayed data correct; split these from everything else
    - Function tests — do individual functions work: can I count the orders; can I cancel a transaction
    - Alternate paths in use cases
    - End-to-end tests: do all components of a system work well together in implementing the business processes — like enter a sales order, then check inventory and accounting
    - Tests with specific automation needs, like multi-station tests
    - Tests of non-UI functions
    - High-ambition tests (aggressive tests): can I break the system under test
  • 29. Example of an application under test
    - Various item types: tests; actions; interface definitions; data sets; folders; ...
    - Various operations: open; cut, copy, paste; check out; ...
    - Various ways to initiate an operation: context menu, with or without accelerator key; main menu, with or without accelerator key; toolbar; shortcut key; function key; drag and drop; double click; ...
    Defining some modules
    - Test modules for operations: primary and alternate paths; various values for fields like "comment" in check-in; paste in other projects; copy and paste various groups; not necessarily on each item type
    - Test modules for items: address all item types at least once; on each item type perform each operation, but not necessarily each variant of each operation
    - UI handling: try for each UI command whether it starts the intended operation; not necessarily on each item type or operation variant
    - Concurrency: try concurrency-sensitive operations with multiple stations in varying concurrency scenarios, with and without local "refreshes"; not necessarily each item type or operation variant; certainly not each UI command included
  • 30. What I have seen not work
    - "Over-checking": having checks (for example based on navigation) that do not fit the scope of the test
    - Forcing data-driven: making all tests data driven (variables, data files) without a clear reason
    - Combinatorial explosions: test all ... for all ... in all ...
    - All actions high level (or all actions low level)
    - Splitting the process into "test designers" and "test implementers": it's OK for one party to define tests for another party, but let them focus on "what" and not "how"
    - A one-directional focus on the forms in an application: think about data, transactions, states, ...
    "Thou Shalt Not Debug Tests..."
    - Large and complex test projects can be hard to "get to run"
    - If they are, however, start by taking a good look again at your test design...
    - Rule of thumb: don't debug tests. If tests don't run smoothly, make sure: lower-level tests have been executed successfully first (so the UI flow in the AUT is stable); actions and interface definitions have been tested sufficiently with their own test modules (so the automation can be trusted); and your test modules are not too long and complex
  • 31. What about existing tests?
    - Compare it to moving house: some effort can't be avoided; decide where to put what, then put it there; consider a moving company to help
    - Adopt the module model: define the modules and their scope; worry about the existing test cases later
    - Moving considerations: be selective — moving is a chance, and it is unlikely you will get that opportunity again; for the important modules, design as normal but harvest from the existing set; avoid porting over test cases "step by step", and in particular avoid over-checking
    Grail 2: Approach per Test Module
    - Organize the test design process around the test modules
    - Plan the test module — when to develop: is enough specification available; when to execute: make sure the functionality at action level is well tested and working already
    - Process: analysis of requirements; formulation of "test objectives"; creation of "test cases"
    - Identify stakeholders and their involvement: users, subject matter experts; developers; auditors
    - Choose testing techniques if applicable: boundary analysis, decision tables, transition diagrams, soap opera testing, ...
  • 32. Eye on the ball: Scope
    - Always know the scope of the test module; the scope should be unambiguous
    - The scope determines many things: what the test objectives are; which test cases to expect; what level of actions to use; what the checks are about; and which events should generate a warning or an error (if a "lower" functionality is wrong)
    State your Objectives . . .

        test objective  TO-3.51   The exit date must be after the entry date.

        enter employment      name: Bill Goodfellow   entry date: 2002-10-02   exit date: 2002-10-01
        check error message   The exit date must be after the entry date.
  • 33. Grail 3: Specification Level, Choosing Actions
 The scope of the test determines the specification level
 As high level as appropriate, with as few arguments as possible
 Use default values for non-relevant arguments
 Clear names (verb + noun usually works well)
 to standardize action names, standardize both the verbs and the nouns, so "check customer" versus "verify client" (or vice versa)
 tests are not C++ code: avoid "technical habits" like mixed case and (worse) underscores
 Manage the actions
 Document the actions
 A by-product of the test design

Example of using actions
In this real-world example the first "sequence number" for teller transactions for a given day is retrieved, using a search function
• the "#" marks an expression, in this case a variable
• the ">>" assigns to a variable for use later on in the test

key                         F7
key                         3
navigate                    page tab: locate
key                         page tab
navigate                    Scan Criteria
wait for controls loaded    window: search
check breadcrumb            text: general functions > search
enter value                 window: search    control: scan direction    value: Backward
select value                window: search    control: business date match    value: # bus date source
click                       window: search    control: go
wait for controls loaded    window: search results
get                         window: search results    control: sequence number    variable: >> seq num
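A minimal interpreter for action lines like these, including the "#" expressions and ">>" assignments, could look like the following sketch. The list-based line format and the demo actions are assumptions for illustration, not how a real tool such as TestArchitect implements this:

```python
def run_test_lines(lines, actions):
    # Minimal keyword interpreter sketch. Each line is a list:
    # [action name, argument, ...]. A string argument starting with "#"
    # is replaced by a stored variable; a trailing ">>", "name" pair
    # stores the action's result under that variable name.
    variables = {}
    for line in lines:
        action, *args = line
        target = None
        if len(args) >= 2 and args[-2] == ">>":
            target, args = args[-1], args[:-2]
        resolved = [
            variables[a[1:].strip()] if isinstance(a, str) and a.startswith("#") else a
            for a in args
        ]
        result = actions[action](*resolved)
        if target is not None:
            variables[target] = result
    return variables

# Illustrative actions; a real action library would drive the UI or an API.
demo_actions = {
    "get sequence number": lambda business_date: 17,
    "check sequence number": lambda expected, actual: expected == actual,
}
state = run_test_lines(
    [
        ["get sequence number", "2013-08-20", ">>", "seq num"],
        ["check sequence number", 17, "# seq num"],
    ],
    demo_actions,
)
```

The point of the sketch is the separation: the test lines stay declarative, while the action implementations carry the "how".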
  • 34. Example of using actions
The same retrieval as on the previous slide, expressed as a single high-level action (the ">>" assigns the result to the variable seq num):

get sequence number    variable: >> seq num

Environments, configurations
 Many factors can influence the details of the automation
 language, localization
 hardware
 version of the system under test
 system components, like the OS or browser
 The test design can reflect these
 certain test modules are more general
 others are specific, for example for a language
 But for tests that do not care about the differences, the automation just needs to "deal" with them
 shield them from the tests
  • 35. "Variations"
Capture variations of the system under test in the actions and interface definitions, rather than in the tests (unless they are relevant there). This can be a feature in a test playback tool, or something you do with a global variable or setting.

Diagram: the tests call actions and interface definitions; a "master switch" selects which of the available variations of those actions and definitions is used.

Possible set-up of variations
 linked variation
 keyworded variation
Specify, for example, in a dialog when you start an execution.
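As an illustration of the "master switch" idea, action implementations can be registered per variation and resolved at run time. The registry, decorator, and names below are a hypothetical sketch, not a feature of any specific tool:

```python
# Hypothetical action registry keyed by (action name, variation).
ACTIONS = {}
MASTER_SWITCH = {"variation": "default"}

def action(name, variation="default"):
    def register(fn):
        ACTIONS[(name, variation)] = fn
        return fn
    return register

def call_action(name, *args):
    # Prefer the implementation for the current variation; fall back to
    # the default one, so the tests stay shielded from the differences.
    key = (name, MASTER_SWITCH["variation"])
    fn = ACTIONS.get(key, ACTIONS[(name, "default")])
    return fn(*args)

@action("check customer")
def check_customer(name):
    return f"checked {name} via the default UI"

@action("check customer", variation="localized")
def check_customer_localized(name):
    return f"checked {name} via the localized UI"
```

The test lines only ever say "check customer"; flipping the master switch before a run changes which variation executes.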
  • 36. Non-UI Testing
 Examples
 application programming interfaces (APIs)
 embedded software
 protocols
 files, batches
 databases
 command line interfaces (CLIs)
 multimedia
 mobile devices
 The impact is mainly on the automation
 the test design should in most cases be transparent towards the specific interfaces
 Often non-UI automation can speed up functional tests that do not address the UI

Multiple System Access
Diagram: test modules drive an automation scheme that reaches the system (part) under test through protocol access, API access, UI access, or database access, either one interface at a time or several together.
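One way to keep the test design transparent towards the specific interface is to let the actions delegate to an access layer chosen when the run starts. Every class and method name below is illustrative:

```python
class UiAccess:
    # Illustrative UI access layer (would drive windows and controls).
    def create_order(self, item):
        return f"ordered {item} through the UI"

class ApiAccess:
    # Illustrative API access layer (would call the application's API).
    def create_order(self, item):
        return f"ordered {item} through the API"

def make_actions(access):
    # The test module only uses the action name "create order"; which
    # interface is exercised is decided by the access layer passed in.
    return {"create order": access.create_order}
```

Because the API path skips the UI, the same functional test can often run considerably faster against `ApiAccess`.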
  • 37. Device Testing
Diagram: a testing host runs the ABT automation, which uses interface info and an agent on the device to drive the Android software under test.

Multimedia: The "Play List" Approach
 An approach applicable to graphics, videos, sound fragments, etc.
 The test includes "questions":
 what the tester should see or hear
 like "Are the matching areas blue?"
 actions like "check picture"
 The test tool keeps a "play list"
 during the run, items are captured and stored
 after the run, the tester is presented with the items and the matching questions
 the tester acknowledges/falsifies each item
 the system stores the passed items
 if during the next run the items are the same as earlier passed ones, the tester is not asked again
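The play-list bookkeeping described above can be sketched in a few lines; the class and method names are made up for illustration, and items are fingerprinted by hash so "the same as earlier passed ones" is a simple set lookup:

```python
import hashlib

class PlayList:
    # Sketch of the "play list": captured items are queued with their
    # questions, but an item identical to an earlier approved one is
    # auto-passed so the tester is not asked again.
    def __init__(self):
        self.passed = set()   # fingerprints of items a tester approved
        self.pending = []     # (fingerprint, question) pairs to review

    def capture(self, item, question):
        fingerprint = hashlib.sha256(item).hexdigest()
        if fingerprint in self.passed:
            return "auto-passed"
        self.pending.append((fingerprint, question))
        return "queued"

    def acknowledge(self, fingerprint, ok):
        # The tester acknowledges (ok=True) or falsifies (ok=False) an item.
        self.pending = [(f, q) for f, q in self.pending if f != fingerprint]
        if ok:
            self.passed.add(fingerprint)
```

A real tool would compare rendered images or sound fragments; a byte-level hash is the simplest stand-in for that comparison.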
  • 38. Performance Testing
 The topic is complex, but creating the tests can be quite straightforward
 actions like "generate load <how much>" and "check response time <max wait>"
 use one tool to generate load (like JMeter), another to run the "normal" functional test
 Often performance testing isn't testing, but closer to research
 analysis of bottlenecks and hot spots (for example, discontinuities in response times mean buffers are full)
 application of statistical techniques like queuing theory
 how to realistically mimic large-scale production situations in smaller test environments
 The three controls you can/should address:
 hardware (equipment, infrastructure, data centers, etc.)
 software (programs, database models, settings, etc.)
 demands (1 second may cost 10 times more than 2 seconds)
See also: "Load Testing for Dummies", Scott Barber, gomez.com

Organization
 Much of the success is gained or lost in how you organize the process
 which parts of the teams are involved
 who does test design
 who does automation
 what to outsource, what to keep in-house
 Write a plan of approach for the test development and automation
 scope, assumptions, risks, planning
 methods, best practices
 tools, technologies, architecture
 stakeholders, including roles and processes for input and approvals
 team
 ...
 Assemble the right resources
 testers, lead testers
 automation engineer(s)
 managers, ambassadors, ...
Test design is a skill . . . Automation is a skill . . . Management is a skill . . . . . . and those skills are different . . .
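Actions like "generate load <how much>" and "check response time <max wait>" can be sketched in a few lines. In practice the load would come from a dedicated tool such as JMeter; the function names below are illustrative:

```python
import time

def generate_load(work, how_much):
    # Illustrative "generate load <how much>" action: invoke the given
    # operation repeatedly (a real setup would delegate to a load tool
    # and run the calls concurrently).
    for _ in range(how_much):
        work()

def check_response_time(operation, max_wait_seconds):
    # Illustrative "check response time <max wait>" action: time one
    # call of the operation and compare it to the allowed maximum.
    start = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - start
    return elapsed <= max_wait_seconds
```

Keeping these as keyword actions means the performance scenario reads like any other test module, while the measurement details stay in the automation layer.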
  • 39. Life Cycles
 Product life cycles, rather than task life cycles
 The project planning and execution largely determine when the products are created
 system development
 test development
 test automation

Typical Time Allocation
Chart: effort over time for test development and for automation.
  • 40. Keywords and ABT in Agile
 Keywords are suitable for agile projects:
 tests are easier to create and understand, in particular for non-programmers
 they allow test development without a need for details that haven't been defined yet
 automated tests can quickly follow changes in the system under test
 Action Based Testing is in itself quite agile
 focused on products and cooperation
 flexible in process; in fact, each test module can have its own process
 test modules are usually very suitable to drive system development
 However, ABT relies on high-level test design for best results
 identifying test modules
 in larger-scale projects this may require at least some overall test planning activities that are not necessarily easy to do in a single scrum team

Test Development and Automation in Sprints
Diagram: in the agile life cycle, the product owner and team take items from the product backlog into sprints. Test module definition (optionally with the product owner and team) feeds the products: test module development (with test re-use), interface definition, and action automation (with automation re-use), leading to test execution. Test development draws on user stories, documentation, domain understanding, acceptance criteria, and product owner questions, and covers situations and relations through main-level, interaction, and cross-over test modules.
  • 41. Test automation in sprints
 Try to keep the main test modules at a similar level as the user stories and acceptance criteria
 test modules can double as a modeling device for the sprint
 Aim for "sprint + zero", meaning: try to get test development and automation "done" in the same sprint, not the next one
 "the next one" means work clutters up, part of the team is not working on the same sprint, work is done twice (manually and automated), ...
 Make sure you can do the interface mapping by hand (using developer-provided identifications)
 you can do it earlier, before the UI is finalized, and
 recording of actions (not tests) will go better
 Also plan for additional test modules:
 low-level testing of the interaction with the system under test (like UIs)
 crossing over to other parts of the system under test

Fitting in sprints
 Agree on the approach:
 questions like: does "done" include tests developed and automated?
 do we see testing and automation as distinguishable tasks and skill sets?
 is testability a requirement for the software?
 Create good starting conditions for a sprint:
 automation technology available (like hooks, calling functions, etc.)
 how to deal with data and environments
 understanding of subject matter, testing, automation, etc.
 Make testing and automation part of the evaluations
 Address tests and automation also in hardening sprints
 Just like for development, use discussions with the team and product owners to deepen understanding:
 also to help identify negative, alternate and unexpected situations
  • 42. Summary
 Keywords are one of the techniques for automated testing, in addition to record & playback and scripting
 By themselves keywords are not a silver bullet; they need a good approach, careful planning and good organization to be successful
 Keywords can work for GUI testing, but equally well for a variety of other purposes

Some References
1. Testing Computer Software, Cem Kaner, Hung Nguyen, Jack Falk, Wiley
2. Lessons Learned in Software Testing, Cem Kaner, James Bach, Bret Pettichord, Wiley
3. Experiences of Test Automation, Dorothy Graham, Mark Fewster, Addison-Wesley, 2012
4. Automating Software Testing, Dorothy Graham, Mark Fewster, Addison-Wesley
5. Action Based Testing (overview article), Hans Buwalda, Better Software, March 2011
6. Action Figures (on model-based testing), Hans Buwalda, Better Software, March 2003
7. Integrated Test Design & Automation, Hans Buwalda, Dennis Janssen and Iris Pinkster, Addison-Wesley
8. Soap Opera Testing (article), Hans Buwalda, Better Software Magazine, February 2005
9. Testing with Action Words, Abandoning Record and Playback, Hans Buwalda, EuroSTAR 1996
10. QA All Stars, Building Your Dream Team, Hans Buwalda, Better Software, September 2006
11. The 5% Solutions, Hans Buwalda, Software Test & Performance Magazine, September 2006
12. Happy About Global Software Test Automation, Hung Nguyen, Michael Hackett, et al., Happy About