Better Test Designs to Drive Test Automation Excellence

Test execution automation is often seen as a technical challenge: a matter of applying the right technology, tools, and smart programming talent. However, such efforts often fail to meet expectations, with results that are difficult to manage and maintain, especially for large and complex systems. Hans Buwalda describes how the choices you make in designing tests can make or break a test automation project. Join Hans to discover why good automated tests are not the same as the automation of good manual tests, and how to break down tests into modules, building blocks in which each has a clear scope and purpose. See how to design test cases within each module to reflect that module's scope and nothing more. Hans explains how to tie modules together with a keyword-based test automation framework that separates the automation details from the test itself to enhance maintainability and improve ROI.


Transcript

  • 1. BT8 Concurrent Session, 11/8/2012, 2:15 PM
    "Better Test Designs to Drive Test Automation Excellence"
    Presented by: Hans Buwalda, LogiGear Corporation
    Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073; 888-268-8770; 904-278-0524; sqeinfo@sqe.com; www.sqe.com
  • 2. Hans Buwalda LogiGear Corporation An internationally recognized expert in testing, Hans Buwalda is the pioneer of keyword-driven test automation, an approach now widely used throughout the testing industry. Originally from the Netherlands, Hans is the CTO of California-based LogiGear, directing the development of the successful Action Based Testing™ methodology for keyword-driven test automation and its supporting TestArchitect™ toolset. Prior to joining LogiGear, he served as project director at CMG (now Logica) in Europe. Hans speaks frequently at international conferences on concepts such as Soap Opera Testing, Three Holy Grails of Test Development, Testing in the Cold, and Jungle Testing. Hans is coauthor of Integrated Test Design and Automation.  
  • 3. Better Software East 2012, Presentation BT8: Better Test Designs to Drive Test Automation Excellence. Hans Buwalda, LogiGear.
    Testing Under Pressure: diagram showing specification, development, and test all squeezed against the DEADLINE.
  • 4. Testing Under Pressure
    Develop tests in time:
    • Test design
    • Auditing, acceptance
    • Preparations
    • Automation
    Key Components for Success in Testing
    • Appropriate test design
    • Comprehensive automation architecture: manageable, maintainable
    • Management of the tests: tests and test scripts are products that need to be managed
    • Management of the test process: managers want to know what is going on
    • Documentation
    • Clear and useful reporting: progress, results
    • Quality assurance: efficient and effective involvement of stakeholders, users, auditors
  • 5. Typical Problems with Automated Testing
    • No clear direction
    • Test design not well thought through
    • Automation lacking architecture: not transparent and hard to manage
    • Test process not well organized: test designers and test automators need (and have) different skill sets; stakeholders don't know what is happening
    • Testing underestimated or avoided: testing is difficult and expensive; it looks unattractive to spend money on testing; not considered something to think about
    • Focus on tools and technology: engineers at the wheel
    The 5% Rules of Test Automation
    • No more than 5% of all test cases should be executed manually
    • No more than 5% of all efforts around testing should involve automating the tests
  • 6. Keywords, Action Words
    • Common in automation tools nowadays (but with different styles)
    • Identify tasks for both test development and automation
    • The test developer creates tests using actions
    • Each action consists of a keyword and arguments
    • The automation task focuses on automating the actions
    • Each action is automated only once
    Example (spreadsheet rows, reconstructed): "new product" with number P-9009, name Sledge Hammer, quantity 5; "add quantity" for P-9009 with 20, 3, and 6; "check quantity" for P-9009 expecting 34. On the automation side, each keyword maps to one function: def action_NewProduct, def action_AddQuantity, def action_CheckQuantity. (A minimal sketch of such a dispatcher follows below.)
    Keywords need a method
    • By themselves keywords don't provide much scalability; they can even backfire and make automation more cumbersome. A method can help tell you which keywords to use when, and how to organize the process.
    • Today we'll look at Action Based Testing (ABT): it addresses test management, test development, and automation, with a large focus on test design as the main driver for automation success.
    • Central deliverables in ABT are the "test modules", developed in spreadsheets: each test module contains "test objectives" and "test cases"; each test module is a separate (mini) project and can involve different stakeholders.
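The mapping from keywords to automation functions can be illustrated with a short sketch. This is a hypothetical Python dispatcher, not the TestArchitect implementation; the action names and the in-memory inventory exist only for this example.

```python
# Hypothetical keyword-driven dispatcher (illustration only, not TestArchitect).
# Each row of a test module is a keyword followed by its arguments; each
# keyword is automated exactly once as a function.

inventory = {}  # product number -> {"name": ..., "quantity": ...}

def action_new_product(number, name, quantity):
    inventory[number] = {"name": name, "quantity": int(quantity)}

def action_add_quantity(number, quantity):
    inventory[number]["quantity"] += int(quantity)

def action_check_quantity(number, expected):
    actual = inventory[number]["quantity"]
    assert actual == int(expected), f"{number}: expected {expected}, got {actual}"

ACTIONS = {
    "new product": action_new_product,
    "add quantity": action_add_quantity,
    "check quantity": action_check_quantity,
}

def run(rows):
    # The test developer only writes rows; the automation only writes actions.
    for keyword, *arguments in rows:
        ACTIONS[keyword](*arguments)

# The rows mirror the spreadsheet example from the slide.
run([
    ("new product", "P-9009", "Sledge Hammer", "5"),
    ("add quantity", "P-9009", "20"),
    ("add quantity", "P-9009", "3"),
    ("add quantity", "P-9009", "6"),
    ("check quantity", "P-9009", "34"),
])
```

The point of the split is maintainability: a change in how quantities are added touches one function, not every test that adds quantities.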
  • 7. Action Based Testing
    Test development: a test development plan is broken down into Test Module 1, Test Module 2, ..., Test Module N, each with its own test objectives and test cases; the actions used in the modules are implemented once in the action automation layer.
    Example of a test module
    • Consists of (1) an initial part, (2) test cases, and (3) a final part
    • Focus is on readability and a clear scope
    • Navigation details are avoided, unless they're meant to be tested
  • 8. Example of a test module, "low level"
    In "low level" tests the focus is typically on the user's interaction with the UI (or other interactions with the system's interfaces). The details of the interaction are visible in the test, since they are the target of the test. The "right" level of abstraction depends on the scope of the test, and is a matter of test design.
    Reconstructed module "Screen Flow": the initial part starts the system as user "john"; test case TC 01 "Order Form" clicks the "new order" control in the "main" window and checks that the "new order" window exists; the final part closes the application from the "welcome" window. (A plain-text sketch of this module follows below.)
    Test Design
    • Effective test breakdown (into test modules): make sure every test module has a clear focus; keep different kinds and levels of tests separate
    • Right level of actions: as "high level" as possible, hiding as many details as possible, but not if the details are relevant for the test
    It is my belief that successful automation is not a technical challenge. It is most of all a test design challenge.
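As a rough illustration of how such a module might live outside the automation code, here is a plain-text rendering of the "Screen Flow" module and a minimal parser. The tab-separated format and the section keywords are assumptions made for this sketch, not the actual ABT file format.

```python
# Hypothetical plain-text form of the "Screen Flow" module from the slide.
# The first cell of each row is either a section marker or an action keyword;
# the remaining cells are its arguments.
SCREEN_FLOW = """\
TEST MODULE\tScreen Flow
INITIAL
start system\tjohn
TEST CASE\tTC 01 Order Form
click\tmain\tnew order
check window exists\tnew order
FINAL
close application\twelcome
"""

SECTION_MARKERS = {"TEST MODULE", "INITIAL", "TEST CASE", "FINAL"}

def parse_module(text):
    """Group the action rows of a module under their section headers."""
    sections = []
    for line in text.splitlines():
        cells = line.split("\t")
        if cells[0] in SECTION_MARKERS:
            sections.append({"header": cells, "rows": []})
        elif sections:
            sections[-1]["rows"].append(cells)
    return sections

for section in parse_module(SCREEN_FLOW):
    print(section["header"], section["rows"])
```

Keeping the module as data means reviewers and business owners can read it without reading automation code, while the dispatcher from the earlier sketch can still execute it.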
  • 9. The Three "Holy Grails" of Test Design
    • Metaphor to depict three main steps in test design
    • Using "grail" to illustrate that there is no one perfect solution, but that it matters to pay attention (to search)
    • About the quality of tests, but also about scalability and maintainability, in particular in big projects
    The three grails: (1) organization of tests into test modules, (2) the right approach for each test module, (3) the proper level of detail in the test specification.
    What's the trick...
  • 10. What's the trick...
    • Have or acquire facilities to store and organize your content
    • Edit your stuff
    • Decide where to put what: assign and label the shelves
    • Put it there
    • If the organization is not sufficient anymore, add to it or change it
    Breakdown Criteria
    • Straightforward criteria: architecture of the system under test (client, server, protocol, subsystems, components, modules, ...); functionality (customers, finances, management information, ...); kind of test (navigation flow, negative tests, response time, ...); ambition level (smoke test, regression, requirement based, aggressive, ...)
    • Additional criteria: stakeholders (accounting, compliance, ...); complexity of the test (keep complex, hard-to-understand tests in separate modules); technical aspects of execution (special hardware, multi-station testing, ...); overall project planning (availability of information, timelines, priorities, sprints, ...); risks involved (devote extra test modules to high-risk areas: high complexity, high impact)
  • 11. Properties of a good breakdown
    • Reflects the level of tests
    • Well differentiated and clear in scope
    • Balanced in size and amount
    • Modules mutually independent
    • Fitting the priorities and planning of the project
    Example breakdown
    • Tests of the user interface: does function key F4 work; does listbox xyz show the right values; is the tab order correct
    • Form tests: do all the forms (dialogs, screens, pages) work; can data be entered and is it stored well; is displayed data correct; split these from everything else
    • Function tests: do individual functions work
    • Alternate paths in use cases: can I count the orders; can I cancel a transaction
    • End-to-end tests: do all components of a system work well together in implementing the business processes, like enter a sales order, then check inventory and accounting
    • Tests with specific automation needs
    • Tests of non-UI functions
    • High-ambition tests (aggressive tests), like multi-station tests: can I break the system under test
  • 12. Example of an application under test
    • Various item types: tests, actions, interface definitions, data sets, folders, ...
    • Various operations: open; cut, copy, paste; check out; ...
    • Various ways to initiate an operation: context menu, with or without accelerator key; main menu, with or without accelerator key; toolbar; shortcut key; function key; drag and drop; double click; ...
    Defining some modules
    • Test modules for operations: primary and alternate paths; various values for fields like "comment" in check-in; paste in other projects; copy and paste various groups; not necessarily on each item type
    • Test modules for items: address all item types at least once; on each item type perform each operation, not necessarily each variant of each operation
    • UI handling: try for each UI command whether it starts the intended operation; not necessarily on each item type or operation variant
    • Concurrency: try concurrency-sensitive operations with multiple stations in varying concurrency scenarios, with and without local "refreshes"; not necessarily each item type or operation variant; certainly not each UI command included
  • 13. What is probably not a good design
    • Navigational and functional tests are mixed; "over-checking": for example, a test for an insurance policy premium calculation that also checks the existence of the window for data entry
    • You have to change all of them for every new release of the system under test
    • All test modules have a similar design
    • Test modules are dependent on each other
    • You can't start developing any test modules early in the life cycle
    Symptoms
    • Tediousness in the test and test automation process
    • No sense of control
    • Complaining people
    • Unnecessarily high test maintenance: changes in the system under test impact many tests; hard to understand which tests need to be modified
    • Difficulties in running any test: teams start "debugging" tests
  • 14. "Thou Shall Not Debug Tests..."
    • Large and complex test projects can be hard to "get to run"; if they are, however, start by taking a good look at your test design again...
    • Rule of thumb: don't debug tests. If tests don't run smoothly, make sure: lower-level tests have been successfully executed first, so the UI flow in the AUT is stable; actions and interface definitions have been tested sufficiently with their own test modules, so the automation can be trusted; and that your test modules are not too long and complex.
    Case Study
    • Large IT provider
    • New version of one of their major web sites
    • Test scope was user acceptance test (functional acceptance); the users were the "business owners"
    • Development was off-shore
  • 15. Case Study
    • Test development was done separately from automation; time-line for test development: May – Oct; time-line for automation (roughly): Jan – Feb
    • All tests were reviewed and approved by the business owners; acceptance was finished by the end of the test development cycle
    Example of a Test Development Plan (Nr | Module | Business Owner | Date to BO):
    1 | Portal Navigation, Audience | Robyn Peterson | 05/23
    2 | Portal Navigation, Search | Ted Jones | 05/27
    3 | Membership, registration | Steve Shao | 06/03
    4 | Portal Navigation, Category | Ted Jones | 06/08
    5 | Portal Navigation, Topic and Expert | Ted Jones | 06/13
    6 | Access Control | Mike Soderfeldt | 06/17
    7 | Portal Navigation, Task | Ted Jones | 06/22
    8 | Contact DSPP | Ted Jones | 06/27
    9 | Portal search | Mike Soderfeldt | 07/01
    10 | Membership, review and update | Steve Shao | 07/05
    11 | Program contact assignment | Alan Lai | 07/11
    12 | Company, registration | Steve Shao | 07/14
    13 | Catalog, view and query | Robyn Peterson | 07/19
    14 | Site map | Ted Jones | 07/25
    15 | Membership, affiliation | Steve Shao | 07/28
    16 | Learn about DSPP | Ted Jones | 08/01
    17 | Products and services | Steve Shao, Robyn Peterson | 08/08
    18 | What's new | Ted Jones | 08/11
    19 | Company, life cycle | Steve Shao, Alan Lai | 08/17
    20 | Specialized programs | Ted Jones, Steve Shao | 08/22
    21 | Customer surveys | Ted Jones | 08/29
    22 | Software downloads | Mike Soderfeldt | 09/01
    23 | Newsletters | Ted Jones | 09/06
    24 | Internationalization and localization | Ted Jones | 09/13
    25 | Membership, life cycles | Steve Shao | 09/19
    26 | Collaboration, forums | Ted Jones | 09/23
    27 | Collaboration, blogs | Mike Soderfeldt | 09/28
    28 | Collaboration, mailing lists | Ted Jones | 10/03
  • 16. Review Process with Stakeholders
    START: the test team sends a draft module to the stakeholder, who reviews it for coverage and correctness. If changes are needed, the stakeholder returns notes (additions, corrections) and the test team receives and processes them; otherwise the stakeholder returns a notice of approval and the test team marks the module as "Final". END.
    Case Study, Results
    • All tests were developed and reviewed on schedule; many notes and questions during the test development phase
    • The automation covered 100% of the tests: all actions were automated, thus automating all test modules
    • The test development took an estimated 18 person-months: one on-shore resource, two off-shore resources
    • The automation took between one and two months, focused on actions; most time was spent handling changes in the interface (layout of pages, etc.)
  • 17. Grail 2: Approach per Test Module
    • Organize the test design process around the test modules
    • Plan the test module: when to develop (is enough specification available); when to execute (make sure the functionality at action level is well-tested and working already)
    • Process: analysis of requirements; formulation of "test objectives"; creation of "test cases"
    • Identify stakeholders and their involvement: users, subject matter experts, developers, auditors
    • Choose testing techniques if applicable: boundary analysis, decision tables, transition diagrams, soap opera testing, ...
    Eye on the ball: Scope
    • Always know the scope of the test module
    • The scope should be unambiguous
    • The scope determines many things: what the test objectives are; which test cases to expect; what level of actions to use; what the checks are about; and which events should generate a warning or error (if a "lower" functionality is wrong)
  • 18. Make explicit test objectives
    Example: test objective TO-3.51, "The exit date must be after the entry date." A test case covers it by entering an employment for Bill Goodfellow with entry date 2013-10-02 and exit date 2013-10-01, then checking for the error message "The exit date must be after the entry date."
    Grail 3: Specification level, choosing actions
    • Scope of the test determines the specification level
    • As high level as appropriate, with as few arguments as possible; use default values for non-relevant arguments
    • Clear names (verb + noun usually works well); to standardize action names, standardize both the verbs and the nouns, so "check customer" versus "verify client" (or vice versa); tests are not C++ code: avoid "technical habits" like mixed case and (worse) underscores
    • Manage the actions
    • Document the actions
    • By-product of the test design
    (A small sketch of standardized action names linked to test objectives follows below.)
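Here is one way standardized action names and explicit test objectives could hang together. The registration decorator, the objective IDs, and the stubbed actions are all invented for this illustration; they are not an ABT or TestArchitect API.

```python
# Hypothetical registry of actions under standardized verb + noun names,
# plus an explicit link from a test case to the objective it covers.
ACTIONS = {}

def action(name):
    """Register an automation function under its standardized action name."""
    def register(func):
        ACTIONS[name] = func
        return func
    return register

@action("enter employment")            # verb + noun; "enter" is the standard verb
def enter_employment(name, entry_date, exit_date):
    print(f"entering employment for {name}: {entry_date} .. {exit_date}")

@action("check error message")         # "check", never "verify"
def check_error_message(expected):
    print(f"checking for error message: {expected!r}")

TEST_OBJECTIVES = {
    "TO-3.51": "The exit date must be after the entry date.",
}

def run_test_case(objective_id, rows):
    """Record which objective the case covers, then execute its action rows."""
    print(f"covers {objective_id}: {TEST_OBJECTIVES[objective_id]}")
    for keyword, *arguments in rows:
        ACTIONS[keyword](*arguments)

# Mirrors the TO-3.51 example from the slide.
run_test_case("TO-3.51", [
    ("enter employment", "Bill Goodfellow", "2013-10-02", "2013-10-01"),
    ("check error message", "The exit date must be after the entry date."),
])
```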
  • 19. Grail 3: Example of using actions
    In this real-world example the first "sequence number" for teller transactions on a given day is retrieved, using a search function:
    • the "#" marks an expression, in this case a variable
    • the ">>" assigns to a variable for use later on in the test
    At the low level, the section "Capture initial number" presses keys and navigates (F7, page tab "Scan Criteria"), waits for the controls to load, enters the search criteria (breadcrumb "general functions > search", scan direction "Backward", business date taken from the variable #bus date), clicks "go", waits for the search results, and stores the captured sequence number in the variable seq num.
    At a higher level the same step is a single action: get sequence number >> seq num.
    (A sketch of how such a capture could look in automation code follows below.)
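The variable capture (">> seq num") and the "#" expressions can be mimicked in a few lines. The variable store, the helper names, and the faked lookup below are assumptions of this sketch, not the real action implementation behind the slide.

```python
# Hypothetical sketch of capturing a value in one action ("... >> seq num")
# and resolving "#"-style expressions in later arguments.
variables = {}

def lookup_first_sequence_number(business_date):
    """Stand-in for the real low-level search sequence on the teller screen."""
    return 1042  # invented value so the sketch stays self-contained

def resolve(argument):
    """Resolve a '#variable' expression against the variable store."""
    if isinstance(argument, str) and argument.startswith("#"):
        return variables[argument[1:].strip()]
    return argument

def action_get_sequence_number(business_date, store_as):
    # The high-level action hides the key presses, breadcrumb navigation and
    # search criteria; it only exposes the captured result.
    variables[store_as] = lookup_first_sequence_number(resolve(business_date))

variables["bus date"] = "2012-11-08"
action_get_sequence_number("#bus date", "seq num")
print(variables["seq num"])   # the value is now available to later actions
```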
  • 20. Mid-level actions
    • Most tests will have low-level and high-level actions:
      low level: generic operations; know the interface, don't know the functionality (examples: "select menu item", "expand tree node", ...)
      high level: business-oriented operations; know the functionality, don't know the interface (examples: "enter purchase order", "check inventory of article")
    • For systems with many screens and fields, consider an in-between layer:
      mid level: screen oriented; know the interface, and know a bit of the functionality as well
    • Examples of mid-level actions: "assign all address fields", "check all address fields" (see the sketch after this slide)
    Test Design and Agile
    • Keywords are suitable for agile projects: tests are easier to create and understand, in particular for non-programmers; they allow test development without a need for details that haven't been defined yet; automated tests can quickly follow changes in the system under test
    • Action Based Testing in itself is quite agile: focused on products and cooperation; flexible in process, in fact each test module can have its own process; test modules are usually very suitable to drive system development
    • Test design will need to find its place in agile projects: identifying test modules; in larger-scale projects this may require at least some overall test planning
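A mid-level action such as "assign all address fields" might be composed from low-level field entries, as in the sketch below. The window name, the field list, and the stubbed enter helper are hypothetical; a real project would take them from its interface definitions.

```python
# Hypothetical mid-level action: screen oriented, knows the address fields,
# but not the business meaning of the data it enters.
def enter(window, control, value):
    """Low-level stand-in: type a value into one control of a window."""
    print(f"enter {value!r} into {window}/{control}")

ADDRESS_FIELDS = ["street", "city", "zip code", "country"]

def assign_all_address_fields(window, values):
    """Mid-level: fill every address field of the given window in one action."""
    for control, value in zip(ADDRESS_FIELDS, values):
        enter(window, control, value)

def check_all_address_fields(window, expected):
    """Mid-level counterpart: read back and compare (stubbed for the sketch)."""
    for control, value in zip(ADDRESS_FIELDS, expected):
        print(f"check {window}/{control} == {value!r}")

assign_all_address_fields("customer details",
                          ["1 Main St", "Springfield", "12345", "US"])
```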
  • 21. Test Development and Automation in Sprints
    From the product backlog, the product owner and team plan each sprint; within a sprint, test module definition (optional) and test module development by the product owner and team produce the test modules; interface definitions and action automation are re-used across sprints and feed test execution.
    Summary
    • Keywords are one of the techniques for automated testing, in addition to record & playback and scripting
    • In themselves keywords are not a silver bullet; they need a good approach, careful planning, and good organization to be successful
    • Test design, not technology, is usually the dominating success factor for automation
  • 22. References
    • Action Based Testing, Hans Buwalda, Better Software, March 2011
    • Action Figures (on model-based testing), Hans Buwalda, Better Software, March 2003
    • Automating Software Testing, Dorothy Graham and Mark Fewster, Addison-Wesley, 1999
    • Experiences of Test Automation, Dorothy Graham and Mark Fewster, Addison-Wesley, 2012
    • Happy About Global Software Test Automation, Hung Nguyen, Michael Hackett, et al., Happy About, 2006
    • Integrated Test Design & Automation, Hans Buwalda et al., Addison-Wesley, 2002
    • Lessons Learned in Software Testing, Cem Kaner, James Bach, and Bret Pettichord, Wiley, 2002
    • QA All Stars, Building Your Dream Team, Hans Buwalda, Better Software, September 2006
    • Soap Opera Testing, Hans Buwalda, Better Software, February 2005
    • Testing Computer Software, Cem Kaner, Hung Nguyen, and Jack Falk, Wiley, 1999
    • Testing with Action Words, Abandoning Record and Playback, Hans Buwalda, EuroSTAR, 1996
    • The 5% Solutions, Hans Buwalda, Software Test & Performance Magazine, September 2006