Test Automation Frameworks Final

Gain a deeper understanding of the strategy and design approaches to automation frameworks. Warning: one size does not fit all! Call Utopia (630) 566-4722 to learn more.

Speaker Notes
  • Structured testing experience: has created test plans, strategies, and scripts; has performed manual testing, results analysis, etc. Exposure to test tools: has worked with (or at least seen) one of the contemporary tools and is familiar with their high-level capabilities, such as object recognition and programming capabilities.
  • Also called an architecture. The word “code” is used intentionally. The test engine can be built utilizing one of the commercially available tools, or purchased via one of the …
  • Or, another way of saying, “Can’t I just hit the record button?” Reliable (executes to completion with accurate results). Maintainable (test suite maintenance can be performed within available time windows while maintaining positive ROI). Scalable (test coverage can be expanded efficiently, with existing test resources, while maintaining positive ROI).
  • This is pretty well accepted: whatever your approach, it had better be modular and data-driven (a general term for test suite execution being governed at some level by an external data source).
  • In my experience, the leading cause of automation failure is attempting to implement with resources that don’t have the skills and background. What’s more, you end up taking focus away from their strengths – software testing and domain expertise. Separating these two functions allows everyone involved to focus on their strengths. If test definition and execution are all rolled up into the same code, additional test cases must be added by the automation engineers. Your framework should have a simple test definition interface that testers can use to create their tests and scenarios.
  • The primary goal of building an automation architecture, rather than defaulting to record and playback, is to separate the process of test definition from test execution. This eliminates the need for a “super tester” who has both subject matter expertise AND automation expertise. If done correctly, the SMEs can define their test scenarios using a simplified instruction set (stored in an Excel spreadsheet or some other external data source). The test engine, built and maintained by a core group of automation experts, executes the scenarios and reports the results. In this example, the test engine consists of reusable scripts and functions that “know how” to perform a common set of test actions such as logging in, navigation, data input, and verification. The definition of the tests – which actions to perform and what specific data should be used – is built in an external data source.
  • Consistent means that if I have a “Create Order” business process, each time I execute that process as part of my testing I go through pretty much the same steps. Finite means I know how many BPs there are, and what they are, before I start. End-to-end means I’m concerned with the result of the BP, not incremental, lower-level test cases like screen navigation, user interface state, etc.
  • Many times we see test scripts that were created directly from requirements – any analysis of the test conditions (i.e. things to be tested) is often treated as work product and not kept. However, because we have different types of requirements (e.g. functional, security, user interface, etc.), we have different types of test cases. These varying types of test cases are often jammed together in manual test scripts because we want to spend as little time as possible performing manual tests. <CLICK> Why go through a particular set of screens multiple times, when you can jam all of your testing related to a particular process into just one pass? There’s nothing wrong with that – it’s just efficient manual testing. However, it’s not structured very well for automation. Why? Because, as we’ll discuss in more detail later, successful automation requires some type of reusable, data-driven architecture. This manual test script is not reusable for other testing purposes. <CLICK> However, if we extract the test cases from the test scripts and look at them as a whole, we can start to visualize what our automation approach might be. Why? Because we start to see types of test cases that are similar and repetitive – a good indication of an automation candidate. Let’s look at this a little deeper on the next slide.
  • To highlight the importance of capturing and tracking test cases separately, I want to take a look at a typical manual test script. As I’ve highlighted, manual test scripts often contain many types of test cases. <CLICK> If we look at the manual steps, we see that we have some business process level test cases, user interface, input validation, etc. This doesn’t seem too bad from an automation perspective, but we’re not looking at the big picture. <CLICK> If we take into consideration that we have dozens, hundreds, or even thousands of manual scripts, we quickly see that we’re going to have a mess in terms of understanding how to approach automation. So how do we begin to untangle this mess? As we saw in the last slide, we need to extract, or “distill,” our test cases into similar levels and types. <CLICK> Once we have grouped our test cases into similar levels and types, we can start to envision what automation approaches (i.e. architectures) we might want to use. <CLICK> As the slide indicates, you likely won’t use the same automation approach for all types of testing – most successful automation functions that we have seen have specific architectures adapted to specific testing needs. We’ll discuss automation architectures in a little more detail later in the presentation.
  • Actual implementation specifics will depend on the capabilities of your test tool

Transcript

  • 1. Advanced Test Automation Practical Application of Test Automation Frameworks Lee Barnes Founder & Chief Technology Officer Utopia Solutions
  • 2. Agenda
    • Assumptions & Definitions
    • Introduction to Automation Frameworks
    • Overview of Common Frameworks
    • Selecting the Right Framework
    • Case Study
    • Q & A
  • 3. Assumptions
    • Audience has:
      • Exposure to automation best practices
      • Exposure to a contemporary test automation tool (e.g. WinRunner, QuickTest Pro, SilkTest, QA Run, Robot)
    • Addressing technical issues such as custom interface objects not in scope
    • Presentation is tool agnostic
  • 4. Definitions
    • Data-driven
    • Business Process
    • Keyword
    • Test case
    • Test scenario (script)
    • Test engine
  • 5. Introduction to Automation Frameworks
  • 6. What is an “Automation Framework” ?
    • Structured approach to automation utilizing data-driven techniques, reusable test code assets and standards/guidelines
    [Diagram: Test Engine (Reusable Scripts/Modules, Utility Functions), governed by Automation Standards & Guidelines, consumes Test Scenario Input and produces Test Scenario Results]
  • 7. Why?
    • Promote the three key characteristics automated test suites must possess to be successful:
      • Reliable
      • Maintainable
      • Scalable
  • 8. How?
    • A properly designed framework will possess these characteristics if it incorporates three key attributes:
      • Modular data-driven design
      • Separation of test definition and test execution
      • Error detection and recovery
  • 9. Modular & Data-Driven Design
    • Promotes maintainability through reuse and reduced test suite size
    • Promotes scalability by abstracting test data from test code
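    The data-driven idea above can be sketched in a few lines of Python — a minimal, tool-agnostic illustration (the CSV columns and the `create_order` action are hypothetical, not from the deck):

    ```python
    import csv
    import io

    # One reusable test action; scaling coverage means adding data rows, not code.
    def create_order(sku, qty):
        # A real suite would drive the application under test here.
        return f"order created: {sku} x {qty}"

    # Test data abstracted from test code into an external source (CSV here).
    scenario_csv = io.StringIO(
        "sku,qty\n"
        "SKU10045,100\n"
        "SKU10046,25\n"
    )

    results = [create_order(row["sku"], int(row["qty"]))
               for row in csv.DictReader(scenario_csv)]
    print(results)
    ```

    Adding a third order to the CSV exercises the same module again with no script changes — the maintainability/scalability point the slide is making.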
  • 10. Separation of Test Definition and Execution
    • Promotes scalability by allowing test creation to be performed by non-technical testers and SME’s without test code modification
    • Allows testers/business analysts and automation engineers to focus on their strengths – separately
    • Enables execution of single test scenario across multiple platforms
  • 11. Separation of Test Definition and Test Execution
    [Diagram: software testing and functional subject matter experts supply Test Scenario Input through a Test Definition Interface; automation experts build the Test Engine (Reusable Scripts/Modules, Utility Functions), which produces Test Scenario Results]
  • 12. Error Detection and Recovery
    • Promotes reliability by enabling unattended execution
    • Best Practices
      • Use tool capabilities for the simple stuff
        • Trapping unexpected pop-ups
        • Handling/logging unexpected return values from built-in tool functions (e.g. object recognition errors)
      • Keep it simple and consistent
      • Carefully plan execution status into framework
  • 13. Handling Execution Status
    • Need to track progress and handle recovery after restart
    • Assess possible statuses - PASS, FAIL, INCOMPLETE, etc.
    • Build into test suite modules & input files…
    Business Process Module:
      Initialize status to INCOMPLETE
      …
      Perform business process
      …
      If validation condition = TRUE
        Set status = PASS
      Else
        Set status = FAIL
      Log execution status
    (status is also recorded in the Test Scenario Input File)
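    The status-handling pattern on this slide might look like the following in Python — a sketch only; the function and status names mirror the slide, everything else is illustrative:

    ```python
    # Initialize status to INCOMPLETE before any steps run, so a crash
    # mid-process leaves a status a restarted run can detect and skip or redo.
    def run_business_process(validate):
        status = "INCOMPLETE"
        try:
            # ... perform business process steps here ...
            status = "PASS" if validate() else "FAIL"
        except Exception:
            status = "INCOMPLETE"   # unexpected error: neither passed nor failed
        return status               # caller logs this to the input file

    print(run_business_process(lambda: True))   # validation holds -> PASS
    print(run_business_process(lambda: False))  # validation fails -> FAIL
    ```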
  • 14. Common Automation Frameworks Business Process Framework Keyword Framework Integrated Framework
  • 15. Business Process (BP) Framework
  • 16. Business Process Framework
    • Main components of test engine parallel AUT functionality at a business process level
    • Sample Components:
    • Login
    • PickOrder
    • AdjustInventory
    • VerifyInventory
    BP Functions
    • LogResults
    • LogError
    • OpenTestInput
    • ReadTestInput
    Supporting Functions
    • Design/Coding Standards
    • Test Definition Guide
    • Maintenance &
    • Execution Guide
    Standards & Guidelines
  • 17. Business Process Framework Test Engine
    Test Scenario File:
    Login, test_user_01, password_01
    Create_Order, <ord1>,SKU10045,100,…
    Create_Order, <ord2>,SKU10045,100,…
    Ship_Order,<ord1>,…
    Ship_Order,<ord2>,…
    Verify_Inventory,SKU10045,…
    (engine components: Business Process Scripts, Utility Functions; output: Test Scenario Results)
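    A BP engine of this shape can be sketched as a dispatcher from scenario-file lines to business process modules. This is an assumption-laden illustration — the module bodies are placeholders, though the names follow the slide:

    ```python
    # Each business process module "knows how" to perform one end-to-end BP.
    def login(user, password):
        return f"logged in as {user}"

    def create_order(ref, sku, qty):
        return f"{ref}: {qty} x {sku}"

    BP_MODULES = {"Login": login, "Create_Order": create_order}

    # The engine reads scenario lines and dispatches to the named module.
    def run_scenario(lines):
        log = []
        for line in lines:
            name, *args = [field.strip() for field in line.split(",")]
            log.append(BP_MODULES[name](*args))
        return log

    log = run_scenario([
        "Login, test_user_01, password_01",
        "Create_Order, <ord1>, SKU10045, 100",
    ])
    print(log)
    ```

    Note the test creators only ever touch the scenario lines; the automation team owns `BP_MODULES` — the separation of definition and execution discussed earlier.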
  • 18. Business Process Framework
    • Relatively simple to implement
    • Straightforward test definition interface for test creators
    • Efficient execution
    Advantages Disadvantages
    • Requires automation expertise if business process steps change
    • Requires automation expertise to support new business processes
    • Not suitable for other types of testing
  • 19. Business Process Framework
    • Ideal for testing that involves the consistent execution of a finite number of end-to-end business processes
  • 20. Keyword (KW) Framework
  • 21. Keyword Framework
    • Main components of test engine parallel object/action level functionality required to perform test actions
    • Sample Components:
    • FieldSetText
    • FieldGetText
    • ListSelectItem
    • ButtonClick
    Object/Action Functions
    • LogResults
    • LogError
    • OpenTestInput
    • ReadTestInput
    Supporting Functions
    • Design/Coding Standards
    • Test Definition Guide
    • Maintenance &
    • Execution Guide
    Standards & Guidelines
  • 22. Keyword-Driven Model Test Engine
    Test Scenario File:
    VerifyState, login_page, EXISTS
    SetText, user_id, testuser01
    SetTextSecure, password, j4ghjs39
    ClickButton, submit
    VerifyState, main_menu, EXISTS
    …
    (engine components: Object/Action Functions, Utility Functions; output: Test Scenario Results)
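    The keyword model differs from the BP engine only in granularity: the dispatch table holds object/action functions rather than whole business processes. A hedged sketch (the keyword functions here just record what a real tool adapter would do):

    ```python
    actions = []  # stands in for driving the application under test

    # Object/action functions keyed by keyword, mirroring the slide's input.
    KEYWORDS = {
        "SetText":     lambda obj, val: actions.append(f"set {obj}={val}"),
        "ClickButton": lambda obj:      actions.append(f"click {obj}"),
        "VerifyState": lambda obj, st:  actions.append(f"verify {obj} {st}"),
    }

    def execute(lines):
        for line in lines:
            keyword, *params = [field.strip() for field in line.split(",")]
            KEYWORDS[keyword](*params)

    execute([
        "VerifyState, login_page, EXISTS",
        "SetText, user_id, testuser01",
        "ClickButton, submit",
    ])
    print(actions)
    ```

    The slide's disadvantages fall out of this structure: every field-level action is one input line, so scenario files grow large and execution does more per-line dispatch work.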
  • 23. Keyword Framework
    • Suitable for many types of testing
    • Automation expertise not required to maintain or add business processes
    Advantages Disadvantages
    • More complex test definition interface
    • Can result in large test scenario input files
    • Can result in inefficient execution
  • 24. Keyword Framework
    • Ideal framework for:
      • Verbatim conversion of manual test scripts
      • Detailed field-level testing
      • Teams with technical test creators
      • Serving as base for integrated frameworks (discussed next)
  • 25. Integrated Business Process / Keyword Framework
  • 26. Integrated Framework
    • Business Process framework wrapped around a keyword framework
    • Contains all Keyword framework components plus business process templates to allow test scenarios to be defined at the BP level
  • 27. Integrated BP/KW Framework
    Test Scenario File:
    Login, test_user_01, password_01
    Create_Order, <ord1>,SKU10045,100,…
    Create_Order, <ord2>,SKU10045,100,…
    …
    Business Process Template (Login):
    VerifyState, login_page, EXISTS
    SetText, user_id, <user_id>
    SetTextSecure, password, <pw>
    ClickButton, submit
    (engine components: Business Process Templates, Object/Action Functions, Utility Functions; output: Test Scenario Results)
  • 28. Integrated Framework
    • Incorporates the benefits of both BP and KW frameworks
    Advantages Disadvantages
    • Can be complex to implement and maintain
    • Can muddy the functional/technical boundary
  • 29. Integrated Framework
    • Ideal for testing that requires both business process and keyword frameworks
  • 30. Selecting the Right Framework
  • 31. Aligning Automation Framework with Test Objectives
    • Testing objectives come in different sizes and shapes
    • Choose the framework that best aligns with your needs:
      • Test coverage
      • Type of test cases (GUI, BP, data validation, etc.)
      • Quantity of test cases
    • “Distill” your test cases (discussed later)
  • 32. Understanding Test Coverage Test Script
    • Login as user test01
    • Select Create PO link and verify user has access
    • Verify Create PO page is defaulted with the following data …
    • Verify SKU field rejects non-numeric data
    • Create PO using the following data …
    • Verify …
    Requirements
    • Functional
      • Order Mgmt.
      • Inventory Mgmt.
      • Customer Mgmt.
    • Security
      • User access
    • User Interface
      • Screen navigation
      • Input validation
  • 33. What’s Missing ? Test Script
    • Login as user test01
    • Select Create PO link and verify user has access
    • Verify Create PO page is defaulted with the following data …
    • Verify SKU field rejects non-numeric data
    • Create PO using the following data …
    • Verify …
    Requirements
    • Functional
      • Order Mgmt.
      • Inventory Mgmt.
      • Customer Mgmt.
    • Security
      • User access
    • User Interface
      • Screen navigation
      • Input validation
  • 34. Distilling Test Cases Test Cases
    • Create PO
      • Create PO under credit limit
      • Create PO over credit limit
    • Receive Order
      • Full receipt
      • Partial receipt
    • Input Field Validation
      • Valid values
      • Max length
    Test Script
    • Login as user test01
    • Select Create PO link and verify user has access
    • Verify Create PO page is defaulted with the following data …
    • Verify SKU field rejects non-numeric data
    • Create PO using the following data …
    • Verify …
    Requirements
    • Functional
      • Order Mgmt.
      • Inventory Mgmt.
      • Customer Mgmt.
    • Security
      • User access
    • User Interface
      • Screen navigation
      • Input validation
  • 35. Why Distill?
    • Login as user test01
    • Select Create PO link and verify user has access
    • Verify Create PO page is defaulted with the following data …
    • Verify SKU field rejects non-numeric data
    • Create PO using the following data …
    • Verify Customer credit limit...
    Manual Testing: each manual step above maps to a test case type – User Interface, Input Validation, or Business Process – with each type recurring many times across the script.
  • 36. Distilled Test Cases Mapped To Frameworks
    Automated Test Suite: Business Process test cases → Business Process Testing Framework; User Interface test cases → User Interface Testing Framework; Input Validation test cases → Input Validation Testing Framework
  • 37. Case Study Multi-Platform Warehouse Management System
  • 38. Situation
    • Warehouse Management System
      • Web-based client used by office staff
      • Subset of functionality available to warehouse staff via hand-held RF device
      • One-to-one mapping of screens & fields between platforms
    • Existing manual regression test suite
      • Focused on BP-level testing
      • 35% of test scenarios executed on both platforms
    • Objectives
      • Create automated test suite to include existing test coverage
      • Single input source for multi-platform test scenarios
  • 39. Multi-Platform Framework Automation Options
    • BP framework with parallel modules for each business process requiring execution on both platforms
    • KW framework with parallel functions for each object/action requiring support on both platforms
    • KW framework with “intelligent” functions that “sense” the execution platform and execute appropriately
    • Integrated (BP wrapped around KW) framework
    What criteria would you use to decide?
  • 40. Multi-Platform Framework Conceptual Design
    [Diagram: functional SMEs supply Test Scenario Input through a Test Definition Interface; automation engineers maintain the Test Engine, which drives the Desktop and Hand-held Execution Frameworks and produces Test Scenario Results]
  • 41. Best Framework Choice?
    • Integrated framework to incorporate the advantages of both BP and KW frameworks
    • Low-level “intelligent” object/action functions best support multi-platform execution
  • 42. Integrated BP/KW Framework
    Test Scenario File:
    Login, test_user_01, password_01
    Create_Order, <ord1>,SKU10045,100,…
    Create_Order, <ord2>,SKU10045,100,…
    …
    Business Process Template (Login):
    VerifyState, login_page, EXISTS
    SetText, user_id, <user_id>
    SetTextSecure, password, <pw>
    ClickButton, submit
    (engine components: Business Process Templates, Object/Action Functions, Utility Functions; output: Test Scenario Results)
  • 43. Translate Manual Scenario to Input File
    • Log into warehouse A3 as testuser01
    • Set inventory levels for SKU1001/Bin 47 and SKU1010/Bin 49 to 150
    • Pick order ORD0034
    • Pick order ORD0046
    • Verify inventory level for SKU1001/Bin 47 is 140
    • Verify inventory level for SKU1010/Bin 49 is 135
    Test Scenario Input File:
    Login, test01, test01, warehouseA3
    Adjust_Inventory, SKU1001, BIN047, 150
    Adjust_Inventory, SKU1010, BIN049, 150
    Pick_Order, ORD0034
    Pick_Order, ORD0046
    Verify_Inventory, SKU1001, BIN047, 140
    Verify_Inventory, SKU1010, BIN049, 135
  • 44. Translate BP’s to Keyword Steps Test Scenario Input File Business Process Template
  • 45. Test Driver
    Get BP Instruction: Login, test01, test01, warehouse3
    Load BP Template:
      VerifyState, Login Page, EXISTS
      SetText, User ID, <user_id>
      SetText, Password, <pw>
      SetText, Warehouse, <warehouse>
      ClickButton, Submit
    Map BP Data to BP Steps, then Execute BP Steps:
      VerifyState, Login Page, EXISTS
      SetText, User ID, test01
      SetText, Password, test01
      SetText, Warehouse, warehouse3
      ClickButton, Submit
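    The "Map BP Data to BP Steps" stage of the driver amounts to placeholder substitution. A minimal sketch, using the slide's own template and data (the `expand` helper is hypothetical):

    ```python
    # Business process template: keyword steps with <placeholders>.
    LOGIN_TEMPLATE = [
        "VerifyState, Login Page, EXISTS",
        "SetText, User ID, <user_id>",
        "SetText, Password, <pw>",
        "SetText, Warehouse, <warehouse>",
        "ClickButton, Submit",
    ]

    # Substitute data from the BP instruction into each template step.
    def expand(template, data):
        steps = []
        for step in template:
            for key, value in data.items():
                step = step.replace(f"<{key}>", value)
            steps.append(step)
        return steps

    steps = expand(
        LOGIN_TEMPLATE,
        {"user_id": "test01", "pw": "test01", "warehouse": "warehouse3"},
    )
    print(steps)
    ```

    The expanded steps then feed the same keyword engine shown on slide 22, which is exactly why the integrated framework can reuse all of the keyword components.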
  • 46. Required Keyword Functions ?
    • All object/action pairs required to execute multi-platform test scenarios
    • Additional hybrid keywords for common step sequences (e.g. login, navigation, etc.)
    • Review existing manual test scenarios and get input from your customers – test suite users
  • 47. Multi-Platform Keyword Functions
    Multi-Platform Code:
    Function Set_Text, Parameters Object, Value
      Get execution_platform
      Select Case execution_platform
        Case BROWSER
          status = Set web Object = Value
        Case RF
          status = Set RF Object = Value
      Get debug level
      If debug level = TRACE log execution detail
      Return status
    • Manage multi-platform differences at the tool level if possible
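    The slide's pseudocode might render in Python roughly as follows — a sketch under the assumption that the two platform adapters wrap the tool's web and RF-device calls (the adapter bodies here are stand-ins):

    ```python
    EXECUTION_PLATFORM = "BROWSER"   # would come from suite configuration

    # Hypothetical platform adapters; a real suite would call the test tool.
    def set_web_text(obj, value):
        return f"web: {obj}={value}"

    def set_rf_text(obj, value):
        return f"rf: {obj}={value}"

    # "Intelligent" keyword: one Set_Text that senses the execution platform.
    def set_text(obj, value, platform=None):
        platform = platform or EXECUTION_PLATFORM
        if platform == "BROWSER":
            status = set_web_text(obj, value)
        elif platform == "RF":
            status = set_rf_text(obj, value)
        else:
            raise ValueError(f"unknown platform: {platform}")
        return status

    print(set_text("user_id", "test01"))                 # browser path
    print(set_text("user_id", "test01", platform="RF"))  # hand-held path
    ```

    Because the platform branch lives inside the keyword function, one scenario file drives both platforms — the single-input-source objective from slide 38.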
  • 48. Supporting Functions
    • Reading input files
    • Results logging, progress tracking, etc.
    • Test suite metrics (pass/fail, component usage, etc.)
    • Error handling
    • Custom object interaction
    • Custom synchronization
  • 49. Supporting Functions Results Logging
    • Start with the end in mind
      • Data
      • Level of detail
      • Format
      • Medium
  • 50. Sample Summary Result File
    Accumulated Execution Metrics:
    ACCUMULATED TOTALS            PERCENTAGE
    ------------------------------------------
    STEPS EXECUTED:            14
    TEST CASES EXECUTED:        5
    TEST CASES PASSED:          3      60.0
    TEST CASES FAILED:          1      20.0
    TEST CASES INCOMPLETE:      1      20.0
    SCRIPT/APPLICATION ERRORS:  3
    Summarized Test Case Results:
    TEST CASE                       RESULT
    ---------------------------------------------
    TC0001.........................PASS
    TC0002.........................PASS
    TC0003.........................PASS
    TC0004.........................INCOMPLETE
    TC0005.........................FAIL
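    Accumulating totals like those in the summary file is a simple tally over per-test-case statuses. A sketch using the slide's example data (output format simplified):

    ```python
    from collections import Counter

    # Per-test-case results as logged during the run (the slide's example).
    results = {"TC0001": "PASS", "TC0002": "PASS", "TC0003": "PASS",
               "TC0004": "INCOMPLETE", "TC0005": "FAIL"}

    totals = Counter(results.values())
    executed = len(results)

    print(f"TEST CASES EXECUTED: {executed}")
    for status in ("PASS", "FAIL", "INCOMPLETE"):
        pct = 100.0 * totals[status] / executed
        print(f"TEST CASES {status}: {totals[status]} {pct:.1f}")
    ```

    Deciding up front that you want percentages as well as counts is the "start with the end in mind" point on the next slide: the logger must capture per-case statuses, not just a running pass count.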
  • 51. Sample Detail Log File
    Page 1:
    BEGIN STEP: BP0001  START TIME: 14:46:48
    S0001 Verify_State.............OK
    S0002 Set_Text.................OK
    S0003 Set_Text.................OK
    S0004 Set_Text.................OK
    S0005 Click_Button.............OK
    BP0001.........................OK
    END STEP: BP0001  END TIME: 14:48:04
    Page 2:
    BEGIN STEP: BP0002  START TIME: 14:48:06
    S0001 TSL error handler invoked
    S0001 Err: -10011, function: set_window
    S0001 Verify_State............FAIL
    One or more script errors were detected
    BP0002........................FAIL
    END STEP: BP0002  END TIME: 14:50:21
  • 52. Results Logging/Metrics Functions (called by the Test Driver and Keyword Functions)
    • CreateLogFiles – Creates log files
    • StepBegin – Writes step start time to log files
    • StepEnd – Writes step status and end time to log files, input file, metric totals & other mediums
    • StepDetail – Writes detailed step execution status to log files
    • LogMetrics – Writes accumulated test execution metrics to log files
    • LogUsage – Logs component usage
  • 53. Questions… and Answers
    Please direct any future questions to: [email_address]