Test automation scripts are largely run against stable functionality with repeatable results. But effective automation does not have to be limited to running reliable tests against a fixed code base; rather, you can determine the right level of automation to meet your project’s needs. Three levels of test automation will be discussed in this presentation: Level 1 tests exercise the simplest aspect of functionality in a module, Level 2 tests explore all module aspects except interfaces to other components, and Level 3 tests examine the deepest level of functionality in a module, including interfaces to other components. Join Jim Trentadue as he presents the academic view of each level and provides practical examples. This session will arm QA and test professionals with the ability to improve the logic in their automation scripts and increase the chances of detecting more defects by defining the appropriate levels of test automation.
Defining the Optimal Level of Test Automation
W4
Optimizing Test Automation
10/18/2017 10:15:00 AM
Defining the Optimal Level of Test
Automation
Presented by:
Jim Trentadue
Independent Consultant
Brought to you by:
350 Corporate Way, Suite 400, Orange Park, FL 32073
888-268-8770 · 904-278-0524 · info@techwell.com · https://www.techwell.com/
Jim Trentadue
Independent Consultant
Jim Trentadue has more than seventeen years of experience as a coordinator and
manager in the software testing field. In his various career roles in testing, Jim
has focused on test execution, automation, management, environment
management, standards deployment, and test tool implementation. In the area
of offshore testing, Jim has worked with multiple large firms to develop and
coordinate cohesive relationships. As a guest speaker at the University of South
Florida's software testing class, Jim mentors students on the testing industry and
trends for establishing future job searches and continued training. Currently, Jim
has started his own Test Automation Foundations & Principles workshop aimed
at helping manual testers get started with automation.
10/26/2017
Overview of Automation Levels
Level 1 tests exercise the simplest aspect of the functionality in a module
Level 2 tests explore all module aspects except interfaces to other components
Level 3 tests exercise the deepest module functions, including interfaces to other systems
Case Study by Steve Allott, Software Test Automation: Fewster & Graham
Level 1 Automation ‐ Definition
Level 1 tests exercise the simplest aspect of the functionality in a module
• It is usually straightforward to test manually
• It is easy to automate
• The automated test is likely to work
• It is unlikely to find a new bug
Case Study by Steve Allott, Software Test Automation: Fewster & Graham
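As a concrete illustration, a Level 1 test along these lines might look like the minimal sketch below. The `create_order` function and its fields are hypothetical stand-ins for a module under test, not code from the case study:

```python
# Level 1 sketch: exercise the simplest aspect of a module's functionality.
# `create_order` and its fields are hypothetical, for illustration only.

def create_order(customer_id, items):
    """Toy order-creation function standing in for the module under test."""
    if not items:
        raise ValueError("an order needs at least one item")
    return {"customer_id": customer_id, "items": list(items), "status": "NEW"}

def test_create_order_happy_path():
    # The straightforward case: easy to run, likely to pass, and
    # unlikely to expose a new bug -- exactly the Level 1 profile.
    order = create_order(customer_id=42, items=["widget"])
    assert order["status"] == "NEW"
    assert order["items"] == ["widget"]

test_create_order_happy_path()
print("Level 1 check passed")
```

A test like this earns its keep as a regression guard rather than a bug finder, which is why Level 1 candidates map naturally onto already-delivered sprint features.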
L1 Automation ‐ Regression
Epic – Enhance Sales Order
Feature Story 1: Order Placement – Sprint 1
• Place New Order
• Generate Quote Data
Feature Story 2: Order Processing – Sprint 2
• Work Order Entries
• Additional Data Parameters
Feature Story 3: Order Fulfillment – Sprint 3
• Confirmation Generation
• Ship To / Mail To Screens
Existing Automation: Generate Quote Data, Additional Data Parameters, Confirmation Generation
Automation Candidates: Ship To / Mail To Screens
Level 2 Automation ‐ Definition
Level 2 tests explore all module aspects except interfaces to other components
• It is possible but time consuming to test manually
• It looks easy to automate, but doesn’t always turn out so
• The automated test is likely to have bugs
• It sometimes finds a bug
Case Study by Steve Allott, Software Test Automation: Fewster & Graham
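The Level 2 profile of "tedious manually, cheap to loop over in automation" can be sketched as follows, assuming a hypothetical `parse_quantity` validation routine inside a single module (no interfaces to other components involved):

```python
# Level 2 sketch: explore all aspects of a module except its interfaces
# to other components. `parse_quantity` is hypothetical, for illustration.

def parse_quantity(text):
    """Parse a quantity field, enforcing only the module's own rules."""
    value = int(text.strip())
    if not 1 <= value <= 999:
        raise ValueError("quantity out of range")
    return value

# Boundary cases that are tedious to walk through manually but cheap to automate.
valid_cases = {"1": 1, " 42 ": 42, "999": 999}
invalid_cases = ["0", "1000", "-5", "abc", ""]

for text, expected in valid_cases.items():
    assert parse_quantity(text) == expected

for text in invalid_cases:
    try:
        parse_quantity(text)
        raise AssertionError(f"{text!r} should have been rejected")
    except ValueError:
        pass  # rejected as intended

print("all boundary cases covered")
```

Note that the test data itself now carries logic (boundaries, malformed inputs), which is also where such tests pick up bugs of their own, matching the caution above.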
Level 3 Automation ‐ Definition
Level 3 tests exercise the deepest module functions, including interfaces to other systems
• Difficult if not impossible to test manually
• Hard to automate
• Unlikely to run successfully, repeatedly
• Very likely to find a bug
Case Study by Steve Allott, Software Test Automation: Fewster & Graham
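One common way to make a Level 3 test repeatable is to substitute the other system with a controllable test double. The sketch below assumes a hypothetical `ship_order` function that depends on an external inventory service; the double is built with Python's standard `unittest.mock`:

```python
# Level 3 sketch: exercise a module's interface to another system.
# `ship_order` and the inventory client are hypothetical; the external
# service is replaced with a mock so the test can run repeatedly.
from unittest import mock

def ship_order(order, inventory_client):
    """Reserve stock through an external inventory service, then ship."""
    if not inventory_client.reserve(order["sku"], order["qty"]):
        return "BACKORDERED"
    return "SHIPPED"

# Script the external interface: first call fails (out of stock), second succeeds.
client = mock.Mock()
client.reserve.side_effect = [False, True]

assert ship_order({"sku": "A1", "qty": 3}, client) == "BACKORDERED"
assert ship_order({"sku": "A1", "qty": 3}, client) == "SHIPPED"
assert client.reserve.call_count == 2
print("interface behaviour verified")
```

Against the real system, the same test would inherit the instability the slide describes, which is why Level 3 suites tend to mix doubles for repeatability with occasional end-to-end runs for bug finding.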
L3 Automation – Problem Statement
Test Case 1 – Facts:
▪ Executed 12 times in the release
▪ Module has been modified in each sprint (10)
▪ TC1 passes, but related complex tests are failing
Bolster by Automation:
▪ Data – frequent & random data injection
▪ Navigation – navigation paved more paths
▪ Timing – sporadic timeouts on forms / pages are captured precisely
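Two of the bolstering tactics above (random data injection and precise timing capture) can be sketched together as follows; `submit_form` and the timeout value are hypothetical stand-ins for the real form under test:

```python
# Sketch of bolstering a passing-but-shallow test: random data injection
# plus precise timing capture. `submit_form` is a hypothetical stand-in.
import random
import time

def submit_form(fields):
    """Toy stand-in for posting a form to a page."""
    time.sleep(0.001)  # simulate a round trip
    return all(value != "" for value in fields.values())

TIMEOUT_SECONDS = 2.0  # illustrative threshold

for attempt in range(5):
    # Data: inject fresh, randomized values on every run.
    fields = {"name": random.choice(["Ann", "Bo", "Cy"]),
              "qty": str(random.randint(1, 999))}

    # Timing: capture how long each submission takes, precisely.
    start = time.perf_counter()
    ok = submit_form(fields)
    elapsed = time.perf_counter() - start

    assert ok, f"submission failed for {fields}"
    assert elapsed < TIMEOUT_SECONDS, f"sporadic timeout after {elapsed:.3f}s"

print("5 randomized submissions within the timeout")
```

Varying the data and asserting on elapsed time gives the stable TC1 a chance to catch the intermittent failures its fixed-path version keeps missing.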
Session Recap
Define the different levels of automation you can develop
Outline practical models for each level
Apply various levels of automation on each release
Maintain metrics on the success of the higher levels
Thank you!
Jim Trentadue
Senior QA Manager, VGT
Jim.Trentadue@vgt.net
October 18th, 2017