2. MEET THE SPEAKER
Sudhrity Mondal
Director of Sales Engineering
Sudhrity is a technology advisor and technical sales leader at Testim.io.
He specializes in automated functional/performance testing, test modeling,
test data management, service virtualization, continuous delivery/testing,
and DevOps.
With 20+ years of software development, architecture, and consulting
experience, he is passionate about making sure that his customers see and
derive value from the successful adoption of technology.
He has architected and helped clients deploy solutions using a variety of
technologies, including agile, cloud, virtualization, and web, across different
industries: telecommunications, media, retail, finance, insurance, banking,
utilities, and transportation.
3. AGENDA
1. TEST AUTOMATION, CHALLENGES
2. BUILDING TEST AUTOMATION STRATEGY FOR AGILE TEAMS
3. GOOD PRACTICES
4. NEW TRENDS
5. What is Test Automation?
A software testing technique that runs tests and validates actual results
against expected results with minimal or no human touch.
● Author & execute tests using a Test Automation tool
● Tools emulate user interactions & verify test steps using programmatic assertions
● Run repetitive tests with minimal manual intervention
● Reduces testing time & costs and improves quality & TTM
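As a minimal illustration of "programmatic assertions," here is a plain-Python sketch; the `apply_discount` function is invented for the example, and plain `unittest` stands in for whatever automation tool you choose:

```python
import unittest

# Hypothetical function under test: a simple discount calculator.
def apply_discount(price, percent):
    return round(price * (1 - percent / 100), 2)

# The tool (here, plain unittest) compares actual against expected results.
class TestApplyDiscount(unittest.TestCase):
    def test_ten_percent_discount(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)  # actual vs. expected

    def test_zero_discount(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)
```

Run with `python -m unittest` to execute the checks with no manual intervention.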
6. Why Test Automation?
Test Automation Benefits
● Wider Test Data Coverage: comprehensive testing of business scenarios
● Faster TTM: saves time, quicker release cycles
● Fast Feedback: results delivered quickly
● Automated Safety Net: reusable, repeatable automated smoke & regression testing
● Less Error Prone: minimal human interaction
● Free up Testers: invest in exploratory testing or other tasks
● Improve Test Coverage & Quality: automated test coverage, software quality based on risk factors, accurate benchmarking
● Continuous Testing: integration with the SDLC pipeline (CI/CD)
7. Barriers to Test Automation: Why Do Some Fail?
Resources & Priorities
● Initial investment (cost, time, effort)
● Competing corporate initiatives and priorities
● Lack of a clear mandate and realistic goals
● Not treated like other software development projects
Culture & Skillset
● Team's approach, attitude, resistance to change
● Lack of experience, false sense of security
● Underestimating the amount of time needed
● Relying on programming languages only
● Creating large, end-to-end tests
Tools & Environment
● Not using proper tools & frameworks
● Legacy or constantly changing code
● Not having a controlled and stable test environment
Process
● Not reusing automation code
● Not having a test data strategy in place
● Not making your automated tests readable
11. Test Automation Strategy Considerations
1. Where Do We Begin?
2. Applying Agile Principles to Test Automation
3. Test Automation Candidates
4. Test Modeling and Test Coverage
5. Choosing Automation Tools & Implementation
6. Test Data & Service Virtualization
7. Managing & Maintaining Automated Tests
8. Test Automation Metrics
9. Continuous Testing
10. ROI: The Cost of Test Automation
12. Test Automation Strategy Considerations
01: Where Do We Begin?
● Identify what hurts the most: cost, TTM, quality
● Identify what delivers the biggest value: application, pain point
● Choose automation based on overall needs: implement a steel thread for experience and validation
● Continue with additional implementations: additional applications, pain points
● Identify scope and use a multi-layered approach: use the "Testing Pyramid" (Mike Cohn) to locate your starting point
13. Test Automation: Getting Started
01 Scope: Identify the type of test to automate
02 Framework: Plan strategy & identify a framework
03 Tool: Select a tool based on strategy
04 Design & Execute: Apply agile & best practices
05 Adapt & Refine: Learn and adapt based on needs
06 Continuous Testing: Integrate with CI/CD
14. Test Automation Strategy Considerations
02: Applying Agile Principles to Test Automation
● Keep it simple: use an incremental and iterative approach to test automation and test design
● Every dev iteration has its own testing phase: implement regression testing every time new functions or logic are released; user acceptance tests are executed at the end of each sprint
● Whole-team approach: testers and developers work together
● Apply agile coding practices to creating tests: use BDD, TDD as appropriate for your environment
● Invest time to do it right & learn by doing
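One way to apply the BDD structure the slide mentions is to shape each test as Given/When/Then steps; a sketch using a hypothetical shopping cart (plain `unittest`, since the deck leaves the framework open):

```python
import unittest

# Hypothetical system under test: a minimal shopping cart.
class Cart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

class TestCartCheckout(unittest.TestCase):
    def test_total_reflects_added_items(self):
        # Given an empty cart
        cart = Cart()
        # When two items are added
        cart.add("book", 12.50)
        cart.add("pen", 1.50)
        # Then the total equals the sum of their prices
        self.assertEqual(cart.total(), 14.0)
```

In a full BDD setup the Given/When/Then steps would live in feature files (e.g., Cucumber or behave); the comment structure above keeps the same intent in plain code.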
15. Test Automation Strategy Considerations
Agile Test Automation
Agile software delivery life cycle: Sprint 1 → Sprint 2 → ... → Sprint N → Regression Testing → Acceptance Testing → Release
● New automated tests are written each sprint
● New automated tests from sprints are added to the existing regression test suite
● A high-level acceptance test plan is created and executed
16. Test Automation Strategy Considerations
03: Test Automation Candidates
What should be automated?
● Safety net through automated smoke & regression tests
● Unit, component, API, load/performance, GUI-level tests
● Repetitive tests, tests that run against different data sets
● Tests integrated with CI/CD builds and releases
What shouldn't be automated?
● One-time tests, applications that are not testable
● Usability, exploratory, ad-hoc tests
● Tests that never fail, or that don't have predictable results
17. Test Automation Strategy Considerations
04: Test Modeling and Test Coverage
● Create Model: model tests based on a business scenario or user story
● Specify Input/Output: input data, expected results
● Identify Tests: use heuristics and risk factors to identify the paths to be tested
● Create & Execute: author & execute automated tests
● Evaluate Results: compare actual against expected results
● Reduce Overtesting: reduce overtesting of already-tested paths
● Improve Coverage: tune factors and the model to improve test coverage
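The modeling steps above can be sketched with a toy path generator; the checkout model and its choices below are invented for illustration:

```python
from itertools import product

# Hypothetical model: each step of a checkout flow with its possible choices.
model = {
    "login": ["guest", "registered"],
    "payment": ["card", "paypal"],
    "shipping": ["standard", "express"],
}

def generate_paths(model):
    """Enumerate every combination of choices: one candidate test path each."""
    steps = list(model)
    return [dict(zip(steps, combo)) for combo in product(*model.values())]

paths = generate_paths(model)
print(len(paths))  # 2 * 2 * 2 = 8 candidate paths
```

A real model-based tool would then apply risk factors to prioritize these paths and prune already-covered ones, rather than running all combinations.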
18. Test Automation Strategy Considerations
05: Choosing Automation Tools & Implementation
Evaluating Tools
● Identify requirements & key criteria
● Leverage the Pugh Matrix technique for analysis if needed
● Choose agile-friendly tools
● Evaluate one tool at a time
Implementation
● Code vs. codeless/scriptless
● Target users (manual testers, test automation engineers)
● Execution environment (desktop/server, web, mobile device, mobile emulators)
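A Pugh matrix comparison can be as simple as weighted +1/0/-1 ratings against a baseline tool; the criteria, weights, and scores below are purely illustrative:

```python
# Minimal Pugh matrix sketch: each candidate is rated against a baseline
# per criterion (+1 better, 0 same, -1 worse); ratings here are invented.
criteria = ["ease of use", "CI/CD integration", "reporting", "cost"]
weights  = [3, 2, 2, 1]  # assumed relative importance of each criterion

candidates = {
    "Tool A": [+1, 0, +1, -1],
    "Tool B": [0, +1, 0, +1],
}

def pugh_score(ratings, weights):
    # Weighted sum; the highest score is the strongest candidate.
    return sum(r * w for r, w in zip(ratings, weights))

for name, ratings in candidates.items():
    print(name, pugh_score(ratings, weights))  # Tool A 4, Tool B 3
```

The value of the technique is less the arithmetic than forcing the team to agree on criteria and weights before comparing tools.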
19. Test Automation Strategy Considerations
06: Test Data & Service Virtualization
Test Data
● Identify test data requirements
● Subset & mask production data and provision test data
● Synthetic test data generation
● Data security, GDPR
Virtualized Services for Integration Testing
● Create mocks & stubs for dependent services
● Use service virtualization techniques to simulate dependent APIs
● Virtualize databases to reduce dependence on shared DBs
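A minimal sketch of stubbing a dependent service with `unittest.mock`; the payment API and its response shape are assumptions for the example, not a real interface:

```python
from unittest.mock import Mock

# Hypothetical dependent service: a payment API client the app would call.
payment_api = Mock()
payment_api.charge.return_value = {"status": "approved", "txn_id": "T-1"}

# Code under test (illustrative): place an order through the stubbed service.
def place_order(api, amount):
    result = api.charge(amount=amount)
    return result["status"] == "approved"

assert place_order(payment_api, 25.00)
# The stub also records how it was called, so the interaction can be verified.
payment_api.charge.assert_called_once_with(amount=25.00)
```

Full service virtualization tools extend the same idea to protocol-level simulators shared across teams, instead of in-process mocks.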
20. Test Automation Strategy Considerations
07: Managing & Maintaining Automated Tests
Test Management
● Deliver features as described in the requirements
● Link requirements to test cases & test results for easy visibility
● Use an agile-friendly test management tool
● Use a test automation solution that integrates with your test management solution
Test Maintenance
● Costly & time-consuming
● Use an AI/ML-based self-healing test automation solution to cut down on maintenance of automated tests
● Use test design best practices (later in this presentation)
21. Test Automation Strategy Considerations
08: Test Automation Metrics
Software Test Metrics
● Monitor & measure software testing & test automation activities
● Give insights into the team's progress, productivity, and the quality of the AUT
● Usually convey a result (absolute measure) or a prediction (derivative)
● Examples: time taken to run a test (result); defects created vs. defects fixed (predictive)
Test Automation Metrics
● # of manual vs. automated tests
● Mean Time to Diagnosis (MTD)
● Bugs found in automation
● Test automation flakiness
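A few of these metrics can be computed directly from run records; the record format and counts below are invented for illustration, not a real tool's schema:

```python
# Illustrative run data; field names are assumptions for the example.
runs = [
    {"test": "login", "passed": True, "retried": False},
    {"test": "checkout", "passed": True, "retried": True},  # passed only on retry
    {"test": "search", "passed": False, "retried": False},
    {"test": "profile", "passed": True, "retried": False},
]
manual_tests, automated_tests = 120, 80

automation_ratio = automated_tests / (manual_tests + automated_tests)
flakiness = sum(r["retried"] for r in runs) / len(runs)  # retries hint at flakiness
pass_rate = sum(r["passed"] for r in runs) / len(runs)

print(f"automated: {automation_ratio:.0%}, flaky: {flakiness:.0%}, pass: {pass_rate:.0%}")
```

Tracking these per sprint turns the raw numbers into the predictive trends the slide describes.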
22. Test Automation Strategy Considerations
09: Continuous Testing
● Shift testing left and integrate with build/release
● Continuous testing with CI/CD integration
● Continuous component-level performance testing
23. Test Automation Strategy Considerations
10: ROI: The Cost of Test Automation
● Automation cost = tools cost + labor cost to create an automated test + test maintenance cost
● If the automation cost is less than the manual execution cost of the test, it's an indicator that automation is a good choice
● ROI quickly adds up with each re-run of your automated test suite
● Because it's critical that you get a good return on your test automation investment, there are some things you may not automate
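The ROI formula above can be sketched as a quick break-even calculation; all cost figures below are invented for illustration:

```python
# Back-of-the-envelope ROI sketch; every figure here is an assumption.
tool_cost = 500.0          # per-test share of tooling cost
authoring_cost = 300.0     # labor to create the automated test
maintenance_cost = 50.0    # upkeep per run cycle
manual_run_cost = 120.0    # cost of executing the same test manually once

def automation_roi(runs):
    automation_cost = tool_cost + authoring_cost + maintenance_cost * runs
    manual_cost = manual_run_cost * runs
    return manual_cost - automation_cost  # positive means automation pays off

for runs in (5, 10, 20):
    print(runs, automation_roi(runs))  # negative at 5 and 10 runs, positive at 20
```

With these numbers the investment only pays off after roughly a dozen re-runs, which is exactly why one-time tests are poor automation candidates.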
25. Test Automation Design Patterns
● Single Responsibility Principle: model the behavior of the application using page objects (e.g., login, home page) for reusability, easy debugging, and maintenance
● Screenplay Pattern: breaks page objects down into smaller, more maintainable, reliable, and readable parts
● Ports and Adapters: decouple test code so slow components can be swapped with fast simulators, running the test and the application under test in the same process
● Presenter First: organize code and development behaviors to create completely tested software using TDD; a modification of the MVC approach
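A minimal page-object sketch of the first pattern. The `Driver` class is a stub standing in for a real browser driver (e.g., Selenium WebDriver) so the example stays self-contained; the locators are invented:

```python
# Stub driver: records interactions instead of driving a real browser.
class Driver:
    def __init__(self):
        self.fields, self.clicked = {}, []

    def type(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        self.clicked.append(locator)

class LoginPage:
    """Page object: one class owns all knowledge of the login page,
    so locator changes are fixed in one place, not in every test."""
    USER, PASS, SUBMIT = "#user", "#pass", "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USER, user)
        self.driver.type(self.PASS, password)
        self.driver.click(self.SUBMIT)

driver = Driver()
LoginPage(driver).login("alice", "s3cret")
assert driver.clicked == ["#submit"]
```

Tests then read as business actions (`LoginPage(driver).login(...)`) rather than element-by-element scripting, which is the reusability and maintenance benefit the slide names.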
26. Test Automation Process
1. Analyze: understand the test objectives, the data, and what needs to be tested and verified.
2. Author (Write/Record): identify what the start and end conditions are for each test; code or record the requirements scenario into an independent automated solution with assertions.
3. Execute: run each test multiple times to make sure it is reliable, including running with CI/CD.
4. Evaluate: verify that the automated test is doing what is expected.
5. Communicate: share results with the team to gain confidence in the automation.
6. Repeat/Refactor: refactor the test to make it more reliable if needed.
27. Areas of Testing: GUI Testing
1. Size and position of GUI elements
2. Clear and well-aligned images
3. Font and alignment of text
4. Date and numeric fields
5. Screen validations
6. Navigation (links)
7. Usability conditions and data integrity
8. Error messages
9. Required fields
10. Abbreviation inconsistencies
11. Progress bars
12. Shortcuts
13. Screen rendering
28. Test Automation Design Strategy & Best Practices
1. Prioritize
2. Reduce, recycle, reuse
3. Create structured, short, single-purpose tests
4. Write independent and isolated tests
5. Compose complex tests from simple steps
6. A test's initial state should always be consistent
7. Use "wait for" or similar mechanisms to add waits for synchronization
8. Use abstractions
9. Add validations at turnover points
10. Reduce the occurrence of conditions
11. Use setup and tear-down
12. Divide and conquer
13. Use data-driven instead of repeated tests
14. Use test design patterns and principles
15. Use a stable test environment
16. Create automated tests that are resistant to small changes in the UI
17. Name your tests wisely
18. Take screenshots for failure investigation
19. Set up detailed automation test reporting
20. Do not rely ONLY on UI test automation
21. ...
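Practice #13 (data-driven instead of repeated tests) can look like this in plain `unittest` with `subTest`; the validation rule under test is invented for the example:

```python
import unittest

# Hypothetical function under test.
def is_valid_username(name):
    return name.isalnum() and 3 <= len(name) <= 12

class TestUsernameDataDriven(unittest.TestCase):
    # One data-driven test replaces many near-identical test methods.
    CASES = [
        ("alice", True),
        ("ab", False),         # too short
        ("user_name", False),  # underscore is not alphanumeric
        ("a" * 13, False),     # too long
    ]

    def test_username_validation(self):
        for name, expected in self.CASES:
            with self.subTest(name=name):
                self.assertEqual(is_valid_username(name), expected)
```

Each failing case is reported individually by `subTest`, so one bad data row doesn't hide the rest.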
29. Top 10 Reasons for Flaky Automated Tests
1. Not having a framework
2. Using hardcoded test data
3. Using X,Y coordinates or XPath for element recognition
4. Using shared test environments; not using a stable test environment
5. Having tests that are dependent on one another
6. Tests not starting in a known state
7. Tests not managing their own test data
8. Not treating automation like any other software development effort
9. Failure to use proper synchronization
10. Badly written tests
Source: https://www.joecolantonio.com/top-10-reasons-for-flaky-automated-tests/
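Reason 9 (improper synchronization) is usually fixed by replacing fixed sleeps with explicit, polling waits; a generic stdlib-only sketch of such a helper:

```python
import time

def wait_for(condition, timeout=5.0, interval=0.1):
    """Poll `condition` until it returns a truthy value or the timeout expires.
    Explicit waits like this replace fixed sleeps, a common source of flakiness."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)

# Illustrative use: wait for a (simulated) element to become available.
appeared_at = time.monotonic() + 0.3
element = wait_for(lambda: time.monotonic() >= appeared_at and "button")
print(element)
```

Real drivers ship equivalents (e.g., Selenium's `WebDriverWait`); the point is that the test waits on an observable condition, not a guessed duration.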
31. Test Automation Framework
A combination of protocols, rules, standards, and guidelines that can be incorporated or followed as a whole to leverage the scaffolding the test automation framework provides.
Criteria / What to Consider
● Easy and fast authoring
● Reusability & test coverage
● Script-based, scriptless, or hybrid
● Stability of tests, low-cost maintenance
● Minimal manual intervention
● Root cause analysis/debugging
● Reports
● Integration with APM
● Testers (developers, QA, manual)
● Open source vs. vendor-sourced
Open Source
● Selenium
● Carina
● Google EarlGrey
● Cucumber
● Watir
● Appium
● Robot Framework
● JMeter
● Gauge
● Robotium
Vendor-Sourced
● Testim
● Tricentis
● Mabl
● BlazeMeter
● UFT/QTP
● LeanFT
● Automation Anywhere
● CodedUI
● TestComplete
● Sikuli
32. Test Automation Framework: Components
● Continuous Integration & Continuous Testing
● Test Execution: reports, logs, exceptions, notifications
● Execution Environments
● Common Libraries: application input, application logic, data readers
● Test Automation Tool Set
● Scripts and Resources: object repository, test data, configuration files, constants, environment settings
33. Test Automation Tools: Guidelines & Categories
Criteria/Guidelines
● There is no single "correct" tool for test automation
● The right choice depends on your unique needs
● Execute a 2-week POC before selection
● Review extensibility, ease of use, reporting, debugging, integration, version control, and other features
● Review team feedback
● Review the product roadmap
● Evaluate cost, including maintenance
Categories
● Test Management Tools
● Automated Testing Tools
● Cross-browser Testing Tools
● Web UI Testing Tools
● Mobile UI Testing Tools
● Load Testing Tools
● Defect Tracking Tools
● Security Testing Tools
● API Testing Tools
● Test Data Management Tools
● Test Modeling Tools
● Visual Pixel Validation
● Visual Non-Pixel Validation
● Service Virtualization
● Database Virtualization
● Continuous Integration
● Continuous Delivery
● ...
37. AI/ML in Test Automation
Brief Overview
● Application complexity has increased enormously, and so has software testing
● AI/ML is already starting to play a major role in software testing
● AI enables non-deterministic tests
● AI identifies problem areas based on past tests (defects, results, logs, test cases, source code, etc.)
● AI helps understand system behavior better
Scenarios
● Auto-heal test scripts for small application changes, for effective regression testing and reduced test maintenance
● Auto-validate test inputs based on ML, without manual user-input validation
● Auto-generate test cases based on user interactions & use cases
AI/ML-based Tools
● Testim
● Applitools
● TestCraft
● AccelQ
● Mabl
● AutonomIQ
● AppvanceIQ
38. Autonomous Software Testing
Brief Overview
● Autonomous testing isn't just for automobiles anymore!
● AI/ML is expected to play a key role in delivering autonomous software testing
● Moving from driver to driverless, from monitored to non-monitored tests
● Moving from manual creation, execution, and maintenance of test cases to automation without human involvement
● AI-based learning from failures is helping make decisions on how new tests will be created and executed, even under slightly changed conditions
Image source: https://www.smartcitiesworld.net
39. Autonomous Software Testing & Testim
● Learn by observation of real customers in production
● Aggregate user actions into flows (reused components)
● Tests produced from flows