1
What Top 10% Testing
Teams Are Doing That
You Can Do Too.
Achieving Continuous Visual
Quality Using Visual AI
Patrick McCartney, Director, Customer
Success Engineering
James Lamberti, Chief Marketing Officer
June 2019
Today’s Discussion
2
Got a question? Send it in chat, note the slide number.
● Introductions
● Live Polling – How To Participate
● Testing Today – Pain Points and the Reasons Why
● Top 10% Testing Teams – Four Things You Can Do To Join Them
● Real World Examples
LIVE POLL #1
How Many Escaped Visual Bugs Do You See Per Release?
3
● Just 1 or 2 (Lucky You!)
● 3 to 5
● 6 to 10
● 11 to 20
● More Than 20
The Untold Trap of Functional Automation
Functional vs. Visual
5
Functional Testing
• Assessing if an application functions the way it was designed
• Functional requirements/user stories
Visual Testing
• Assessing if the user interface looks the way it was intended
• Wireframes/mock-ups
Passed Tests ≠ Quality
• Manual testers performed both functional testing and visual testing
• Automation was sold as a replacement for manual testing
• Automated functional tests would pass, but visual defects still escape!
6
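The trap above can be shown in a few lines. This is an illustrative sketch with made-up data, not a real test suite: a typical functional assertion (element exists, has the right text, is clickable) passes even though a CSS bug has made the element invisible to users.

```python
# Illustrative sketch (hypothetical data): a functional assertion can pass
# while the rendered UI is visually broken.

# A stand-in for the rendered state of a "Buy now" button.
button = {
    "present": True,          # element exists in the DOM
    "text": "Buy now",        # correct label
    "enabled": True,          # clickable
    "width_px": 0,            # CSS bug: collapsed to zero width!
    "color": "#ffffff",       # white text ...
    "background": "#ffffff",  # ... on a white background (invisible)
}

def functional_check(el):
    """Typical functional assertions: existence, text, clickability."""
    return el["present"] and el["text"] == "Buy now" and el["enabled"]

def visual_check(el):
    """A visual check also cares how the element *looks*."""
    return el["width_px"] > 0 and el["color"] != el["background"]

assert functional_check(button)   # passes: the suite goes green
assert not visual_check(button)   # yet no user can see the button
print("functional: PASS, visual: FAIL - the defect escapes")
```

The functional suite reports success, so the visual defect ships — exactly the pattern in the Amazon, Microsoft, UPS, and Twitter examples that follow.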
It happens to Amazon
7
It happens to Microsoft
8
It happens to UPS
9
It happens to Twitter
10
11
Why?
• Our automation tools are ill-equipped
to find visual defects
• Testing functionality is very binary – it
often either works or does not
• Assessing the correctness of a UI is
very difficult and subjective
2019 State of Automated Visual Testing*:
Frequency and Cost of Escaped Visual Bugs Per Release
12
*Source: 2019 State of Visual Testing Research Report (n=438)
13
*Source: 2019 State of Visual Testing Research Report (n=438)
2019 State of
Automated
Visual Testing*:
Annual Cost To
Fix Escaped
Visual Bugs
Live Poll #1 Results:
How Many Escaped Visual Bugs Do You See Per Release?
(Please watch the on-demand recording for the poll results)
14
The Digital Transformation
Live Poll #2:
How Many Pages or Screens Do You Have In Your Apps?
16
● Less than 10
● 11 to 20
● 21 to 50
● 51 to 100
● More Than 100
2019 State of Automated Visual Testing*:
Digital Transformation Quantified
18
*Source: 2019 State of Visual Testing Research Report (n=438)
Live Poll #2 Results:
How Many Pages or Screens Do You Have In Your Apps?
(Please watch the on-demand recording for the poll results)
19
Effects of the Digital Transformation
20
Digital applications have created new challenges in testing
Cross-browser/device testing is required to verify applications work for all users
Your digital experience is being compared to every other brand, even brands outside your market space
Greater digital adoption means your apps have to work for everyone
When apps don’t work, people fall back on more expensive non-digital interactions (if those still exist)
21
How We Test
(Diagram: the delivery cycle in order — Step 1 → Step 2 → Step 3 → Step 4)
22
How We Think We Find Defects
(Diagram: the same four steps, reordered by where we expect defects to surface)
23
How We Actually Find Defects
(Diagram: the same four steps, reordered by where defects are actually found)
A Story of Diminishing Returns
• Looking for functional defects in cross-browser scenarios is largely a waste of time
• Traditional automation solutions are very bad at assessing visual correctness
• The more browsers you execute functional tests on, the better the chance the tests fail
• Worse, it will likely result in finding fewer real defects (especially when we use automation)
• Using functional testing tools to assess visual quality doesn’t work
24
Top 10% Testing Team:
Four Things You Can Do to Join Them
#1 – Test Smarter Using Visual AI
Live Poll #3:
How Much Visual Test Coverage Do You Have Today?
27
● Less Than 10%
● 11% to 25%
● 26% to 50%
● 51% to 80%
● Over 80%
Method #1: Manual Testing
28
• Low technical skill requirements
• Relatively high chance of finding a
visual defect on any tested screen
• Not scalable in digital apps or high
velocity development
• Susceptible to human error
Method #2: Assertions Using Functional Testing Solutions
29
• Very code-heavy approach and likely to break with minor app updates
• Provides minimal additional coverage
• Doesn’t really assess how the UI looks
Method #3: Non-AI Based Visual Comparisons
30
Approaches: CSS/attribute matching, open source image compare solutions, DOM compare, pixel-to-pixel compare
• Typically free or low cost
• Often very high maintenance
• Too many false positives make it impossible to scale
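The false-positive problem with exact pixel comparison is easy to demonstrate. In this toy sketch (made-up 1-bit "screenshots"), a one-pixel shift — the kind of harmless anti-aliasing or font-rendering drift that differs between browser versions — makes an exact diff report a failure:

```python
def pixel_diff(img_a, img_b):
    """Exact pixel-to-pixel compare: count mismatched pixels."""
    return sum(
        1
        for row_a, row_b in zip(img_a, img_b)
        for p_a, p_b in zip(row_a, row_b)
        if p_a != p_b
    )

# A tiny 1-bit "screenshot": a horizontal bar of ink.
baseline = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0],
]
# The same bar, shifted right by one pixel — e.g. harmless rendering
# drift between browser versions, not a real defect.
checkpoint = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 0, 0],
]

diffs = pixel_diff(baseline, checkpoint)
print(diffs)  # 2 mismatched pixels -> the exact compare flags a "failure"
```

Multiply that by hundreds of screens and dozens of browser/viewport combinations, and triaging the noise becomes a full-time job — which is why this approach does not scale.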
A Better Way: Visual AI
● Use artificial intelligence to quickly find UI differences across all
viewport sizes, browsers, and devices
● Extremely low code
(as little as one line
per screen)
● Extremely high coverage
● Scales effortlessly
31
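To make the contrast with pixel diffing concrete, here is a deliberately crude sketch of a tolerant, content-aware comparison. This is NOT Applitools' actual algorithm — just an illustration of the idea: summarize each image by how much "ink" it has and where that ink sits, and tolerate tiny positional drift while still catching real content changes.

```python
# Toy sketch of tolerant, content-aware comparison (not a real Visual AI
# algorithm): ignore sub-pixel-scale drift, flag genuine content changes.

def ink_summary(img):
    """Ink count and centroid of a 1-bit image (rows of 0/1)."""
    coords = [(r, c) for r, row in enumerate(img)
              for c, v in enumerate(row) if v]
    n = len(coords)
    cy = sum(r for r, _ in coords) / n
    cx = sum(c for _, c in coords) / n
    return n, cy, cx

def looks_same(img_a, img_b, tolerance=1.5):
    """Same amount of content, roughly in the same place -> 'looks same'."""
    na, ya, xa = ink_summary(img_a)
    nb, yb, xb = ink_summary(img_b)
    return na == nb and abs(ya - yb) <= tolerance and abs(xa - xb) <= tolerance

baseline = [[0, 0, 0, 0, 0, 0],
            [0, 1, 1, 1, 1, 0],
            [0, 0, 0, 0, 0, 0]]
shifted  = [[0, 0, 0, 0, 0, 0],   # same bar, one pixel right (rendering noise)
            [0, 0, 1, 1, 1, 1],
            [0, 0, 0, 0, 0, 0]]
broken   = [[0, 0, 0, 0, 0, 0],   # half the bar is missing (a real defect)
            [0, 1, 1, 0, 0, 0],
            [0, 0, 0, 0, 0, 0]]

print(looks_same(baseline, shifted))  # True  — harmless shift ignored
print(looks_same(baseline, broken))   # False — real defect flagged
```

The same one-pixel shift that an exact diff flags as a failure is waved through here, while a genuinely broken render is still caught — that separation of noise from signal is what makes an AI-based compare scalable.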
*Source: 2019 State of Visual Testing Research Report (n=438)
2019 State of
Automated
Visual Testing*:
Overall Test
Coverage Based
on Level of
Automated Visual
Testing
32
Live Poll #3 Results:
How Much Visual Test Coverage Do You Have Today?
(Please watch the on-demand recording for the poll results)
33
#2 – Release Faster
Live Poll #4:
How Many Times Per Month Do You Release?
35
● Up to 4 times per month
● 5 to 10 times per month
● 11 to 20 times per month
● 21 to 30 times per month
● Continuously (For Real!)
36
*Source: 2019 State of Visual Testing Research Report (n=438)
2019 State of
Automated
Visual Testing*:
CI-CD Initiative
and Progress
Ratings Among
R&D Teams
Underwhelming ROI From Testing
37
• CI/CD initiatives often fail
when software quality suffers
• Code-heavy tests require lots
of effort to stay in sync with
the AUT
• Flaky tests cause builds to fail
needlessly
• Software still ends up being
manually tested
Releasing Faster with Automated Visual Testing
38
● Leverage unit and API tests for reliable coverage
● Eliminate UI test code wherever possible – focus on critical paths
● Use automated visual testing to assess UI correctness of each screen
● Leverage Visual Grid to improve test speed, reliability, and defect capture rate
on web apps
● Tie test execution to SCM events
● Use visual tests early in the dev cycle – defects caught late are expensive!
● Eliminate manual effort
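The Visual Grid bullet above deserves a sketch. The idea is that the test captures each screen once, and the grid re-renders and compares that capture across many browser/viewport combinations server-side, so adding configurations does not multiply local execution time. The code below is a hedged stand-in (the function and snapshot names are illustrative, not the real API):

```python
# Sketch of the Visual Grid fan-out idea: one capture, many configurations
# compared in parallel. render_and_compare is a stand-in for server-side
# rendering + AI comparison, not a real SDK call.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

BROWSERS = ["chrome", "firefox", "edge", "safari"]
VIEWPORTS = [(1920, 1080), (1366, 768), (375, 812)]

def render_and_compare(snapshot, browser, viewport):
    # Stand-in: in the real grid this re-renders the captured DOM snapshot
    # in the target browser/viewport and runs the visual comparison.
    return f"{snapshot} @ {browser} {viewport[0]}x{viewport[1]}: checked"

snapshot = "checkout-page-dom-snapshot"   # captured once by the test run
with ThreadPoolExecutor() as pool:
    results = list(pool.map(
        lambda cfg: render_and_compare(snapshot, *cfg),
        product(BROWSERS, VIEWPORTS),
    ))
print(len(results))  # 12 configurations covered from one capture
```

Four browsers times three viewports yields twelve checked configurations from a single test execution — which is how cross-browser coverage stops being the bottleneck described earlier.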
Use one line of code: eyes.checkWindow()
39
A low-code approach minimizes effort and maximizes test ROI
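For readers who have not used an Eyes-style SDK, the lifecycle behind that one-liner looks roughly like this. The `Eyes` class below is a stand-in stub, NOT the real Applitools SDK; it only shows the open → check → close shape of a visual test:

```python
# Stand-in stub illustrating the visual-test lifecycle (hypothetical class,
# not the real Applitools SDK).

class Eyes:
    """Minimal stand-in for an Eyes-style visual testing client."""

    def __init__(self):
        self.checkpoints = []
        self.opened = False

    def open(self, app_name, test_name):
        self.opened = True
        self.app_name, self.test_name = app_name, test_name

    def check_window(self, tag="full page"):
        # In a real SDK this captures a screenshot and sends it for AI
        # comparison against a baseline; here we just record the tag.
        assert self.opened, "call open() before check_window()"
        self.checkpoints.append(tag)

    def close(self):
        self.opened = False
        return len(self.checkpoints)  # number of screens covered

eyes = Eyes()
eyes.open("Demo App", "login flow")
eyes.check_window("login page")      # one line of coverage per screen
eyes.check_window("dashboard")
covered = eyes.close()
print(covered)  # 2 screens visually covered
```

Each screen costs one call, and every call covers the full rendered page — which is why coverage grows so much faster than with element-by-element assertions.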
Test smarter with Applitools Visual Grid
(Diagram: one eyes.checkWindow() call, executed against a local or cloud Selenium service, fans out through the Visual Grid)
40
● Coverage increases historically came at the expense of velocity.
● Yet despite increased coverage and larger applications, companies that are mostly automated release much faster.
41
2019 State of Automated
Visual Testing*:
Releases Per Month By Level
of Automated Visual Testing
*Source: 2019 State of Visual Testing Research Report (n=438)
● Improvements to quality, coverage, and release velocity are all a reality for these innovative companies, despite managing applications that are on average 2.2x larger than the rest of the market.
42
2019 State of Automated
Visual Testing*:
Releases Per Month By Level
of Automated Visual Testing
*Source: 2019 State of Visual Testing Research Report (n=438)
Live Poll #4 Results:
How Many Times Per Month Do You Release?
(Please watch the on-demand recording for the poll results)
43
#3 – Partnering with the Business
Live Poll #5:
Who Has Primary Responsibility for Visual Quality?
45
● Testing & QA
● Front End Developers
● UI/UX Team
● Product
● Marketing
46
2019 State of Automated Visual Testing*:
Digital Transformation Initiative and Progress Ratings Among R&D Teams
(Chart: CI-CD initiative success rate)
*Source: 2019 State of Visual Testing Research Report (n=438)
Solving For Visual Bugs: Art v. Science
47
(Diagram: severity of visual bug mapped against authority & responsibility, spanning the business team and the R&D team — design & branding issues, compliance & legal issues, minor visual defects, functional bugs)
• Visual appearance is often subjective
• Business teams are far more involved as a result
• Who is involved, and how you involve them to get decisions and move quickly, is key to continuous visual quality
• Top 10% teams have a process and use existing tools to manage this reality
Cross-Functional Best Practices
48
● Integrate Visual AI to drive automated visual testing
● Socialize visual testing software with all stakeholder personas
● Create a collaboration process using tools like Slack, Teams, etc.
for quick triage
● Bug tracking integration (e.g. Jira, etc.)
● Code repo and CI integration for early discovery
49
2019 State of Automated Visual Testing*:
Visual Quality Responsibility By Team
*Source: 2019 State of Visual Testing Research Report (n=438)
Live Poll #5 Results:
Who Has Primary Responsibility for Visual Quality?
(Please watch the on-demand recording for the poll results)
50
#4 – Lead This Change
• Authority and responsibility are not well aligned when it comes to visual quality
52
2019 State of Automated Visual Testing*:
Visual Quality Responsibility By Team
*Source: 2019 State of Visual Testing Research Report (n=438)
Skill Up!
53
Skill Up at TestAutomationU.com
54
Clearly Explain How You Add Value
55
• Read The 2019 State of Visual
Testing Report
• Clearly Explain Your Value to the
Business Team
• 52% of Companies Will Have Deployed Some Level of Automated Visual Testing by the End of 2019. Will You Be Among Them?
56
2019 State of Automated Visual Testing*:
Percentage of Teams Deploying Automated Visual Testing in 2019
*Source: 2019 State of Visual Testing Research Report (n=438)
Wrong Tool, Wrong Time: Re-Thinking Test Automation -- w/ State of Visual Testing Survey Results
